WO2016058043A1 - Identifying method and apparatus - Google Patents
- Publication number
- WO2016058043A1 (PCT/AU2015/050624)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- images
- category
- display
- user
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
Definitions
- This invention relates to a method and apparatus for identifying targets of interest and in one particular example for allowing users to review images of experiences and locations to identify destinations of interest, for example for use in travel planning.
- the present invention seeks to provide an apparatus for use in identifying destinations of interest, the apparatus including one or more electronic processing devices that:
- a) determine at least one selected category including at least one of a location category and an experience category; b) select display images from a plurality of reference images, wherein each of the reference images is associated with metadata indicative of at least one of a reference image location and a reference image experience and wherein the display images are selected at least partially in accordance with the at least one selected category;
- the one or more electronic processing devices select the display images in accordance with a category structure defining relationships between location categories and between experience categories.
- the one or more electronic processing devices select display images having at least one of a reference image location and reference image experience including at least one of a location category and experience category within the selected category.
- in response to display of at least some of the display images, the one or more electronic processing devices:
- the list of categories includes a list of location categories and a list of experience categories.
- the list of categories is displayed in accordance with at least one of:
- the one or more electronic processing devices cause a user interface to be displayed to a user on a client device, the user interface including:
- the one or more electronic processing devices perform at least one action associated with one or more of the display images.
- the one or more electronic processing devices determine at least one action associated with a respective display image in accordance with at least one of:
- the at least one action includes at least one of:
- the one or more electronic processing devices share at least one display image by:
- the one or more electronic processing devices flag at least one display image as a favourite image by:
- the one or more electronic processing devices include a metadata server that:
- a) receives an indication of at least one selected category
- the one or more electronic processing devices include an image server that: a) receives an indication of at least one image identifier;
- ii) provides the image to the client device via the communications network.
- the one or more electronic processing devices include an account server that performs actions at least in part using image identifiers received from a client device via a communications network.
- the one or more electronic processing devices include an account server that:
- a) receives an action request associated with at least one display image from a client device via a communications network
- c) determines at least one of a location category and an experience category using the image identifier
- d) performs an action at least in part using the at least one of a location category and an experience category.
- a) determines selected display images in accordance with user selections made via a client device
- the one or more electronic processing devices: a) select display media from a plurality of reference media, wherein each of the reference media is associated with metadata indicative of at least one of a reference media location and a reference media experience and wherein the display media are selected at least partially in accordance with the at least one selected category; and,
- the present invention seeks to provide a method for use in reviewing images of destinations, the method including, in one or more electronic processing devices:
- each of the reference images is associated with metadata indicative of a reference image location and a reference image experience and wherein the display images are selected at least partially in accordance with the at least one selected category;
- the present invention seeks to provide an apparatus for use in identifying targets of interest, the apparatus including one or more electronic processing devices that:
- each of the reference images is associated with metadata indicative of at least one reference image category and wherein the display images are selected at least partially in accordance with the at least one selected category;
- the target includes at least one of:
- the categories include at least one of:
- the present invention seeks to provide a method for use in identifying targets of interest, the method including, in one or more electronic processing devices:
- each of the reference images is associated with metadata indicative of at least one reference image category and wherein the display images are selected at least partially in accordance with the at least one selected category;
- Figure 1 is a flowchart of an example of a method for use in identifying destinations of interest;
- Figure 2 is a schematic diagram of an example of a distributed computer architecture;
- Figure 3 is a schematic diagram of an example of a processing system of Figure 2;
- Figure 4 is a schematic diagram of an example of a client device of Figure 2;
- Figure 5 is a flowchart of an example of a method of assigning images to categories;
- Figure 6 is a schematic diagram of an example of a category structure;
- Figure 7 is a flowchart of an example of a method of browsing images to identify destinations of interest;
- Figure 8 is a schematic diagram of an example of a user interface for use in the method of Figure 7;
- Figures 9A and 9B are a flowchart of a further specific example of a method for use in browsing images to identify destinations of interest;
- Figure 10 is a schematic diagram of an example of a user interface used in the method of Figures 9A and 9B;
- Figure 11 is a flowchart of an example of a method for selecting and viewing favourite images;
- Figure 12 is a flowchart of an example of a method of sharing images;
- Figures 13A and 13B are a flowchart of an example of a method for viewing maps;
- Figure 14 is a schematic diagram of an example of a user interface for use in the method of Figures 13A and 13B;
- Figure 15 is a flowchart of an example of a method for booking a trip;
- Figure 16 is a schematic diagram of an example of a user interface showing an itinerary; and,
- Figure 17 is a flow chart of an example of a business intelligence process.

Detailed Description of the Preferred Embodiments
- the term “reference image” is used to refer to an image of a destination that is typically associated with defined metadata.
- metadata will be understood to include information regarding the image, including at least a location shown in or otherwise associated with the image and/or an experience shown in or otherwise associated with the image, as well as any other relevant information.
- the metadata can be stored either together with the image or separately from the image and may be defined through a combination of automatic processes, such as during capture of the image and manual processes, for example as part of an image curation process.
- the term "display image" is used to refer to a reference image that is presented to a user on a display device associated with a processing system, client device or the like.
- the term "destination" is used to broadly refer to either a location or an experience to which an individual may wish to travel.
- a "location" generally refers to a geographic location.
- the term "experience" refers to an event, activity, attraction, occurrence, landform, or the like, which individuals can participate in, observe or encounter.
- the terms are used for the purpose of illustration only and are not intended to be limiting.
- At step 100 at least one selected category is determined by the one or more electronic processing devices.
- the at least one selected category includes at least one of a location category and an experience category, and can include both a location and experience category.
- the at least one selected category can be determined in any one of a number of manners depending upon the preferred implementation.
- a user interface can be displayed by the one or more electronic processing devices, which includes details of available location and/or experience categories, allowing these to be selected by the user.
- categories could be selected at least in part through automated processes, for example on the basis of previously viewed categories, default categories, categories associated with selected images, on the basis of keyword searches or the like.
- the one or more electronic processing devices select one or more display images from a plurality of reference images.
- Each of the reference images is associated with metadata indicative of at least one of a reference image location and/or a reference image experience, allowing the display image(s) to be selected at least in part in accordance with the at least one selected category.
- a hierarchical category structure can be defined allowing display images to be selected that are in currently selected categories, or subcategories thereof, as will be described in more detail below.
- At step 120 at least some of the display images are displayed to the user allowing the user to review the display images.
- the user can then refine the one or more selected categories, in turn allowing the user to iteratively update and review the display images to thereby identify destinations of interest.
- This can be achieved using any suitable technique and could include selecting additional or alternative categories, or by deselecting categories. Alternatively, this could be achieved by selecting images of interest and then deriving category selections from the selected images.
- the above described method utilises categories to search and browse images relating to a range of different destinations, allowing the user to review images and use these to refine the selected categories, thereby allowing additional relevant images to be displayed. This allows the user to go through an iterative process of progressively refining categories by identifying images of interest, in turn allowing them to identify one or more destinations of interest.
- one or more additional actions can be performed, such as sharing images, identifying favourite images, viewing maps showing locations of destinations of interest, viewing additional information associated with destinations of interest, or the like. Additionally and/or alternatively, destinations of interest can be used in developing a travel itinerary and/or booking an excursion, such as a holiday, or the like.
- the above described process allows users to go through a process of browsing images relating to various destinations, allowing them to select destinations of interest, including different experiences and/or different locations. This can then be used in performing further actions, including preparing a travel itinerary and/or booking.
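The iterative browse-and-refine cycle described above can be sketched in outline. This is a minimal illustration only: all function names and data shapes (`select_display_images`, a `categories` list on each image, a `refine` callback) are assumptions, not part of the specification.

```python
# Hypothetical sketch of the iterative category-refinement loop: select
# images matching the selected categories, present them, refine the
# categories from the user's response, and repeat until stable.

def select_display_images(reference_images, selected_categories):
    """Return images whose metadata matches any selected category."""
    return [img for img in reference_images
            if selected_categories & set(img["categories"])]

def browse(reference_images, selected_categories, refine):
    """Iteratively display images and refine categories until no change."""
    while True:
        display_images = select_display_images(reference_images,
                                               selected_categories)
        refined = refine(selected_categories, display_images)
        if refined == selected_categories:  # no further refinement
            return display_images
        selected_categories = refined
```

In practice the `refine` step would be driven by user interaction (selecting, liking or excluding categories); here it is a plain callable so the loop structure is visible.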
- the one or more electronic processing devices select the display images in accordance with a category structure defining relationships between location categories and between experience categories. This allows the display images to be selected based on relationships between categories, for example between experience and location categories, and/or based on a hierarchy, such as parent/child relationships.
- the one or more electronic processing devices select display images having at least one of a reference image location and reference image experience including at least one of a location category and experience category within the selected category.
- these can be assigned to multiple categories using the category structure, so that the reference image location/experience could include multiple location/experience categories.
- the one or more electronic processing devices select display images having at least one of a reference image location and reference image experience within the selected category or within a child category of the selected category.
- the selected categories can include multiple categories, for example, by selecting a location and experience category, or by selecting multiple complementary experience categories.
- a user could select a beach and a surfing category, with images being displayed that relate to both surfing and beaches.
- categories could be explicitly excluded, so that a user could select a beaches category, but exclude the surfing category, thereby excluding surfing related images.
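Selection within a category or its children, combined with explicit exclusions (the beaches-but-not-surfing example above), might be implemented along these lines. The category tree and all names here are illustrative assumptions.

```python
# Hypothetical child-category map for a small slice of the category
# structure; a real system would hold many hundreds of categories.
CHILDREN = {
    "nature": ["beach", "bush"],
    "activities": ["surfing", "cycling"],
}

def descendants(category):
    """Return the category plus all of its descendants in the tree."""
    result = {category}
    for child in CHILDREN.get(category, []):
        result |= descendants(child)
    return result

def select_images(images, selected, excluded=()):
    """Images matching a selected category (or child), minus exclusions."""
    include = set().union(*(descendants(c) for c in selected))
    exclude = (set().union(*(descendants(c) for c in excluded))
               if excluded else set())
    return [img for img in images
            if set(img["categories"]) & include
            and not set(img["categories"]) & exclude]
```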
- the one or more electronic processing devices determine user interaction with at least one of the images and at least partially refine the at least one selected category in accordance with the user interaction. For example, the user could select to "like" or “dislike” an image, in which case additional categories could be added or excluded. For example, if a user selects a beach category and then "likes" a surfing image, this could cause a surfing category to be added to the selected categories.
- the one or more electronic processing devices determine at least one selected category in accordance with user input commands and select display images from the plurality of reference images at least partially in accordance with the at least one further selected category.
- the user can add further categories or deselect existing categories, allowing the images that are displayed to be modified accordingly.
- Selection of categories can be achieved in any appropriate manner, but in one example, this involves having the one or more electronic processing devices cause a list of categories to be displayed to a user and determine at least one selected category in accordance with user interaction with the list of categories. Thus, the user can select/deselect categories in the list allowing these to be added or removed from the selected categories.
- the list typically includes a list of location categories and a list of experience categories, and is optionally displayed in accordance with at least one of a number of images in each category or a popularity of each category or a category structure. Whilst the list could be displayed alphabetically, by ordering this based on number of images and/or popularity, this allows the list to be prioritised to enhance the effectiveness of the list and maximise the number, relevance or appeal of images presented to the user.
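Prioritising the category list by image count and popularity, rather than alphabetically, reduces to a sort with a composite key. The tuple key below is an assumption about how the two measures might be combined; the specification only names them as possible ordering criteria.

```python
# Minimal sketch: order the displayed category list by number of images,
# breaking ties by popularity, both descending.

def order_categories(categories):
    """Sort categories by image count, then popularity, descending."""
    return sorted(categories,
                  key=lambda c: (c["image_count"], c["popularity"]),
                  reverse=True)
```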
- interaction is via a user interface that is displayed to a user on a client device, the user interface including a number of display images and at least one toolbar including a list of categories.
- any suitable arrangement could be used.
- the one or more electronic processing devices perform at least one action associated with one or more of the display images.
- the action could be of any suitable form and example actions include, but are not limited to sharing at least one display image with a recipient, recording at least one display image as a favourite image, displaying additional information relating to at least one display image, such as a map or text based explanation of the experience or location, or updating an itinerary at least partially in accordance with at least one display image.
- Selection of the action can be performed in accordance with user input commands and/or selection of an icon displayed coincidently with the respective display image. This in turn provides a mechanism for users to easily perform actions as part of the process of browsing images, in turn increasing the likelihood that browsing of images will progress into further interaction with the system.
- this can be achieved in any suitable manner, and in one example, this involves the one or more processing devices determining an image identifier associated with the at least one display image and transferring an indication of the image identifier to a destination.
- a client device at the destination can then use the identifier to retrieve the image, for example from an image server or the like.
- the one or more electronic processing devices can record an indication of at least one display image as a favourite image by determining an image identifier associated with the at least one display image and recording an indication of the image identifier as part of user data associated with the user. This allows an indication of the images to be stored and/or shared easily, without requiring multiple copies of images to be created and stored, thereby minimising storage requirements.
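Both sharing and favouriting, as described above, move only the image identifier rather than a copy of the image. A sketch, assuming a simple `user_data` dictionary and a caller-supplied transport callable (both hypothetical):

```python
# Hypothetical sketch: actions performed using image identifiers only,
# so no duplicate copies of images need to be created or stored.

def share_image(image_id, send):
    """Share an image by transferring only its identifier; the
    recipient later retrieves the image itself from the image server."""
    send({"image_id": image_id})

def flag_favourite(user_data, image_id):
    """Record an image identifier in the user's favourites list."""
    user_data.setdefault("favourites", []).append(image_id)
    return user_data
```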
- the one or more electronic processing devices select display images by searching a metadata store including metadata for the plurality of reference images, retrieve image identifiers for the display images from the metadata store and use the image identifiers to retrieve display images from an image store.
- the one or more electronic processing devices include a metadata server that receives an indication of at least one selected category, selects the display images, determines an image identifier associated with each display image and provides an indication of the image identifier.
- the one or more electronic processing devices also typically include an image server that receives an indication of at least one image identifier, uses the at least one image identifier to retrieve at least one image from an image store and provides the images.
- this allows the images and metadata to be stored separately, allowing searching and hosting of images to be performed using separate servers, which can assist in balancing workload, whilst also allowing images to be provided solely on the basis of image identifiers, making actions such as sharing of images more streamlined.
- This also allows indications of selected images to be stored by third parties in the form of an identifier, without any further information being required, such as details of selected categories, which assists image owners in retaining control of their images and the associated metadata.
- the metadata and image servers could communicate directly, more typically communication is via a client device, such as a computer system, tablet, mobile phone or the like.
- the metadata server receives an indication of at least one selected category from the client device via a communications network and provides an indication of the image identifiers to the client device via the communications network.
- the image server receives an indication of at least one image identifier from the client device via the communications network and provides the image to the client device via the communications network.
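The two-server flow above (metadata server resolves categories to identifiers; image server resolves identifiers to images; the client mediates between them) can be sketched as follows. The class and method names are assumptions for illustration only.

```python
# Hypothetical sketch of the split metadata/image server architecture.
# The metadata never leaves the metadata server; the client receives
# only identifiers, then fetches images by identifier.

class MetadataServer:
    def __init__(self, metadata_store):
        self.store = metadata_store  # image_id -> metadata dict

    def search(self, selected_categories):
        """Return identifiers of images matching any selected category."""
        return [image_id for image_id, meta in self.store.items()
                if set(meta["categories"]) & set(selected_categories)]

class ImageServer:
    def __init__(self, image_store):
        self.store = image_store  # image_id -> image data

    def fetch(self, image_ids):
        return [self.store[i] for i in image_ids]

def client_browse(metadata_server, image_server, categories):
    ids = metadata_server.search(categories)  # step 1: IDs from metadata
    return image_server.fetch(ids)            # step 2: images by ID
```

This mirrors the point made above: a third party holding only an identifier can retrieve the image without ever seeing the associated metadata.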
- the one or more electronic processing devices can also include an account server that performs actions at least in part using image identifiers.
- the account server can be used to perform sharing of images, recording indications of favourite images, adding images and/or destinations to itineraries, booking travel, or the like.
- the account server can receive an action request associated with at least one display image from the client device via the communications network, determine the image identifier of the at least one display image, determine at least one of a location category and an experience category using the image identifier and perform an action at least in part using the at least one of a location category and an experience category.
- the account server typically transfers the image identifier to a metadata server and receives an indication of the at least one of a location category and an experience category from the metadata server. This can be performed directly, or via the client device, depending on the preferred implementation. In either case, the account server need only know the image identifier of an image selected by the user, and can then retrieve required information, such as the location and experience associated with the image as required. This reduces the storage requirements of the account server, whilst limiting access to the image metadata, allowing commercial value in the metadata to be maintained.
- the account server determines selected display images in accordance with user selections made via a client device, determines locations and experiences associated with selected display images and uses the locations and experiences to at least partially prepare a trip itinerary and/or book a trip.
- the account server can receive identifiers of images of interest to the user, and then use these to receive location and experience information from the metadata server, with this in turn being used to prepare a travel itinerary.
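The account-server flow just described, in which the server holds only image identifiers and resolves locations and experiences on demand, might look like this. The `lookup_metadata` callable and the itinerary shape are hypothetical.

```python
# Hypothetical sketch: build a draft itinerary from user-selected image
# identifiers, resolving each identifier to its location and experience
# via the metadata server (represented here by a lookup callable).

def prepare_itinerary(selected_image_ids, lookup_metadata):
    """Assemble itinerary entries from image identifiers only."""
    itinerary = []
    for image_id in selected_image_ids:
        meta = lookup_metadata(image_id)  # account server stores IDs only
        itinerary.append({"location": meta["location"],
                          "experience": meta["experience"]})
    return itinerary
```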
- the one or more processing systems can also display other media, such as videos, music, or the like. This can be performed concurrently with, or separately and in addition to the display of images.
- the one or more processing devices select display media from a plurality of reference media, wherein each of the reference media are associated with metadata indicative of at least one of a reference media location and a reference media experience and wherein the display media are selected at least partially in accordance with the at least one selected category and cause at least some of the display media to be displayed to the user.
- video segments could be presented instead of images, whilst music or other audio relevant to a location and/or experience could be presented together with images.
- the process is performed by one or more processing systems operating as part of a distributed architecture, an example of which will now be described with reference to Figure 2.
- a number of base stations 201 are coupled via communications networks, such as the Internet 202, and/or a number of local area networks (LANs) 204, to a number of client devices 203.
- the networks 202, 204 and client devices 203 shown are for the purpose of example only, and in practice the base stations 201 and client devices 203 can communicate via any appropriate mechanism, such as via wired or wireless connections, including, but not limited to mobile networks, private networks, such as 802.11 networks, the Internet, LANs, WANs, or the like, as well as via direct or point-to-point connections, such as Bluetooth, or the like.
- each base station 201 includes one or more processing systems 210, each of which may be coupled to one or more databases 211.
- the base station 201 is adapted to be used in performing actions associated with identifying destinations of interest by presenting images, and optionally performing additional ancillary actions, such as managing the categories and images, sharing images, recording favourite images, or the like.
- the client devices 203 are typically adapted to communicate with the base station 201, allowing images to be viewed, request that further actions are performed, or the like.
- whilst the base station 201 is shown as a single entity, it will be appreciated that the base station 201 can be distributed over a number of geographically separate locations, for example by using processing systems 210 and/or databases 211 that are provided as part of a cloud based environment. However, the above described arrangement is not essential and other suitable configurations could be used.
- the processing system 210 includes at least one microprocessor 300, a memory 301, an optional input/output device 302, such as a keyboard and/or display, and an external interface 303, interconnected via a bus 304 as shown.
- the external interface 303 can be utilised for connecting the processing system 210 to peripheral devices, such as the communications networks 202, 204, databases 211, other storage devices, or the like.
- whilst a single external interface 303 is shown, this is for the purpose of example only, and in practice multiple interfaces using various methods (e.g. Ethernet, serial, USB, wireless or the like) may be provided.
- the microprocessor 300 executes instructions in the form of applications software stored in the memory 301 to allow the required processes to be performed.
- the applications software may include one or more software modules, and may be executed in a suitable execution environment, such as an operating system environment, or the like.
- the processing system 210 may be formed from any suitable processing system, such as a suitably programmed client device, PC, web server, network server, or the like.
- the processing system 210 is a standard processing system such as an Intel Architecture based processing system, which executes software applications stored on non-volatile (e.g. hard disk) storage, although this is not essential.
- the processing system could be any electronic processing device such as a microprocessor, microchip processor, logic gate configuration, firmware optionally associated with implementing logic such as an FPGA (Field Programmable Gate Array), or any other electronic device, system or arrangement.
- the client device 203 includes at least one microprocessor 400, a memory 401, an input/output device 402, such as a keyboard and/or display, and an external interface 403, interconnected via a bus 404 as shown.
- the external interface 403 can be utilised for connecting the client device 203 to peripheral devices, such as the communications networks 202, 204, databases, other storage devices, or the like.
- whilst a single external interface 403 is shown, this is for the purpose of example only, and in practice multiple interfaces using various methods (e.g. Ethernet, serial, USB, wireless or the like) may be provided.
- the microprocessor 400 executes instructions in the form of applications software stored in the memory 401 to allow communication with the base station 201, for example to allow for selection of categories, to receive image identifiers and images, or the like.
- the client devices 203 may be formed from any suitable processing system, such as a suitably programmed PC, Internet terminal, lap-top, hand-held PC, smart phone, tablet, PDA, web server, or the like.
- client devices 203 can be any electronic processing device such as a microprocessor, microchip processor, logic gate configuration, firmware optionally associated with implementing logic such as an FPGA (Field Programmable Gate Array), or any other electronic device, system or arrangement.
- processing system 210 acts to host webpages allowing the user to browse images and refine categories using one of the client devices 203.
- the processing system 210 is therefore typically a server which communicates with the client device 203 via a communications network, or the like, depending on the particular network infrastructure available.
- the processing system 210 of the base station 201 typically executes applications software for hosting webpages and images and performing other actions, with actions performed by the processing system 210 being performed by the processor 300 in accordance with instructions stored as applications software in the memory 301 and/or input commands received from a user via the I/O device 302, or commands received from the client device 203.
- the user interacts with the processing system 210 via a GUI (Graphical User Interface), or the like presented on the client device 203, and in one particular example via a browser application that displays webpages hosted by the base station 201.
- Actions performed by the client device 203 are performed by the processor 400 in accordance with instructions stored as applications software in the memory 401 and/or input commands received from a user via the I/O device 402.
- the above described configuration assumed for the purpose of the following examples is not essential, and numerous other configurations may be used.
- the partitioning of functionality between the client devices 203, and the base station 201 may vary, depending on the particular implementation.
- an image is initially received. This is typically performed by having the image supplied by a photographer, although the image may alternatively be retrieved from a repository, or other source such as a social media feed or the like, depending on the preferred workflow.
- the image is reviewed, with this being used to optionally assess image suitability, for example to determine if the image meets quality and/or content requirements.
- the image is allocated to one or more categories at step 510.
- this is performed in accordance with a category structure so that categories are applied consistently across images.
- An example of a portion of a category structure will now be described with reference to Figure 6.
- the category structure includes a number of location categories 601 provided in a hierarchy, so that locations are progressively sub-divided into continents, countries, states or regions and cities or other sub-categories.
- experience categories 602 may be sub-categorised according to different types, such as nature, activities, attractions, or the like, with each then being further sub-divided to define specific experiences, such as particular activities.
- the locations include continents including Asia, North America and Europe, with North America being divided into countries including Mexico, USA and Canada. USA is further divided into states including Florida, California and New York, with California including cities San Diego, San Francisco and Los Angeles. Experiences are divided into Nature, Attractions, and Activities, with Nature being sub-divided into Beach and Bush, whilst Activities include Surfing and Cycling. It will be appreciated that this example is for the purpose of illustration only and that in practice many hundreds of categories would typically be utilised.
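The portion of the category structure just described can be represented as a simple nested mapping. This encoding, and the breadcrumb-style `path_to` helper, are illustrative assumptions rather than the specification's storage format.

```python
# Hypothetical nested-dict encoding of the Figure 6 category hierarchy.
CATEGORY_STRUCTURE = {
    "locations": {
        "Asia": {},
        "North America": {
            "Mexico": {},
            "USA": {
                "Florida": {},
                "California": {"San Diego": {}, "San Francisco": {},
                               "Los Angeles": {}},
                "New York": {},
            },
            "Canada": {},
        },
        "Europe": {},
    },
    "experiences": {
        "Nature": {"Beach": {}, "Bush": {}},
        "Attractions": {},
        "Activities": {"Surfing": {}, "Cycling": {}},
    },
}

def path_to(tree, target, path=()):
    """Return the hierarchy path to a category, if present in the tree."""
    for name, children in tree.items():
        if name == target:
            return path + (name,)
        found = path_to(children, target, path + (name,))
        if found:
            return found
    return None
```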
- an image identifier is assigned to the image.
- the image identifier is typically a unique alpha-numeric code that can be generated automatically by one of the processing systems 210.
- the image identifier is used to subsequently retrieve the image, and can therefore form part of or be related to location information indicative of where the image is to be ultimately stored. Accordingly, the image identifier could form part of a URI (Uniform Resource Identifier), URL (Uniform Resource Locator), or the like, allowing the image to be retrieved.
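A minimal sketch of the identifier scheme described above follows. The specification only requires a unique alpha-numeric code that can form part of a URL; the use of `uuid4` and the host name here are assumptions for illustration.

```python
import uuid

def assign_image_identifier():
    # uuid4().hex yields a unique 32-character alpha-numeric code.
    return uuid.uuid4().hex

def image_url(image_id, base="https://images.example.com"):
    # Hypothetical URL scheme: the identifier forms part of the location.
    return f"{base}/{image_id}"
```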
- any other metadata to be associated with the image is typically generated, either by creating and/or importing this from an appropriate source.
- the metadata can include any one or more of experience and location categories, child location categories, image counts for each experience, image ID, image URL, image credit information, image ordering information, location latitude and longitude, image origin information, or the like.
- the image and associated metadata is stored.
- the metadata could be encoded as part of the image file, and stored together with the image, although more typically the image and metadata are stored in respective image and metadata databases allowing these to be accessed independently. This allows the image to be provided, whilst the metadata is securely retained, thereby allowing commercial value in the metadata to be maintained.
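The separation described above can be sketched as two independent stores keyed by the same image identifier, so that images can be served publicly whilst the metadata is retained. The storage layout is an assumption for illustration.

```python
image_db = {}     # image_id -> image bytes (publicly served)
metadata_db = {}  # image_id -> metadata dict (securely retained)

def store_image(image_id, image_bytes, metadata):
    # Store image and metadata in their respective databases.
    image_db[image_id] = image_bytes
    metadata_db[image_id] = metadata

def fetch_image(image_id):
    # Public retrieval path: returns the image only, never the metadata.
    return image_db.get(image_id)
```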
- at step 700 the user utilises the client device 203 to access an interface, such as a webpage hosted by one or more of the processing systems 210.
- upon initially accessing the webpage, the one or more processing systems 210 typically select default categories at step 705, which may correspond to broad categories within a particular category structure, previously selected categories, or the like.
- an image search is performed utilising the selected categories. This typically involves comparing the selected categories to metadata associated with the images with image identifiers being retrieved at step 715. The image identifiers, and/or location information retrieved using the image identifiers can then be used to retrieve images at step 720. It will be appreciated that steps 715 and 720 allow the image identifier and image to be stored separately, for example in respective metadata and image databases, but that this is not essential, and instead these steps could be combined into a single image retrieval step in the event that the metadata is stored as part of or together with the image.
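The two-step retrieval described above (steps 710 to 720) can be sketched as follows: selected categories are compared against each reference image's metadata to yield identifiers, which are then used separately to fetch the images. The data shapes and matching rule (every selected category must appear in the metadata) are assumptions.

```python
# Hypothetical in-memory stands-ins for the metadata and image databases.
metadata_db = {
    "img1": {"categories": {"Los Angeles", "Beach", "Surfing"}},
    "img2": {"categories": {"Paris", "Attractions"}},
}
image_db = {"img1": b"<jpeg bytes 1>", "img2": b"<jpeg bytes 2>"}

def search_identifiers(selected):
    """Steps 710/715: identifiers of images whose metadata contains every selected category."""
    return [iid for iid, meta in metadata_db.items()
            if set(selected) <= meta["categories"]]

def retrieve_images(identifiers):
    """Step 720: fetch the display images for the matched identifiers."""
    return {iid: image_db[iid] for iid in identifiers}
```

Keeping `search_identifiers` and `retrieve_images` separate mirrors the independent metadata and image databases; as the text notes, the two steps could equally be combined if metadata is stored with the image.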
- the interface 800 typically includes an application toolbar 810, an image window 820 and a category toolbar 830.
- the application toolbar 810 includes menu options, such as “File”, “Edit”, “View” and “History” menus, as may be found in any application, such as a web browser application or the like. These are typically standard and will not therefore be described in any further detail.
- the image window 820 displays a number of display images, with three images 821, 822, 823 being shown for the purpose of illustration.
- the arrangement of images will vary depending on factors such as the image sizes, screen resolution or the like, and these could be laid out dynamically, in accordance with templates or the like.
- a scroll bar 824 can also be provided allowing additional images to be displayed should these be available, thereby allowing a user to browse any number of images.
- the category toolbar 830 includes location and experience categories 831, 832, with controls 833, 834 being used to allow users to browse through different categories presented in the toolbar as shown.
- the selected categories include “LA”, “Beach” and “Surfing”, in which case the image of the surfer at Santa Monica Beach, which is mentioned above, could be displayed as one of the images 821, 822, 823.
- at step 730 the user can revise the categories selected, for example by selecting additional categories, deselecting categories or selecting alternative categories using the category toolbar 830. This causes the process to return to step 710, allowing steps 710 to 725 to be repeated so that alternative images are displayed.
- step 735 the user can also select actions to be performed, either with the currently selected categories and/or associated with particular ones of the display images, with the actions being performed at step 740 as will be described in more detail below.
- the processing systems 210 include an account server, metadata server, image server and map server.
- the account server is a web server that interacts with the client device 203, allowing the user to access a website interface and administer a user account, and is responsible for coordinating additional actions.
- the metadata server interacts with a metadata database storing metadata relating to the images, allowing the searching process to be performed.
- the image server interacts with an image database allowing images to be retrieved and provided to client devices.
- the map server is adapted to provide mapping information and is therefore typically a GIS (Geographic Information System) server, or the like. It will be appreciated that these names are used for the purpose of illustration only, in particular to distinguish between the different servers used and these terms are not therefore intended to be limiting.
- the user accesses a webpage hosted by the account server using the client device 203, and at step 905 undergoes an optional login process, for example by providing a username and password or the like.
- a user account is typically established during a registration process, and the user account can be used to store a user profile including the username and password, demographic information relating to the user, as well as other information, such as previously identified images of interest, billing information, or the like. This will be performed in accordance with normal hosting procedures and will not therefore be described in any more detail.
- the account server provides the webpage code, allowing the webpage to be displayed to the user on the client device 203.
- the webpage is typically initially displayed with default location and experience categories selected.
- the currently selected categories which are initially the default categories, are sent to the metadata server by the client device 203.
- the metadata server searches the metadata of reference images in order to identify display images within the selected categories. This process will be substantially as previously described and will involve comparing the selected categories to the categories and/or actual location and/or experiences specified in the metadata. This will not therefore be described in any further detail.
- the metadata server returns the image identifier of each display image to the client device 203, together with information indicative of the image storage location, such as a URL or similar, if this cannot be derived from the image identifier.
- the client device 203 transfers the image identifiers to the image server, which in turn returns images to the client devices at step 935.
- images are displayed, allowing the user to browse through images at step 945 and optionally update selected categories at step 950.
- the user can proceed with selecting actions to be performed.
- the user interface can display icons allowing different actions to be selected as shown in Figure 10.
- similar reference numerals increased by 200 are used to describe similar features to those previously described with reference to Figure 8, and these will not therefore be described in any further detail.
- each image 1021, 1022, 1023 is associated with a respective image action toolbar 1021.1, 1022.1, 1023.1.
- a further gallery action toolbar 1025 is also provided within the image window 1020.
- Each action toolbar includes icons corresponding to respective actions, with six being shown in this example corresponding to play, add, information, map, favourite and share actions respectively (as viewed from left to right in Figure 10). It will be appreciated that these actions are for the purpose of example only and that in practice any number of different actions could be displayed depending on the preferred implementation.
- icons are shown in all of the image action and gallery action toolbars 1021.1, 1022.1, 1023.1, 1025, however, this is not essential, and icons may only be displayed if corresponding actions are available.
- the play media icon may only be displayed for a gallery, allowing media such as music associated with the selected categories to be presented.
- the play media icon could be displayed with individual images if associated media is available corresponding to that image, for example in the event sound was recorded when the image was being captured.
- the user interface could be populated based on the content of the reference image metadata, for example in the event that corresponding media is defined within the metadata.
- the user selects a favourite icon in one of the image action toolbars 1021.1, 1022.1, 1023.1, so that the corresponding display image is added to a favourites list.
- the client device provides the image identifier of the respective display image to the account server, causing the account server to store the image identifier in a favourites list associated with the user profile at step 1110.
- the user can subsequently select a view favourite images option, for example via another interface (not shown).
- the account server retrieves the image identifiers of any images in the favourites list and returns these to the client device 203.
- the client device can then provide the image identifiers to the image server at step 1125, allowing the images to be returned and displayed at step 1130.
- the images could be displayed using an interface similar to the interface described above, or alternatively using a thumbnail gallery or the like, depending on the preferred implementation. In either case, as the images could potentially span a large number of different categories, the category toolbar can be omitted or unpopulated for simplicity.
- the user can interact with the favourite images in a manner similar to any display images, for example by selecting additional actions to be performed, including viewing additional information, sharing, deselecting the image as a favourite image, adding the image to an itinerary, or the like.
- users can filter favourites, for example based on categories, allowing them to refine their selection of images and categories of interest. This will not therefore be described in any further detail.
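The favourites workflow above can be sketched in a few lines; the per-user storage layout and the category filter are assumptions for illustration, not the claimed implementation.

```python
favourites = {}  # user_id -> ordered list of favourite image identifiers

def add_favourite(user_id, image_id):
    # Record the image identifier against the user profile (step 1110),
    # ignoring duplicates.
    favourites.setdefault(user_id, [])
    if image_id not in favourites[user_id]:
        favourites[user_id].append(image_id)

def filter_favourites(user_id, category, metadata_db):
    # Refine the favourites list to images tagged with the given category.
    return [iid for iid in favourites.get(user_id, [])
            if category in metadata_db[iid]["categories"]]
```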
- a user selects a share icon in one of the image action toolbars 1021.1, 1022.1, 1023.1.
- a selection of options allowing the user to identify a recipient will typically be displayed. For example, this could involve having the interface display different delivery mechanisms, such as email, social media networks, or the like, together with a field allowing the user to enter a recipient address.
- at step 1210 an image identifier of the corresponding display image and the recipient address are provided to the account server, which then transfers the image identifier to the recipient address at step 1215.
- the recipient receives the image identifier on their own client device.
- the image identifier will typically be in the form of a link, such as a URL or the like, allowing the user to select the link, causing the image identifier to be provided to the image server at step 1225, with the image being returned at step 1230.
- the link could be used to display the image only, or alternatively could be used to direct the recipient to a webpage interface hosted by the account server, with the image displayed thereon. This latter option is typically preferred as this will drive additional traffic to the account host's webpage, and could entice the recipient to engage in a process of reviewing images and selecting a destination of interest.
- a user selects a map view option.
- the map icon on the gallery action toolbar 1025 is selected, but it will be appreciated this process could alternatively be performed for an individual image, a favourites collection, or the like.
- at step 1305 image identifiers of the display images currently being displayed in the gallery are provided to the metadata server, which returns the coordinates associated with each of the display images from the corresponding metadata at step 1310.
- at step 1315 the coordinates are sent from the client device 203 to a map server, allowing the map server to return map information at step 1320.
- the map information corresponds to a map including each of the specified coordinates, and accordingly the extent of the map will depend on the geographical distribution of the locations of the selected images.
- the client device 203 can display the map together with a point of interest (POI) layer, which includes respective POI icons 1401 associated with each of the display images, as shown in Figure 14.
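The map-extent behaviour described above — a map whose bounds cover every POI coordinate — can be sketched as a simple bounding-box calculation. The `(lat, lon)` format and the padding margin are assumptions for illustration.

```python
def map_bounds(coords, margin=0.05):
    """Return (min_lat, min_lon, max_lat, max_lon) covering all POIs,
    padded by an assumed margin in degrees."""
    lats = [lat for lat, _ in coords]
    lons = [lon for _, lon in coords]
    return (min(lats) - margin, min(lons) - margin,
            max(lats) + margin, max(lons) + margin)
```

A geographically dispersed gallery therefore yields a wide map, whilst images clustered in one city yield a close-up view.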
- whilst the process could end at this point, additional interaction could also be performed.
- the user could interact with the map using normal map interaction techniques, such as zoom, pan, scroll, or the like.
- the user can select a POI 1401, as shown at 1401.1, allowing further information to be viewed.
- an indication of an image identifier corresponding to the POI is transferred to the metadata server, allowing information regarding the image to be returned at step 1335, whilst the image identifier is also used to retrieve an image at steps 1340 and 1345.
- This allows information associated with the image, such as an image description, as well as the image to be displayed, for example using respective pop-up boxes 1402, 1403.
- at step 1500 the user uses the add icon to add images to a trip itinerary.
- the add icon on the gallery action toolbar 1025 is selected, so that a number of images in a gallery can be added to an itinerary, but it will be appreciated this process could alternatively be performed for individual images or a favourites collection, or the like.
- Image identifiers associated with the currently displayed images are retrieved by the account server at step 1505 and forwarded to the metadata server at step 1510. This can be performed either directly or via the client device 203 depending on the preferred implementation.
- at step 1515 locations and experiences corresponding to each of the image identifiers are returned to the account server, with these being used to calculate an itinerary at step 1520.
- This can be performed in accordance with any standard route planning process, and would therefore typically involve choosing a sequence of locations and/or experiences utilizing a suitable optimisation algorithm.
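The specification leaves the optimisation algorithm open; as one sketch of "choosing a sequence of locations", a greedy nearest-neighbour ordering over `(lat, lon)` pairs is shown below. This is an illustrative heuristic only, not the claimed or a recommended route planner.

```python
import math

def order_itinerary(locations):
    """Greedily sequence locations starting from the first one,
    always visiting the nearest remaining location next."""
    remaining = list(locations[1:])
    route = [locations[0]]
    while remaining:
        last = route[-1]
        nearest = min(remaining, key=lambda p: math.dist(last, p))
        route.append(nearest)
        remaining.remove(nearest)
    return route
```

In practice a standard optimisation algorithm (or a routing service) would account for travel times, transport modes and dates rather than straight-line distance.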
- at step 1525 an itinerary can be displayed, an example of which is shown in Figure 16.
- the user interface 1600 includes an application toolbar 1610 and itinerary window 1620.
- the itinerary window 1620 includes a number of thumbnails 1621, 1622, 1623, 1624, 1625, 1626, 1627, 1628, each corresponding to a separate location activity, or attraction on the itinerary, and each typically including one or more of the corresponding display images. Additionally and/or alternatively, the thumbnails could correspond to different days in a form of calendar view.
- Each thumbnail may also include a respective toolbar 1621.1, 1622.1, 1623.1, 1624.1, 1625.1, 1626.1, 1627.1, 1628.1, which can provide associated options, such as to book accommodation, transport, entrance tickets, or the like, as well as to review additional information, such as reviews, details of individual activities for a given day or location, or the like.
- This process is typically performed by the accounts server based on the location information provided by the metadata server.
- the user can then interact with and modify the itinerary at step 1535, for example by adding or removing locations or experiences, adding bookings for accommodation, transport or the like.
- This process can be performed iteratively, allowing the user to interactively prepare the itinerary.
- the user is able to interact with display images using categories as previously described. For example, an itinerary could be saved, allowing the user to return to the previously described process to browse further images, with these being used to add additional locations and/or destinations to the itinerary as desired.
- the user can also select a map view option, allowing a map similar to that shown in Figure 14 to be displayed together with a travel route between the locations and experiences. Additionally, pricing could be generated dynamically based on selected accommodation and transport options, allowing the user to more easily adapt the itinerary to meet their requirements.
- at step 1540 the user can proceed with making a booking, which can be achieved using standard booking techniques and will not therefore be described in any further detail.
- the system can be used to gather information regarding users and their behaviours, which can in turn be leveraged in business processes, such as targeted advertising or the like. An example of this will now be described with reference to Figure 17.
- one or more users are selected. This could be performed automatically, or could be performed manually, for example by an operator of the account server. This can be performed using the account server directly, or may be performed using a separate remote computer system.
- the one or more users could be selected in any manner and this could involve an operator reviewing a list of users, or alternatively selecting users based on criteria, such as users with specified activity in given time periods, or that have searched in specific locations or experience categories. For example, this could be used to identify users that have selected favourite or shared images within the last six months, but have not yet proceeded with booking any travel or preparing an itinerary.
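The selection criterion described above can be sketched as a simple filter. The user-record field names, and the representation of dates as day ordinals, are assumptions for the sketch.

```python
SIX_MONTHS_DAYS = 182  # assumed approximation of "the last six months"

def select_users(users, today_ordinal):
    """Users active (favourited/shared) within six months who have not yet
    made a booking or prepared an itinerary."""
    return [u["name"] for u in users
            if today_ordinal - u["last_activity_ordinal"] <= SIX_MONTHS_DAYS
            and not u["has_booking"] and not u["has_itinerary"]]
```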
- the account server obtains image identifiers of the images that the user has recorded as favourites, forwarding these to the metadata server at step 1710.
- the metadata server retrieves a list of experiences and locations associated with the images, providing these to the account server at step 1715.
- the analysis of the locations and experiences can be performed, with this being used to allow a business process to be implemented at step 1725. The nature of the analysis performed will vary depending on the preferred implementation and the business process being performed.
- the business process could involve attempting to identify trends in popularity of different experiences and/or locations, or combinations of the two. This can be used to anticipate future bookings and ensure adequate availability.
- this can be used to analyse user behaviours, for example to anticipate when individuals are likely to make a booking. This allows advertising or discounts to be delivered in a more targeted manner, maximising their impact. For example, if it is possible to predict when a user is likely to book a trip, advertising can be performed close to that time, encouraging the user to make a booking using the system.
- the above described system allows users to review and interact with images by way of location and experience categories, allowing the user to identify destinations of interest.
- the destinations can then be used to construct an itinerary, which can be reviewed and modified as required, until the user is ready to make a booking.
- the process of interacting with the images naturally leads the user through the process of developing an itinerary and hence making a booking, increasing the opportunity for the host of the system to convert casual interest in images into revenue.
- the early stage involves providing a mechanism to allow customers to explore an image library, selecting and sharing images with friends and family. This is designed to facilitate exploration of the images, with a simple interface allowing customers to connect with conscious, and unconscious, travel aspirations.
- Customers can access an entire image library, with tens of thousands of up-to-date images from the most popular and exotic destinations in the world right down to targeted Points of Interest.
- the images are associated with supporting metadata, which not only aids the search process, but also allows users to gain an understanding of the destinations being shown in the images, thereby facilitating further research. For example, this allows customers to explore geographically via simple controls that allow them to drill down from continent level to local POIs. Customers can also explore the library via experiences and a combination of geographic selections with multiple experiences (e.g. Europe + Castles + Lakes).
- the system provides a gallery space for users to view, arrange and filter favourite images.
- this allows multiple images to be viewed, with an optional endless left-to-right scroll being used to allow large numbers of images to be accommodated.
- Other display options can also be provided, such as full screen slide shows, thereby enhancing the effect.
- Additional information, including text and video can be used to provide further information, with icons being displayed on the interface to allow users to select images as favourites, share images, or the like.
- Dynamic map views can also be provided as an alternative bird's-eye view of search results. Throughout this process, the location and experience category based navigation remains active, so that users can refine the categories and hence display images, thereby allowing them to progressively focus in on destinations of interest.
- the system allows for information rich navigation, which can be used to generate individual landing pages corresponding to individual galleries, to cover every geographic and experience combination (e.g. Castles + Europe; Beach + Bahamas).
- Optimized pages can also include original copy, title text, header elements, image tags and teaser text for each image, with an information icon linking to city destination guide pages providing a path to conversion at any stage.
- Icons can be used to facilitate sharing of individual images or entire galleries and can be performed via email and social media. Friends can explore and then join, allowing them to save and share results.
- An Add a Trip icon can be used to add destinations, places and POIs to new or existing trip itineraries.
- Images can be viewed in a variety of ways, including gallery or full-screen modes, or with additional information, such as links to city destination guide pages or the like. Map view options and universal navigational icons remain constant throughout, allowing users to tag, share or add favourite images or galleries to an itinerary at any time.
- the next stage provides a fully integrated planning and booking environment designed to turn customers' travel dreams into realities. Multiple trips can be named, edited, saved and shared. This dramatically simplifies the often complex process of planning and booking trips and provides an easy to understand and manage overview of a single trip.
- Destinations within a trip can be easily rearranged and amended, to provide full flexibility on dates and durations.
- Expandable transport linking panels between each destination indicate transport options and booking status. Indicative pricing is calculated in accordance with travel preferences and timings, and is dynamically updated as the itinerary evolves.
- a sliding itinerary overview can be used to show broad details at a glance, or a close-up view of a single destination within a trip, allowing customer access to flights, hotels, car rental and Points of Interest. Dates can be preloaded into any booking screens to simplify the booking process. Activities and POIs can be perused and short-listed, and where available, tickets can be booked. A notes icon makes saving and sharing comments, details and reminders straightforward.
- Different views can be used to show a sequential breakdown of each day, including hotel and transport arrangements, with estimated travel times between transfer points, hotels and POIs being indicated. Destination details can also be viewed on a map. This allows customers to dynamically rearrange the order of their itinerary to ensure that the trip meets their requirements.
- Plans and itineraries relating to future trips can be shared via social media or via email, using similar techniques to sharing of images.
- a collaborate option can be used to allow co-planning between multiple users, regardless of location, thereby allowing groups to more easily plan and coordinate travel.
- a note function can be provided to make saving and sharing comments and reminders easy. Rich media slide shows at full screen maintain customer excitement through to final conversion.
- the system can also be adapted to provide a post-trip platform where customers' trip images, stories and discoveries can be stored, relived and shared online, and, be printed as high-quality albums. Customers can add their own unique places, discoveries and stories, upload images, and store trip details including maps and hotel details for future reference.
- a similar process could be applied to real estate by categorising images of different properties based on categories such as locations and/or property features, including the type of property, number of rooms, presence of parking, swimming pools, or the like. Users can view images of different properties selected by category and then review images allowing the category selection to be progressively refined until one or more properties of interest are identified.
- the system could also be used for other forms of purchase, such as retail shopping.
- categories could include different types of product, such as shoes, handbags, or the like, and features of products, such as size, colour, shape, or the like.
- the system can be applied more generally to any target of interest by displaying categorised images, and then using this to refine the categories and hence display more relevant images.
- the apparatus can be provided for use in identifying targets of interest.
- the apparatus can include one or more electronic processing devices that determine at least one selected category, select display images from a plurality of reference images, wherein each of the reference images are associated with metadata indicative of at least one reference image category and wherein the display images are selected at least partially in accordance with the at least one selected category, and cause at least some of the display images to be displayed to the user. This allows the user to review the display images and refine the at least one selected category, allowing the user to iteratively review display images to thereby identify targets of interest.
Abstract
Apparatus for use in identifying destinations of interest, the apparatus including one or more electronic processing devices that determine at least one selected category, the selected category including at least one of a location category and an experience category; select display images from a plurality of reference images, wherein each of the reference images are associated with metadata indicative of at least one of a reference image location and a reference image experience and wherein the display images are selected at least partially in accordance with the at least one selected category; and, cause at least some of the display images to be displayed to the user to allow the user to review the display images and refine the at least one selected category allowing the user to iteratively review display images to thereby identify destinations of interest.
Description
IDENTIFYING METHOD AND APPARATUS Background of the Invention
[0001] This invention relates to a method and apparatus for identifying targets of interest and in one particular example for allowing users to review images of experiences and locations to identify destinations of interest, for example for use in travel planning.
Description of the Prior Art
[0002] The reference in this specification to any prior publication (or information derived from it), or to any matter which is known, is not, and should not be taken as an acknowledgment or admission or any form of suggestion that the prior publication (or information derived from it) or known matter forms part of the common general knowledge in the field of endeavour to which this specification relates.
[0003] When booking travel, it is typical for individuals to perform research to identify destinations of interest. This is currently largely performed using a variety of different resources, such as a combination of websites, television programs, personal recommendation or the like, in order to allow individuals to choose the destinations they wish to visit. They must then manually plan the trip, before using a booking system, travel agent or the like, in order to book the travel. As a result the travel booking process tends to be largely disjointed, and also typically requires the individual to have at least some idea of what they are interested in before research can commence.
Summary of the Present Invention
[0004] In one broad form the present invention seeks to provide an apparatus for use in identifying destinations of interest, the apparatus including one or more electronic processing devices that:
a) determine at least one selected category, the selected category including at least one of:
i) a location category; and,
ii) an experience category;
b) select display images from a plurality of reference images, wherein each of the reference images are associated with metadata indicative of at least one of a reference image location and a reference image experience and wherein the display images are selected at least partially in accordance with the at least one selected category; and,
c) cause at least some of the display images to be displayed to the user to allow the user to:
i) review the display images; and,
ii) refine the at least one selected category allowing the user to iteratively review display images to thereby identify destinations of interest.
[0005] Typically the one or more electronic processing devices select the display images in accordance with a category structure defining relationships between location categories and between experience categories.
[0006] Typically the one or more electronic processing devices select display images having at least one of a reference image location and reference image experience including at least one of a location category and experience category within the selected category.
[0007] Typically, in response to display of at least some of the display images, the one or more electronic processing devices:
a) determine user interaction with at least one of the images; and,
b) at least partially refine the at least one selected category in accordance with the user interaction.
[0008] Typically, in response to display of at least some of the display images, the one or more electronic processing devices:
a) determine at least one selected category in accordance with user input commands; and,
b) select display images from the plurality of reference images at least partially in accordance with the at least one further selected category.
[0009] Typically the one or more electronic processing devices:
a) cause a list of categories to be displayed to a user; and,
b) determine at least one selected category in accordance with user interaction with the list of categories.
[0010] Typically the list of categories includes a list of location categories and a list of experience categories.
[0011] Typically the list of categories is displayed in accordance with at least one of:
a) a number of images in each category;
b) a popularity of each category; and,
c) a category structure.
[0012] Typically the one or more electronic processing devices cause a user interface to be displayed to a user on a client device, the user interface including:
a) a number of display images; and,
b) at least one toolbar including a list of categories.
[0013] Typically the one or more electronic processing devices perform at least one action associated with one or more of the display images.
[0014] Typically the one or more electronic processing devices determine at least one action associated with a respective display image in accordance with at least one of:
a) user input commands; and,
b) selection of an icon displayed coincidently with the respective display image.
[0015] Typically the at least one action includes at least one of:
a) sharing at least one display image with a recipient;
b) recording an indication of at least one display image as a favourite image;
c) displaying additional information relating to at least one display image; and,
d) updating an itinerary at least partially in accordance with at least one display image.
[0016] Typically the one or more electronic processing devices share at least one display image by:
a) determining an image identifier associated with the at least one display image; and,
b) transferring an indication of the image identifier to a destination.
[0017] Typically the one or more electronic processing devices flag at least one display image as a favourite image by:
a) determining an image identifier associated with the at least one display image; and,
b) recording an indication of the image identifier as part of user data associated with the user.
[0018] Typically the one or more electronic processing devices:
a) select display images by searching a metadata store including metadata for the plurality of reference images;
b) retrieve image identifiers for the display images from the metadata store; and,
c) use the image identifiers to retrieve display images from an image store.
[0019] Typically the one or more electronic processing devices include a metadata server that:
a) receives an indication of at least one selected category;
b) selects the display images;
c) determines an image identifier associated with each display image; and
d) provides an indication of the image identifier.
[0020] Typically the one or more electronic processing devices include an image server that:
a) receives an indication of at least one image identifier;
b) uses the at least one image identifier to retrieve at least one image from an image store; and
c) provides the images.
[0021] Typically:
a) a metadata server:
i) receives an indication of at least one selected category from a client device via a communications network; and,
ii) provides an indication of the image identifier to the client device via the communications network; and,
b) an image server:
i) receives the indication of at least one image identifier from the client device via the communications network; and,
ii) provides the image to the client device via the communications network.
[0022] Typically the one or more electronic processing devices include an account server that performs actions at least in part using image identifiers received from a client device via a communications network.
[0023] Typically the one or more electronic processing devices include an account server that:
a) receives an action request associated with at least one display image from a client device via a communications network;
b) determines the image identifier of the at least one display image;
c) determines at least one of a location category and an experience category using the image identifier; and,
d) performs an action at least in part using the at least one of a location category and an experience category.
[0024] Typically the account server:
a) transfers the image identifier to a metadata server; and,
b) receives an indication of the at least one of a location category and an experience category from the metadata server.
[0025] Typically the account server:
a) determines selected display images in accordance with user selections made via a client device;
b) determines locations and experiences associated with selected display images; and,
c) uses the locations and experiences to at least partially prepare a trip itinerary.
[0026] Typically the selected display images are at least in part used to book a trip.
[0027] Typically the one or more electronic processing devices:
a) select display media from a plurality of reference media, wherein each of the reference media are associated with metadata indicative of at least one of a reference media location and a reference media experience and wherein the display media are selected at least partially in accordance with the at least one selected category; and,
b) cause at least some of the display media to be displayed to the user.
[0028] In another broad form the present invention seeks to provide a method for use in reviewing images of destinations, the method including, in one or more electronic processing devices:
a) determining at least one selected category, the selected category including at least one of:
i) a location category; and,
ii) an experience category;
b) selecting display images from a plurality of reference images, wherein each of the reference images are associated with metadata indicative of a reference image location and a reference image experience and wherein the display images are selected at least partially in accordance with the at least one selected category; and,
c) causing at least some of the display images to be displayed to the user to allow the user to:
i) review the display images; and,
ii) refine the at least one selected category allowing the user to iteratively review display images to thereby identify destinations of interest.
[0029] In another broad form the present invention seeks to provide an apparatus for use in identifying targets of interest, the apparatus including one or more electronic processing devices that:
a) determine at least one selected category;
b) select display images from a plurality of reference images, wherein each of the reference images are associated with metadata indicative of at least one of a
reference image category and wherein the display images are selected at least partially in accordance with the at least one selected category; and,
c) cause at least some of the display images to be displayed to the user to allow the user to:
i) review the display images; and,
ii) refine the at least one selected category allowing the user to iteratively review display images to thereby identify targets of interest.
[0030] Typically the target includes at least one of:
a) real estate;
b) products; and,
c) destinations.
[0031] Typically the categories include at least one of:
a) locations;
b) types;
c) features; and,
d) experiences.
[0032] In another broad form the present invention seeks to provide a method for use in identifying targets of interest, the method including, in one or more electronic processing devices:
a) determining at least one selected category;
b) selecting display images from a plurality of reference images, wherein each of the reference images are associated with metadata indicative of at least one of a reference image category and wherein the display images are selected at least partially in accordance with the at least one selected category; and,
c) causing at least some of the display images to be displayed to the user to allow the user to:
i) review the display images; and,
ii) refine the at least one selected category allowing the user to iteratively review display images to thereby identify targets of interest.
Brief Description of the Drawings
[0033] An example of the present invention will now be described with reference to the accompanying drawings, in which:
[0034] Figure 1 is a flowchart of an example of a method for use in identifying destinations of interest;
[0035] Figure 2 is a schematic diagram of an example of a distributed computer architecture;
[0036] Figure 3 is a schematic diagram of an example of a processing system of Figure 2;
[0037] Figure 4 is a schematic diagram of an example of a client device of Figure 2;
[0038] Figure 5 is a flowchart of an example of a method of assigning images to categories;
[0039] Figure 6 is a schematic diagram of an example of a category structure;
[0040] Figure 7 is a flowchart of an example of a method of browsing images to identify destinations of interest;
[0041] Figure 8 is a schematic diagram of an example of a user interface for use in the method of Figure 7;
[0042] Figures 9A and 9B are a flowchart of a further specific example of a method for use in browsing images to identify destinations of interest;
[0043] Figure 10 is a schematic diagram of an example of a user interface used in the method of Figures 9A and 9B;
[0044] Figure 11 is a flowchart of an example of a method for selecting and viewing favourite images;
[0045] Figure 12 is a flowchart of an example of a method of sharing images;
[0046] Figures 13A and 13B are a flowchart of an example of a method for viewing maps;
[0047] Figure 14 is a schematic diagram of an example of a user interface for use in the method of Figures 13A and 13B;
[0048] Figure 15 is a flowchart of an example of a method for booking a trip;
[0049] Figure 16 is a schematic diagram of an example of a user interface showing an itinerary; and,
[0050] Figure 17 is a flow chart of an example of a business intelligence process.
Detailed Description of the Preferred Embodiments
[0051] An example of a method for identifying destinations of interest will now be described with reference to Figure 1.
[0052] In this example, it is assumed that the process is performed at least in part using an electronic processing device forming part of a processing system, which may in turn be connected to one or more other processing systems or client devices via a network architecture, as will be described in more detail below.
[0053] For the purpose of example, the term "reference image" is used to refer to an image of a destination that is typically associated with defined metadata. In this regard, the term "metadata" will be understood to include information regarding the image, including at least a location shown in or otherwise associated with the image and/or an experience shown in or otherwise associated with the image, as well as any other relevant information. The metadata can be stored either together with the image or separately from the image and may be defined through a combination of automatic processes, such as during capture of the image and manual processes, for example as part of an image curation process.
[0054] The term "display image" is used to refer to a reference image that is presented to a user on a display device associated with a processing system, client device or the like. The term "destination" is used to broadly refer either to a location or an experience to which an individual may wish to travel. In this regard, a "location" generally refers to a geographic location, whilst the term "experience" refers to an event, activity, attraction, occurrence, landform, or the like, which individuals can participate in, observe or encounter. The terms are used for the purpose of illustration only and are not intended to be limiting.
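Purely by way of illustration, a reference image and its associated metadata could be represented along the following lines (a minimal Python sketch; the field names, types and example values are assumptions for illustration only and do not form part of the described method):

```python
from dataclasses import dataclass, field

@dataclass
class ReferenceImage:
    """A reference image and its associated metadata.

    The metadata records at least a location and/or an experience,
    and may be populated automatically at capture time or manually
    during curation, as described above.
    """
    image_id: str  # unique identifier, used for retrieval and sharing
    locations: set = field(default_factory=set)    # e.g. {"australia", "queensland"}
    experiences: set = field(default_factory=set)  # e.g. {"beach", "surfing"}

# Example: an image captured at a Queensland beach, curated with experience tags
img = ReferenceImage("img-001", {"australia", "queensland"}, {"beach", "surfing"})
```

Holding the locations and experiences as sets of category names makes the category-based matching described below straightforward to express.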
[0055] In this example, at step 100 at least one selected category is determined by the one or more electronic processing devices. The at least one selected category includes at least one of a location category and an experience category, and can include both a location and experience category.
[0056] The at least one selected category can be determined in any one of a number of manners depending upon the preferred implementation. For example, a user interface can be
displayed by the one or more electronic processing devices, which includes details of available location and/or experience categories, allowing these to be selected by the user. Alternatively, categories could be selected at least in part through automated processes, for example on the basis of previously viewed categories, default categories, categories associated with selected images, on the basis of keyword searches or the like.
[0057] At step 110 the one or more electronic processing devices select one or more display images from a plurality of reference images. Each of the reference images is associated with metadata indicative of at least one of a reference image location and/or a reference image experience, allowing the display image(s) to be selected at least in part in accordance with the at least one selected category. In this regard, a hierarchical category structure can be defined allowing display images to be selected that are in currently selected categories, or subcategories thereof, as will be described in more detail below.
[0058] At step 120, at least some of the display images are displayed to the user allowing the user to review the display images. The user can then refine the one or more selected categories, in turn allowing the user to iteratively update and review the display images to thereby identify destinations of interest. This can be achieved using any suitable technique and could include selecting additional or alternative categories, or by deselecting categories. Alternatively, this could be achieved by selecting images of interest and then deriving category selections from the selected images.
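For the purpose of illustration only, the selection and iterative refinement described in steps 110 and 120 could be sketched as follows, assuming a matching rule in which an image's combined metadata must cover every selected category (one possible rule among many; the catalogue contents are invented):

```python
def select_display_images(reference_images, selected_categories):
    """Return images whose combined location/experience metadata covers
    every selected category (a simplified 'AND' matching rule)."""
    return [
        img for img in reference_images
        if selected_categories <= (img["locations"] | img["experiences"])
    ]

catalogue = [
    {"id": "img-1", "locations": {"queensland"}, "experiences": {"beach", "surfing"}},
    {"id": "img-2", "locations": {"tasmania"},   "experiences": {"hiking"}},
    {"id": "img-3", "locations": {"queensland"}, "experiences": {"beach"}},
]

# Initial selection, followed by an iterative refinement adding a category
first_pass = select_display_images(catalogue, {"beach"})             # img-1, img-3
refined    = select_display_images(catalogue, {"beach", "surfing"})  # img-1 only
```

Each refinement of the selected categories simply re-runs the selection, narrowing (or broadening) the display images presented to the user.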
[0059] In any event, the above described method utilises categories to search and browse images relating to a range of different destinations, allowing the user to review images and use this to refine the selected categories, thereby allowing additional relevant images to be displayed. This allows the user to go through an iterative process of progressively refining categories by identifying images of interest, in turn allowing them to identify one or more destinations of interest.
[0060] Subsequently to identifying categories of interest, one or more additional actions can be performed, such as sharing images, identifying favourite images, viewing maps showing locations of destinations of interest, viewing additional information associated with destinations of interest, or the like. Additionally and/or alternatively, destinations of interest
can be used in developing a travel itinerary and/or booking an excursion, such as a holiday, or the like.
[0061] Accordingly, the above described process allows users to go through a process of browsing images relating to various destinations, allowing them to select destinations of interest, including different experiences and/or different locations. This can then be used in performing further actions, including preparing a travel itinerary and/or booking.
[0062] It will be appreciated that this provides a more straightforward mechanism for allowing individuals to prepare and book travel. In particular, the user is simply presented with images with which they can interact, allowing them to progressively refine locations and experiences to thereby select destinations that are of interest. Thus, this facilitates users in performing background research, allowing them to more intuitively identify destinations of interest, making it easier for them to plan travel, hence increasing the likelihood that casual browsing will lead to a concrete booking.
[0063] A number of further features will now be described.
[0064] In one example, the one or more electronic processing devices select the display images in accordance with a category structure defining relationships between location categories and between experience categories. This allows the display images to be selected based on relationships between categories, for example between experience and location categories, and/or based on a hierarchy, such as parent/child relationships.
[0065] In one example, the one or more electronic processing devices select display images having at least one of a reference image location and reference image experience including at least one of a location category and experience category within the selected category. Thus, when images are initially categorised, these can be assigned to multiple categories using the category structure, so that the reference image location/experience could include multiple location/experience categories. Alternatively, the one or more electronic processing devices select display images having at least one of a reference image location and reference image experience within the selected category or within a child category of the selected category.
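By way of illustration, a hierarchical category structure of the kind described can be represented as a parent-to-children mapping, with a selected category expanded to include its descendants before matching, so that images tagged with a child category match a parent selection (the category names are hypothetical):

```python
# Parent -> children relationships in a simple category structure
CATEGORY_TREE = {
    "water sports": {"surfing", "diving"},
    "beach": set(),
    "surfing": set(),
    "diving": set(),
}

def expand_category(category, tree):
    """Return the category together with all of its descendants,
    so a parent selection also matches images in child categories."""
    result = {category}
    for child in tree.get(category, set()):
        result |= expand_category(child, tree)
    return result

# Selecting "water sports" also matches images tagged "surfing" or "diving"
expanded = expand_category("water sports", CATEGORY_TREE)
```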
[0066] Additionally, it will be appreciated that the selected categories can include multiple categories, for example, by selecting a location and experience category, or by selecting multiple complementary experience categories. Thus, a user could select a beach and a surfing category, with images being displayed that relate to both surfing and beaches. In a further example, categories could be explicitly excluded, so that a user could select a beaches category, but exclude the surfing category, thereby excluding surfing related images.
[0067] In one example, in response to display of at least some of the display images, the one or more electronic processing devices determine user interaction with at least one of the images and at least partially refine the at least one selected category in accordance with the user interaction. For example, the user could select to "like" or "dislike" an image, in which case additional categories could be added or excluded. For example, if a user selects a beach category and then "likes" a surfing image, this could cause a surfing category to be added to the selected categories.
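One possible refinement policy of this kind could be sketched as follows (a simplified illustration only: the categories of a "liked" image are added to the selection, and those of a "disliked" image removed):

```python
def refine_selection(selected, liked_image_meta=None, disliked_image_meta=None):
    """Adjust the selected categories from user interaction with an image:
    add the categories of a 'liked' image, remove those of a 'disliked'
    image (one possible refinement policy, not the only one)."""
    selected = set(selected)
    if liked_image_meta:
        selected |= liked_image_meta
    if disliked_image_meta:
        selected -= disliked_image_meta
    return selected

# A user with "beach" selected likes a surfing image, adding "surfing"
refined = refine_selection({"beach"}, liked_image_meta={"surfing"})
```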
[0068] Similarly, in response to display of at least some of the display images, the one or more electronic processing devices determine at least one selected category in accordance with user input commands and select display images from the plurality of reference images at least partially in accordance with the at least one further selected category. Thus, the user can add further categories or deselect existing categories, allowing the images that are displayed to be modified accordingly.
[0069] Selection of categories can be achieved in any appropriate manner, but in one example, this involves having the one or more electronic processing devices cause a list of categories to be displayed to a user and determine at least one selected category in accordance with user interaction with the list of categories. Thus, the user can select/deselect categories in the list allowing these to be added or removed from the selected categories.
[0070] The list typically includes a list of location categories and a list of experience categories, and is optionally displayed in accordance with at least one of a number of images in each category, a popularity of each category, or a category structure. Whilst the list could be displayed alphabetically, ordering it based on the number of images and/or popularity allows the list to be prioritised, enhancing the effectiveness of the list and maximising the number, relevance or appeal of images presented to the user.
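Purely as an illustration, ordering the category list by image count with popularity as a tie-breaker could be expressed as follows (the counts and popularity figures are invented for the example):

```python
def order_category_list(categories, image_counts, popularity):
    """Order categories for display: most images first, ties broken
    by popularity (one of the orderings described above)."""
    return sorted(
        categories,
        key=lambda c: (image_counts.get(c, 0), popularity.get(c, 0)),
        reverse=True,
    )

ordered = order_category_list(
    ["hiking", "beach", "surfing"],
    image_counts={"beach": 120, "surfing": 45, "hiking": 45},
    popularity={"surfing": 0.9, "hiking": 0.4},
)
```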
[0071] In one example, interaction is via a user interface that is displayed to a user on a client device, the user interface including a number of display images and at least one toolbar including a list of categories. However, any suitable arrangement could be used.
[0072] In one example, in addition to displaying images, the one or more electronic processing devices perform at least one action associated with one or more of the display images. The action could be of any suitable form and example actions include, but are not limited to, sharing at least one display image with a recipient, recording at least one display image as a favourite image, displaying additional information relating to at least one display image, such as a map or text based explanation of the experience or location, or updating an itinerary at least partially in accordance with at least one display image. Selection of the action can be performed in accordance with user input commands and/or selection of an icon displayed coincidently with the respective display image. This in turn provides a mechanism for users to easily perform actions as part of the process of browsing images, in turn increasing the likelihood that browsing of images will progress into further interaction with the system.
[0073] In terms of sharing the image, this can be achieved in any suitable manner, and in one example, this involves the one or more processing devices determining an image identifier associated with the at least one display image and transferring an indication of the image identifier to a destination. A client device at the destination can then use the identifier to retrieve the image, for example from an image server or the like. Similarly the one or more electronic processing devices can record an indication of at least one display image as a favourite image by determining an image identifier associated with the at least one display image and recording an indication of the image identifier as part of user data associated with the user. This allows an indication of the images to be stored and/or shared easily, without requiring multiple copies of images to be created and stored, thereby minimising storage requirements. This also assists image owners in retaining control of their images, avoiding unregulated copying and hence consequential devaluation of the images.
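The identifier-based sharing and favourite-recording described above can be sketched as follows (a minimal illustration only; the outbox and user data structures are assumptions, the key point being that only identifiers, never image copies, are transferred or stored):

```python
def share_image(image_id, destination, outbox):
    """Share an image by transferring only its identifier; the recipient
    later uses the identifier to retrieve the image from the image server."""
    outbox.append({"to": destination, "image_id": image_id})

def record_favourite(image_id, user_data):
    """Flag an image as a favourite by recording its identifier against
    the user's data, rather than copying the image itself."""
    user_data.setdefault("favourites", []).append(image_id)

outbox, user_data = [], {}
share_image("img-001", "friend@example.com", outbox)
record_favourite("img-001", user_data)
```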
[0074] In one example, the one or more electronic processing devices select display images by searching a metadata store including metadata for the plurality of reference images, retrieve image identifiers for the display images from the metadata store and use the image identifiers to retrieve display images from an image store. Typically, the one or more electronic processing devices include a metadata server that receives an indication of at least one selected category, selects the display images, determines an image identifier associated with each display image and provides an indication of the image identifier. The one or more electronic processing devices also typically include an image server that receives an indication of at least one image identifier, uses the at least one image identifier to retrieve at least one image from an image store and provides the images.
[0075] Accordingly, this allows the images and metadata to be stored separately allowing searching and hosting of images to be performed using separate servers, which can assist balancing workflow, whilst also allowing images to be provided solely on the basis of image identifiers, making actions such as sharing of images more streamlined. This also allows indications of selected images to be stored by third parties in the form of an identifier, without any further information being required, such as details of selected categories, which assists image owners in retaining control of their images and the associated metadata.
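The separation between the metadata server and image server can be illustrated with two independent lookups, one resolving selected categories to image identifiers and one resolving identifiers to image data (the store contents and layout are invented for the example):

```python
# Metadata and images are held in separate stores, queried independently
METADATA_STORE = {
    "img-1": {"locations": {"queensland"}, "experiences": {"surfing"}},
    "img-2": {"locations": {"tasmania"},   "experiences": {"hiking"}},
}
IMAGE_STORE = {"img-1": b"<jpeg bytes 1>", "img-2": b"<jpeg bytes 2>"}

def metadata_server_query(selected_categories):
    """Metadata server: return identifiers of matching images only;
    no image data or metadata leaves the server."""
    return [
        image_id for image_id, meta in METADATA_STORE.items()
        if (meta["locations"] | meta["experiences"]) & selected_categories
    ]

def image_server_fetch(image_ids):
    """Image server: resolve identifiers to image data, with no
    knowledge of categories or metadata."""
    return [IMAGE_STORE[i] for i in image_ids]

ids = metadata_server_query({"surfing"})
images = image_server_fetch(ids)
```

In the client-mediated arrangement described below, the client device would perform the first call against the metadata server and the second against the image server.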
[0076] Whilst the metadata and image servers could communicate directly, more typically communication is via a client device, such as a computer system, tablet, mobile phone or the like. In this case, the metadata server receives an indication of at least one selected category from the client device via the communications network and provides an indication of the image identifiers to the client device via the communications network. Following this, the image server receives an indication of at least one image identifier from the client device via the communications network and provides the image to the client device via the communications network.
[0077] The one or more electronic processing devices can also include an account server that performs actions at least in part using image identifiers. For example, the account server can be used to perform sharing of images, recording indications of favourite images, adding images and/or destinations to itineraries, booking travel, or the like. To achieve this, the account server can receive an action request associated with at least one display image from
the client device via the communications network, determine the image identifier of the at least one display image, determine at least one of a location category and an experience category using the image identifier and perform an action at least in part using the at least one of a location category and an experience category.
[0078] In this example, the account server typically transfers the image identifier to a metadata server and receives an indication of the at least one of a location category and an experience category from the metadata server. This can be performed directly, or via the client device, depending on the preferred implementation. In either case, the account server need only know the image identifier of an image selected by the user, and can then retrieve required information, such as the location and experience associated with the image as required. This reduces the storage requirements of the account server, whilst limiting access to the image metadata, allowing commercial value in the metadata to be maintained.
[0079] In one particular example, the account server determines selected display images in accordance with user selections made via a client device, determines locations and experiences associated with selected display images and uses the locations and experiences to at least partially prepare a trip itinerary and/or book a trip. Thus, the account server can receive identifiers of images of interest to the user, and then use these to receive location and experience information from the metadata server, with this in turn being used to prepare a travel itinerary.
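As an illustration only, the account server's itinerary preparation could be sketched as resolving selected image identifiers to their locations and experiences and aggregating the results (a simplified model in which the metadata server lookup is represented by a callable; the data is invented):

```python
def prepare_itinerary(selected_image_ids, metadata_lookup):
    """Account server sketch: resolve the user's selected image
    identifiers to locations/experiences via the metadata server,
    then aggregate them into a draft trip itinerary."""
    itinerary = {"locations": set(), "experiences": set()}
    for image_id in selected_image_ids:
        meta = metadata_lookup(image_id)  # stands in for a metadata server call
        itinerary["locations"] |= meta["locations"]
        itinerary["experiences"] |= meta["experiences"]
    return itinerary

META = {
    "img-1": {"locations": {"queensland"}, "experiences": {"surfing"}},
    "img-2": {"locations": {"tasmania"},   "experiences": {"hiking"}},
}
draft = prepare_itinerary(["img-1", "img-2"], META.__getitem__)
```

Note that the account server here holds only image identifiers; the location and experience information is obtained on demand, consistent with the limited-access arrangement described above.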
[0080] In addition to displaying images, the one or more processing systems can also display other media, such as videos, music, or the like. This can be performed concurrently with, or separately and in addition to, the display of images. In one example, the one or more processing devices select display media from a plurality of reference media, wherein each of the reference media is associated with metadata indicative of at least one of a reference media location and a reference media experience and wherein the display media are selected at least partially in accordance with the at least one selected category, and cause at least some of the display media to be displayed to the user. Thus, for example, video segments could be presented instead of images, whilst music or other audio relevant to a location and/or experience could be presented together with images.
[0081] In one example, the process is performed by one or more processing systems operating as part of a distributed architecture, an example of which will now be described with reference to Figure 2.
[0082] In this example, a number of base stations 201 are coupled via communications networks, such as the Internet 202, and/or a number of local area networks (LANs) 204, to a number of client devices 203. It will be appreciated that the configuration of the networks 202, 204 is for the purpose of example only, and in practice the base stations 201 and client devices 203 can communicate via any appropriate mechanism, such as via wired or wireless connections, including, but not limited to, mobile networks, private networks, such as 802.11 networks, the Internet, LANs, WANs, or the like, as well as via direct or point-to-point connections, such as Bluetooth, or the like.
[0083] In one example, each base station 201 includes one or more processing systems 210, each of which may be coupled to one or more databases 211. The base station 201 is adapted to be used in performing actions associated with identifying destinations of interest by presenting images, and optionally performing additional ancillary actions, such as managing the categories and images, sharing images, recording favourite images, or the like. The client devices 203 are typically adapted to communicate with the base station 201, allowing images to be viewed, further actions to be requested, or the like.
[0084] Whilst the base station 201 is shown as a single entity, it will be appreciated that the base station 201 can be distributed over a number of geographically separate locations, for example by using processing systems 210 and/or databases 211 that are provided as part of a cloud based environment. However, the above described arrangement is not essential and other suitable configurations could be used.
[0085] An example of a suitable processing system 210 is shown in Figure 3. In this example, the processing system 210 includes at least one microprocessor 300, a memory 301, an optional input/output device 302, such as a keyboard and/or display, and an external interface 303, interconnected via a bus 304 as shown. In this example the external interface 303 can be utilised for connecting the processing system 210 to peripheral devices, such as the communications networks 202, 204, databases 211, other storage devices, or the like.
Although a single external interface 303 is shown, this is for the purpose of example only, and in practice multiple interfaces using various methods (eg. Ethernet, serial, USB, wireless or the like) may be provided.
[0086] In use, the microprocessor 300 executes instructions in the form of applications software stored in the memory 301 to allow the required processes to be performed. The applications software may include one or more software modules, and may be executed in a suitable execution environment, such as an operating system environment, or the like.
[0087] Accordingly, it will be appreciated that the processing system 210 may be formed from any suitable processing system, such as a suitably programmed client device, PC, web server, network server, or the like. In one particular example, the processing system 210 is a standard processing system such as an Intel Architecture based processing system, which executes software applications stored on non-volatile (e.g. hard disk) storage, although this is not essential. However, it will also be understood that the processing system could be any electronic processing device such as a microprocessor, microchip processor, logic gate configuration, firmware optionally associated with implementing logic such as an FPGA (Field Programmable Gate Array), or any other electronic device, system or arrangement.
[0088] As shown in Figure 4, in one example, the client device 203 includes at least one microprocessor 400, a memory 401, an input/output device 402, such as a keyboard and/or display, and an external interface 403, interconnected via a bus 404 as shown. In this example the external interface 403 can be utilised for connecting the client device 203 to peripheral devices, such as the communications networks 202, 204, databases, other storage devices, or the like. Although a single external interface 403 is shown, this is for the purpose of example only, and in practice multiple interfaces using various methods (eg. Ethernet, serial, USB, wireless or the like) may be provided.
[0089] In use, the microprocessor 400 executes instructions in the form of applications software stored in the memory 401 to allow communication with the base station 201, for example to allow for selection of categories, to receive image identifiers and images, or the like.
[0090] Accordingly, it will be appreciated that the client devices 203 may be formed from any suitable processing system, such as a suitably programmed PC, Internet terminal, lap-top, hand-held PC, smart phone, tablet, PDA, web server, or the like. Thus, in one example, the client device 203 is a standard processing system such as an Intel Architecture based processing system, which executes software applications stored on non-volatile (e.g. hard disk) storage, although this is not essential. However, it will also be understood that the client devices 203 can be any electronic processing device such as a microprocessor, microchip processor, logic gate configuration, firmware optionally associated with implementing logic such as an FPGA (Field Programmable Gate Array), or any other electronic device, system or arrangement.
[0091] Examples of the process for browsing images to identify destinations of interest will now be described in further detail. For the purpose of these examples it is assumed that one or more processing systems 210 act to host webpages allowing the user to browse images and refine categories using one of the client devices 203. The processing system 210 is therefore typically a server which communicates with the client device 203 via a communications network, or the like, depending on the particular network infrastructure available.
[0092] To achieve this the processing system 210 of the base station 201 typically executes applications software for hosting webpages and images and performing other actions, with actions performed by the processing system 210 being performed by the processor 300 in accordance with instructions stored as applications software in the memory 301 and/or input commands received from a user via the I/O device 302, or commands received from the client device 203.
[0093] It will also be assumed that the user interacts with the processing system 210 via a GUI (Graphical User Interface), or the like presented on the client device 203, and in one particular example via a browser application that displays webpages hosted by the base station 201. Actions performed by the client device 203 are performed by the processor 400 in accordance with instructions stored as applications software in the memory 401 and/or input commands received from a user via the I/O device 402.
[0094] However, it will be appreciated that the above described configuration assumed for the purpose of the following examples is not essential, and numerous other configurations may be used. It will also be appreciated that the partitioning of functionality between the client devices 203, and the base station 201 may vary, depending on the particular implementation.
[0095] Before images can be displayed they are assigned to categories and this is typically performed as part of a curation process, an example of which will now be described in more detail with reference to Figures 5 and 6.
[0096] At step 500 an image is initially received. This is typically performed by having the image supplied by a photographer, although the image may alternatively be retrieved from a repository, or other source such as a social media feed or the like, depending on the preferred workflow. At step 505, the image is reviewed, with this being used to optionally assess image suitability, for example to determine if the image meets quality and/or content requirements.
[0097] Assuming this to be the case, the image is allocated to one or more categories at step 510. The image is assigned to categories in accordance with a category structure, so that categories are applied consistently across images. An example of a portion of a category structure will now be described with reference to Figure 6.
[0098] In this example, the category structure includes a number of location categories 601 provided in a hierarchy, so that locations are progressively sub-divided into continents, countries, states or regions and cities or other sub-categories. Similarly, experience categories 602 may be sub-categorised according to different types, such as nature, activities, attractions, or the like, with each then being further sub-divided to define specific experiences, such as particular activities.
[0099] Thus, in the illustrated portion, the locations include continents including Asia, North America and Europe, with North America being divided into countries including Mexico, USA and Canada. USA is further divided into states including Florida, California and New York, with California including cities San Diego, San Francisco and Los Angeles. Experiences are divided into Nature, Attractions, and Activities, with Nature being sub-divided into Beach and Bush, whilst Activities include Surfing and Cycling. It will be appreciated that this example is for the purpose of illustration only and that in practice many hundreds of categories would typically be utilised.
[0100] Accordingly, if an image is received of a surfer at Santa Monica Beach, this can be assigned to a location category, in this case LA, as well as one or more experience categories, in this case "Beach" and "Surfing". When assigning an image to a category, it may also automatically inherit any parent categories, in which case the image would also be included in the categories California, USA and North America, as well as Nature and Activities.
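By way of non-limiting illustration only, the automatic inheritance of parent categories described above could be sketched as follows; the parent map and category names are hypothetical and do not form part of any defined implementation:

```python
# Sketch of automatic parent-category inheritance. The parent map is an
# illustrative fragment of the category structure of Figure 6.
PARENTS = {
    "LA": "California", "California": "USA", "USA": "North America",
    "Beach": "Nature", "Surfing": "Activities",
}

def with_ancestors(categories):
    """Expand the assigned categories to include every ancestor category."""
    expanded = set()
    for category in categories:
        while category is not None:
            expanded.add(category)
            category = PARENTS.get(category)  # None once a root is reached
    return expanded
```

Thus the Santa Monica surfer image, assigned to "LA", "Beach" and "Surfing", would also acquire California, USA, North America, Nature and Activities.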
[0101] At step 515, an image identifier (ID) is assigned to the image. The image identifier is typically a unique alpha-numeric code that can be generated automatically by one of the processing systems 210. The image identifier is used to subsequently retrieve the image, and can therefore form part of or be related to location information indicative of where the image is to be ultimately stored. Accordingly, the image identifier could form part of a URI (Uniform Resource Identifier), URL (Uniform Resource Locator), or the like, allowing the image to be retrieved.
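As an illustrative sketch only, a unique alpha-numeric image identifier and a URL derived from it might be generated as follows; the host name and file extension are assumptions for the purpose of example:

```python
import uuid

IMAGE_HOST = "https://images.example.com"  # hypothetical storage host

def assign_image_id():
    """Generate a unique alpha-numeric image identifier."""
    return uuid.uuid4().hex

def image_url(image_id):
    """Derive the retrieval URL from the image identifier, so the
    identifier forms part of the location information for the image."""
    return f"{IMAGE_HOST}/{image_id}.jpg"
```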
[0102] At step 520, any other metadata to be associated with the image is typically generated, either by creating and/or importing this from an appropriate source. The metadata can include any one or more of experience and location categories, child location categories, image counts for each experience, image ID, image URL, image credit information, image ordering information, location latitude and longitude, image origin information, or the like.
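A hypothetical metadata record covering some of the fields listed above might look as follows; the field names and values are illustrative only and do not represent a defined schema:

```python
# Illustrative metadata record for the Santa Monica surfer image.
metadata_record = {
    "image_id": "a1b2c3d4",
    "image_url": "https://images.example.com/a1b2c3d4.jpg",
    "location_categories": ["LA", "California", "USA", "North America"],
    "experience_categories": ["Beach", "Surfing", "Nature", "Activities"],
    "latitude": 34.0195,       # Santa Monica Beach (approximate)
    "longitude": -118.4912,
    "credit": "Example Photographer",
    "origin": "photographer upload",
}
```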
[0103] At step 525, the image and associated metadata is stored. In this regard, the metadata could be encoded as part of the image file, and stored together with the image, although more typically the image and metadata are stored in respective image and metadata databases allowing these to be accessed independently. This allows the image to be provided, whilst the metadata is securely retained, thereby allowing commercial value in the metadata to be maintained.
[0104] It will be appreciated that the above described curation process is for the purpose of example only and that in practice these steps could be performed in any appropriate order depending on the preferred implementation.
[0105] Additionally, whilst the curation process has been described specifically with reference to images, a similar process could also be performed in respect of other media. For example, music or other audio information could be associated with a location and/or experience, allowing the music to be presented simultaneously with images from the same category. In a further example, audio information captured at a location or experience at the same time as images could be associated with the image itself, so that the audio can be presented at the same time as the image. For example, ambient sounds from a beach could be associated with beach related images.
[0106] An example of an image browsing process to identify a destination of interest will now be described with reference to Figure 7.
[0107] In this example, at step 700 the user utilises the client device 203 to access an interface, such as a webpage hosted by one or more of the processing systems 210. Upon initially accessing the webpage the one or more processing systems 210 typically select default categories at step 705, which may correspond to broad categories within a particular category structure, previously selected categories, or the like.
[0108] At step 710 an image search is performed utilising the selected categories. This typically involves comparing the selected categories to metadata associated with the images with image identifiers being retrieved at step 715. The image identifiers, and/or location information retrieved using the image identifiers can then be used to retrieve images at step 720. It will be appreciated that steps 715 and 720 allow the image identifier and image to be stored separately, for example in respective metadata and image databases, but that this is not essential, and instead these steps could be combined into a single image retrieval step in the event that the metadata is stored as part of or together with the image.
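The comparison of selected categories to image metadata at step 710 could be sketched as follows; the metadata store and image identifiers are hypothetical:

```python
def find_display_images(selected, image_metadata):
    """Return identifiers of reference images whose metadata matches
    every selected category (location and experience)."""
    return [
        image_id
        for image_id, meta in image_metadata.items()
        if selected <= set(meta["categories"])  # all selected categories present
    ]

# Hypothetical metadata store keyed by image identifier.
IMAGE_METADATA = {
    "img1": {"categories": ["LA", "Beach", "Surfing"]},
    "img2": {"categories": ["LA", "Attractions"]},
    "img3": {"categories": ["San Diego", "Beach"]},
}
```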
[0109] At step 725 images in the relevant categories will then be displayed as part of the interface and an example of this will now be described with reference to Figure 8.
[0110] In this example, the interface 800 typically includes an application toolbar 810, an image window 820 and a categories toolbar 830. The application toolbar 810 includes menu options, such as "File", "Edit", "View" and "History" menus, as may be found in any application, such as a web browser application or the like. These are typically standard and will not therefore be described in any further detail.
[0111] The image window 820 displays a number of display images, with three images 821, 822, 823 being shown for the purpose of illustration. The arrangement of images will vary depending on factors such as the image sizes, screen resolution or the like, and these could be laid out dynamically, in accordance with templates or the like. A scroll bar 824 can also be provided allowing additional images to be displayed should these be available, thereby allowing a user to browse any number of images.
[0112] The category toolbar 830 includes location and experience categories 831 and 832, with controls 833, 834 being used to allow users to browse through different categories presented in the toolbar as shown. Thus, in this example, the selected categories include "LA", "Beach" and "Surfing", in which case the image of the surfer at Santa Monica Beach, which is mentioned above, could be displayed as one of the images 821, 822, 823.
[0113] At step 730 the user can revise the categories selected, for example by selecting additional categories, deselecting categories or selecting alternative categories using the category toolbar 830. This causes the process to return to step 710, allowing steps 710 to 725 to be repeated so that alternative images are displayed. During this iterative process of updating the categories and reviewing different images, at step 735 the user can also select actions to be performed, either with the currently selected categories and/or associated with particular ones of the display images, with the actions being performed at step 740 as will be described in more detail below.
[0114] Further specific examples of the browsing process and actions that can be performed will now be described in more detail with reference to Figure 9. For the purpose of the following examples, it is assumed that the processing systems 210 include an account server, metadata server, image server and map server. In this example, the account server is a web server that interacts with the client device 203, allowing the user to access a website interface and administer a user account, and is responsible for coordinating additional actions. The metadata server interacts with a metadata database storing metadata relating to the images, allowing the searching process to be performed. The image server interacts with an image database allowing images
to be retrieved and provided to client devices. The map server is adapted to provide mapping information and is therefore typically a GIS (Geographic Information System) server, or the like. It will be appreciated that these names are used for the purpose of illustration only, in particular to distinguish between the different servers used and these terms are not therefore intended to be limiting.
[0115] In this example, at step 900 the user accesses a webpage hosted by the account server using the client device 203, and at step 905 undergoes an optional login process, for example by providing a username and password or the like. It will be appreciated that in this regard a user account is typically established during a registration process and that the user account can be used to store a user profile including the username and password, demographic information relating to the user, as well as other information, such as previously identified images of interest, billing information, or the like. This will be performed in accordance with normal hosting procedures and this will not therefore be described in any more detail.
[0116] At step 910 the account server provides the webpage code, allowing the webpage to be displayed to the user on the client device 203. The webpage is typically initially displayed with default location and experience categories selected.
[0117] At step 915 the currently selected categories, which are initially the default categories, are sent to the metadata server by the client device 203. At step 920 the metadata server searches the metadata of reference images in order to identify display images within the selected categories. This process will be substantially as previously described and will involve comparing the selected categories to the categories and/or actual location and/or experiences specified in the metadata. This will not therefore be described in any further detail.
[0118] At step 925 the metadata server returns the image identifier of each display image to the client device 203, together with information indicative of the image storage location, such as a URL or similar, if this cannot be derived from the image identifier. At step 930 the client device 203 transfers the image identifiers to the image server, which in turn returns images to the client devices at step 935. At step 940 images are displayed, allowing the user to browse through images at step 945 and optionally update selected categories at step 950.
[0119] During this process, the user can proceed with selecting actions to be performed. In this regard, the user interface can display icons allowing different actions to be selected as shown in Figure 10. In Figure 10, similar reference numerals increased by 200 are used to describe similar features to those previously described with reference to Figure 8, and these will not therefore be described in any further detail.
[0120] In this example, each image 1021, 1022, 1023 is associated with a respective image action toolbar 1021.1, 1022.1, 1023.1. A further gallery action toolbar 1025 is also provided within the image window 1020. Each action toolbar includes icons corresponding to respective actions, with six being shown in this example corresponding to play, add, information, map, favourite and share actions respectively (as viewed from left to right in Figure 10). It will be appreciated that these actions are for the purpose of example only and that in practice any number of different actions could be displayed depending on the preferred implementation.
[0121] In the current example, if the user selects an icon from one of the image action toolbars 1021.1, 1022.1, 1023.1, the action is performed on the current image only, whereas if the user selects an action from the gallery action toolbar 1025, then the action is performed on each of the display images within the gallery. Accordingly, it will be appreciated that this provides a mechanism to allow users to easily select to perform actions with respect to individual images, or alternatively all current images within a gallery, which in turn corresponds to the display images for the particular location and experience categories currently selected.
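This distinction between image actions and gallery actions could be sketched as follows, with a gallery action simply applying the corresponding image action to each display image; the action handler shown is a hypothetical placeholder:

```python
def perform_action(action, image_id):
    """Hypothetical single-image action handler (image action toolbar)."""
    return f"{action}:{image_id}"

def perform_gallery_action(action, gallery_image_ids):
    """A gallery-toolbar action applies to every display image in the
    gallery, i.e. all images matching the currently selected categories."""
    return [perform_action(action, image_id) for image_id in gallery_image_ids]
```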
[0122] In the above example, all icons are shown in all of the image action and gallery action toolbars 1021.1, 1022.1, 1023.1, 1025, however, this is not essential, and icons may only be displayed if corresponding actions are available. For example, the play media icon may only be displayed for a gallery, allowing media such as music associated with the selected categories to be presented. Alternatively, the play media icon could be displayed with individual images if associated media is available corresponding to that image, for example in the event sound was recorded when the image was being captured. Accordingly the user interface could be populated based on the content of the reference image metadata, for example in the event that corresponding media is defined within the metadata.
[0123] In any event, once an icon has been selected, the selected actions can then be performed and examples of these will now be described in more detail with reference to Figures 11 to 15.
[0124] An example of the use of a favourites list will now be described with reference to Figure 11.
[0125] In this example, at step 1100 the user selects a favourite icon in one of the image action toolbars 1021.1, 1022.1, 1023.1, so that the corresponding display image is added to a favourites list. At step 1105, the client device provides the image identifier of the respective display image to the account server, causing the account server to store the image identifier in a favourites list associated with the user profile at step 1110.
[0126] At step 1115 the user can subsequently select a view favourite images option, for example via another interface (not shown). At step 1120 the account server retrieves the image identifiers of any images in the favourites list and returns these to the client device 203. The client device can then provide the image identifiers to the image server at step 1125, allowing the images to be returned and displayed at step 1130.
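A minimal sketch of the favourites mechanism, in which only image identifiers are stored against the user profile, is given below; the in-memory store stands in for the account server's user profile storage:

```python
# Hypothetical in-memory store: user identifier -> list of image identifiers.
favourites = {}

def add_favourite(user_id, image_id):
    """Record an image identifier in the user's favourites list (step 1110)."""
    user_list = favourites.setdefault(user_id, [])
    if image_id not in user_list:
        user_list.append(image_id)

def favourite_image_ids(user_id):
    """Retrieve the identifiers later used to fetch images (steps 1120-1130)."""
    return list(favourites.get(user_id, []))
```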
[0127] It will be appreciated that the images could be displayed using an interface similar to the interface described above, or alternatively using a thumbnail gallery or the like, depending on the preferred implementation. In either case, as the images could potentially span a large number of different categories, the category toolbar can be omitted or unpopulated for simplicity. Once displayed, the user can interact with the favourite images in a manner similar to any display images, for example by selecting additional actions to be performed, including viewing additional information, sharing, deselecting the image as a favourite image, adding the image to an itinerary, or the like. Additionally, users can filter favourites, for example based on categories, allowing them to refine their selection of images and categories of interest. This will not therefore be described in any further detail.
[0128] An example of a method of sharing images will now be described with reference to Figure 12.
[0129] In this example, at step 1200 a user selects a share icon in one of the image action toolbars 1021.1, 1022.1, 1023.1. At step 1205, a selection of options allowing the user to identify a recipient will typically be displayed. For example, this could involve having the interface display different delivery mechanisms, such as email, social media networks, or the like, together with a field allowing the user to enter a recipient address.
[0130] At step 1210 an image identifier of the corresponding display image and the recipient address are provided to the account server, which then transfers the image identifier to the recipient address at step 1215.
[0131] At step 1220 the recipient receives the image identifier on their own client device. The image identifier will typically be in the form of a link, such as a URL or the like, allowing the user to select the link, causing the image identifier to be provided to the image server at step 1225, with the image being returned at step 1230. It will be appreciated that the link could be used to display the image only, or alternatively could be used to direct the recipient to a webpage interface hosted by the account server, with the image displayed thereon. This latter option is typically preferred as this will drive additional traffic to the account host's webpage, and could entice the recipient to engage in a process of reviewing images and selecting a destination of interest. Additionally, as only the image identifier is shared, this allows a large number of images to be shared, such as an entire gallery, without this being unduly restrictive in terms of bandwidth. However, it will also be appreciated that as an alternative to sharing images by sharing image identifiers, this could also be achieved by sharing the images themselves, for example by embedding the image within a message.
[0132] Whilst the above described sharing process has been described with respect to images specifically, this could also be used to share other media, such as maps, music, or the like. In addition to sharing images or media via messages, this can also be achieved using other suitable sharing techniques. For example, this could include generating code, such as HTML code, that could be embedded in a webpage, blog or the like, allowing the images and/or media to be displayed.
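The sharing mechanisms described above could be sketched as follows, with a link that resolves to a hosted interface page and an HTML snippet for embedding in a webpage or blog; the host names and paths are assumptions for illustration:

```python
SITE = "https://travel.example.com"  # hypothetical account-server host

def share_link(image_id):
    """Link sent to a recipient; it resolves to an interface page that
    displays the shared image, driving traffic back to the host site."""
    return f"{SITE}/view?image={image_id}"

def embed_code(image_id):
    """HTML snippet allowing the shared image to be embedded in a
    webpage, blog or the like."""
    return f'<img src="{SITE}/images/{image_id}.jpg" alt="shared image">'
```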
[0133] An example of the process for displaying a map will now be described with reference to Figures 13A and 13B.
[0134] In this example, at step 1300 a user selects a map view option. For the purpose of this example, the map icon on the gallery action toolbar 1025 is selected, but it will be appreciated this process could alternatively be performed for an individual image, a favourites collection, or the like.
[0135] At step 1305 image identifiers of the display images currently being displayed in the gallery are provided to the metadata server, which returns the coordinates associated with each of the display images from the corresponding metadata at step 1310. At step 1315 the coordinates are sent from the client device 203 to a map server, allowing the map server to return map information at step 1320. The map information corresponds to a map including each of the specified coordinates, and accordingly the extent of the map will depend on the geographical distribution of the locations of the selected images.
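The dependence of the map extent on the geographical distribution of the display image locations could be sketched as a simple bounding box calculation over the returned coordinates:

```python
def map_extent(coordinates):
    """Bounding box (south, west, north, east) covering every display
    image, so the map extent tracks the geographic spread of results.
    Each coordinate is a (latitude, longitude) pair."""
    lats = [lat for lat, lon in coordinates]
    lons = [lon for lat, lon in coordinates]
    return (min(lats), min(lons), max(lats), max(lons))
```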
[0136] At step 1325 the client device 203 can display the map together with a point of interest (POI) layer, which includes respective POI icons 1401 associated with each of the display images, as shown in Figure 14.
[0137] Whilst the process could end at this point, additional interaction could also be performed. For example, the user could interact with the map using normal map interaction techniques, such as zoom, pan, scroll, or the like. Additionally, the user can select a POI 1401, as shown at 1401.1, allowing further information to be viewed. In this case, at step 1330 an indication of an image identifier corresponding to the POI is transferred to the metadata server, allowing information regarding the image to be returned at step 1335, whilst the image identifier is also used to retrieve an image at steps 1340 and 1345. This allows information associated with the image, such as an image description, as well as the image to be displayed, for example using respective pop-up boxes 1402, 1403.
[0138] Accordingly, it will be appreciated that this allows users to interact with the images via the map, allowing the user to understand the image in context with its surroundings, and how different images are positioned relative to each other geographically.
[0139] An example of the process for generating an itinerary and booking a trip will now be described with reference to Figure 15.
[0140] In this example, at step 1500 the user uses the add icon to add images to a trip itinerary. For the purpose of this example, the add icon on the gallery action toolbar 1025 is selected, so that a number of images in a gallery can be added to an itinerary, but it will be appreciated this process could alternatively be performed for individual images or a favourites collection, or the like.
[0141] Image identifiers associated with the currently displayed images are retrieved by the account server at step 1505 and forwarded to the metadata server at step 1510. This can be performed either directly or via the client device 203 depending on the preferred implementation.
[0142] At step 1515, locations and experiences corresponding to each of the image identifiers are returned to the account server, with these being used to calculate an itinerary at step 1520. This can be performed in accordance with any standard route planning process, and would therefore typically involve choosing a sequence of locations and/or experiences utilizing a suitable optimisation algorithm. At step 1525, an example itinerary can be displayed, an example of which is shown in Figure 16.
[0143] In this example, the user interface 1600 includes an application toolbar 1610 and itinerary window 1620. The itinerary window 1620 includes a number of thumbnails 1621, 1622, 1623, 1624, 1625, 1626, 1627, 1628, each corresponding to a separate location, activity or attraction on the itinerary, and each typically including one or more of the corresponding display images. Additionally and/or alternatively, the thumbnails could correspond to different days in a form of calendar view. Each thumbnail may also include a respective toolbar 1621.1, 1622.1, 1623.1, 1624.1, 1625.1, 1626.1, 1627.1, 1628.1, which can provide associated options, such as to book accommodation, transport, entrance tickets, or the like, as well as to review additional information, such as reviews, details of individual activities for a given day or location, or the like. This process is typically performed by the account server based on the location information provided by the metadata server.
[0144] The user can then interact with and modify the itinerary at step 1530, for example by adding or removing locations or experiences, adding bookings for accommodation, transport or the like. This process can be performed iteratively, allowing the user to interactively
prepare the itinerary. Throughout this process, the user is able to interact with display images using categories as previously described. For example, an itinerary could be saved, allowing the user to return to the previously described process to browse further images, with these being used to add additional locations and/or destinations to the itinerary as desired.
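The itinerary calculation at step 1520 could use any standard route planning process; a greedy nearest-neighbour ordering of the identified locations, shown below purely as one simple stand-in for a suitable optimisation algorithm, illustrates the idea (coordinates are treated as planar for simplicity):

```python
import math

def order_itinerary(locations):
    """Order (name, lat, lon) stops greedily, always travelling to the
    nearest remaining stop. A real implementation might instead solve a
    travelling-salesman style optimisation."""
    if not locations:
        return []
    remaining = list(locations)
    route = [remaining.pop(0)]
    while remaining:
        last = route[-1]
        nearest = min(
            remaining,
            key=lambda stop: math.hypot(stop[1] - last[1], stop[2] - last[2]),
        )
        remaining.remove(nearest)
        route.append(nearest)
    return route
```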
[0145] The user can also select a map view option, allowing a map similar to that shown in Figure 14 to be displayed together with a travel route between the locations and experiences. Additionally, pricing could be generated dynamically based on selected accommodation and transport options, allowing the user to more easily adapt the itinerary to meet their requirements.
[0146] Once the user is happy with the itinerary, the user can proceed with making a booking at step 1535, which can be achieved using standard booking techniques, which will not therefore be described in any further detail.
[0147] In addition to being used to plan and book travel, the system can be used to gather information regarding users and their behaviours, which can in turn be leveraged in business processes, such as targeted advertising or the like. An example of this will now be described with reference to Figure 17.
[0148] In this example, at step 1700 one or more users are selected. This could be performed automatically, or could be performed manually, for example by an operator of the account server. This can be performed using the account server directly, or may be performed using a separate remote computer system. The one or more users could be selected in any manner, for example by an operator reviewing a list of users, or by selecting users based on criteria, such as users with specified activity in given time periods, or that have searched in specific location or experience categories. For example, this could be used to identify users that have selected favourite or shared images within the last six months, but have not yet proceeded with booking any travel or preparing an itinerary.
[0149] At step 1705, the account server obtains image identifiers of the images that the user has recorded as favourites, forwarding these to the metadata server at step 1710. The metadata server retrieves a list of experiences and locations associated with the images, providing these to the account server at step 1715.
[0150] At step 1720 the analysis of the locations and experiences can be performed, with this being used to allow a business process to be implemented at step 1725. The nature of the analysis performed will vary depending on the preferred implementation and the business process being performed.
[0151] For example, the business process could involve attempting to identify trends in popularity of different experiences and/or locations, or combinations of the two. This can be used to anticipate future bookings and ensure adequate availability.
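The trend analysis described above could be sketched as a simple frequency count over the locations and experiences associated with the selected users' favourite images:

```python
from collections import Counter

def popularity(favourite_categories):
    """Count how often each location/experience category appears across
    the favourites of the selected users, to surface popularity trends."""
    counts = Counter()
    for categories in favourite_categories:
        counts.update(categories)
    return counts
```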
[0152] Additionally, this can be used to analyse user behaviours, for example to anticipate when individuals are likely to make a booking. This allows advertising or discounts to be delivered in a more targeted manner, maximising their impact. For example, if it is possible to predict when a user is likely to book a trip, advertising can be performed close to that time, encouraging the user to make a booking using the system.
[0153] Accordingly, it will be appreciated that the above described system allows users to review and interact with images by way of location and experience categories, allowing the user to identify destinations of interest. The destinations can then be used to construct an itinerary, which can be reviewed and modified as required, until the user is ready to make a booking. Thus, the process of interacting with the images naturally leads the user through the process of developing an itinerary and hence making a booking, increasing the opportunity for the host of the system to convert casual interest in images into revenue.
[0154] Whilst a number of different arrangements can be used, the use of respective servers for hosting the webpage, image metadata and images can lead to additional benefits, including the ability to more easily share images, whilst limiting access to valuable information regarding the images, such as the categories and additional information, thereby avoiding this information being copied.
[0155] Consequently, the above described process provides a purchase funnel to guide users from initial interest in planning a trip, but with little idea of destinations, through to a fully booked itinerary.
[0156] The early stage involves providing a mechanism to allow customers to explore an image library, selecting and sharing images with friends and family. This is designed to facilitate exploration of the images, with a simple interface allowing customers to connect with conscious, and unconscious, travel aspirations. Customers can access an entire image library, with tens of thousands of up-to-date images from the most popular and exotic destinations in the world right down to targeted Points of Interest.
[0157] The images are associated with supporting metadata, which not only aids the search process, but also allows users to gain an understanding of the destinations being shown in the images, thereby facilitating further research. For example, this allows customers to explore geographically via simple controls that allow them to drill down from continent level to local POIs. Customers can also explore the library via experiences and a combination of geographic selections with multiple experiences (e.g. Europe + Castles + Lakes).
[0158] With searching performed, the system provides a gallery space for users to view, arrange and filter favourite images. In particular, this allows multiple images to be viewed, with optional endless left to right scroll being used to allow large numbers of images to be accommodated. Other display options can also be provided, such as full screen slide shows, thereby enhancing the effect. Additional information, including text and video can be used to provide further information, with icons being displayed on the interface to allow users to select images as favourites, share images, or the like. Dynamic map views can also be provided as an alternative bird's-eye view of search results. Throughout this process, the location and experience category based navigation remains active, so that users can refine the categories and hence display images, thereby allowing them to progressively focus in on destinations of interest.
[0159] In addition to allowing for visual navigation, the system allows for information rich navigation, which can be used to generate individual landing pages corresponding to individual galleries, to cover every geographic and experience combination (e.g. Castles + Europe; Beach + Bahamas). Optimized pages can also include original copy, title text, header elements, image tags and teaser text for each image, with an information icon linking to city destination guide pages providing a path to conversion at any stage.
[0160] Icons can be used to facilitate sharing of individual images or entire galleries and can be performed via email and social media. Friends can explore and then join, allowing them to save and share results. An Add a Trip icon can be used to add destinations, places and POIs to new or existing trip itineraries.
[0161] Once favourite images have been established, the next stage involves providing a gallery of favourite images to allow users to peruse and organize these images. Every image and video is location tagged and can be logically arranged by location. Filtering options allow images to be filtered based on location (e.g. just California images), one or more experiences (e.g. Snow), or a combination of the two (e.g. Europe + Snow).
[0162] Images can be viewed in a variety of ways, including gallery or full-screen modes, or with additional information, such as links to city destination guide pages or the like. Map view options and universal navigational icons remain constant throughout, allowing users to tag, share or add favourite images or galleries to an itinerary at any time.
[0163] The next stage provides a fully integrated planning and booking environment designed to turn customers' travel dreams into realities. Multiple trips can be named, edited, saved and shared. This dramatically simplifies the often complex process of planning and booking trips and provides an easy to understand and manage overview of a single trip.
[0164] Destinations within a trip can be easily rearranged and amended, to provide full flexibility on dates and durations. Expandable transport linking panels between each destination indicate transport options and booking status. Indicative pricing is calculated in accordance with travel preferences and timings, and is dynamically updated as the itinerary evolves.
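The dynamically updated indicative pricing described above might, in its simplest form, be recomputed from per-night rates and transport-leg costs each time a destination, date or duration changes. The rates and costs below are invented purely for illustration.

```python
# Hypothetical rate tables; a real system would look these up dynamically.
NIGHTLY_RATE = {"Paris": 180.0, "Rome": 140.0}
TRANSPORT_COST = {("Paris", "Rome"): 95.0}


def indicative_price(itinerary):
    """itinerary: ordered list of (destination, nights) tuples.

    Total = accommodation for each stay + transport between
    consecutive destinations (direction-insensitive lookup).
    """
    total = sum(NIGHTLY_RATE[dest] * nights for dest, nights in itinerary)
    for (a, _), (b, _) in zip(itinerary, itinerary[1:]):
        total += TRANSPORT_COST.get((a, b), TRANSPORT_COST.get((b, a), 0.0))
    return total


trip = [("Paris", 2), ("Rome", 3)]
# 2 * 180 + 3 * 140 + 95 = 875.0
```

Because the price is a pure function of the itinerary, any rearrangement or amendment immediately yields a fresh indicative total, matching the dynamic-update behaviour described above.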
[0165] A sliding itinerary overview can be used to show broad details at a glance, or a close-up view of a single destination within a trip, allowing customer access to flights, hotels, car rental and Points of Interest. Dates can be preloaded into any booking screens to simplify the booking process. Activities and POIs can be perused and short-listed, and where available, tickets can be booked. A notes icon makes saving and sharing comments, details and reminders straightforward.
[0166] Different views can be used to show a sequential breakdown of each day, including hotel and transport arrangements, with estimated travel times between transfer points, hotels and POIs being indicated. Destination details can also be viewed on a map. This allows customers to dynamically rearrange the order of their itinerary to ensure that the trip meets their requirements.
[0167] Plans and itineraries relating to future trips can be shared via social media or via email, using similar techniques to sharing of images. A collaborate option can be used to allow co-planning between multiple users, regardless of location, thereby allowing groups to more easily plan and coordinate travel. A note function can be provided to make saving and sharing comments and reminders easy. Rich media slide shows at full screen maintain customer excitement through to final conversion.
[0168] Finally, the system can also be adapted to provide a post-trip platform where customers' trip images, stories and discoveries can be stored, relived and shared online, and, be printed as high-quality albums. Customers can add their own unique places, discoveries and stories, upload images, and store trip details including maps and hotel details for future reference.
[0169] Customer stories and photos can be combined with existing content to form trip albums that can be enjoyed privately or shared with the world. Customers can share trip albums digitally via email and social media. Print-on-demand technology provides opportunities for high-quality printed albums and keepsakes. These experiences can be used to introduce and encourage others to participate in the process.
[0170] Whilst the above described arrangements have focused on identifying a destination of interest by viewing images associated with location or experience categories, similar processes can also be used in a wider range of situations.
[0171] For example, a similar process could be applied to real estate by categorising images of different properties based on categories such as locations and/or property features, including the type of property, number of rooms, presence of parking, swimming pools, or the like. Users can view images of different properties selected by category and then review the images, allowing the category selection to be progressively refined until one or more properties of interest are identified.

[0172] The system could also be used for other forms of purchase, such as retail shopping. In this instance, categories could include different types of product, such as shoes, handbags, or the like, and features of products, such as size, colour, shape, or the like.
[0173] Accordingly, the system can be applied more generally to any target of interest by displaying categorised images, and then using this to refine the categories and hence display more relevant images.
[0174] Accordingly, in one example, the apparatus can be provided for use in identifying targets of interest. In this example, the apparatus can include one or more electronic processing devices that determine at least one selected category, select display images from a plurality of reference images, wherein each of the reference images is associated with metadata indicative of at least one reference image category and wherein the display images are selected at least partially in accordance with the at least one selected category, and cause at least some of the display images to be displayed to the user. This allows the user to review the display images and refine the at least one selected category, allowing the user to iteratively review display images to thereby identify targets of interest.
[0175] This can be applied to different targets, such as real estate, products and destinations and could utilise categories such as locations, types, features or experiences. Thus, it will be appreciated that in the event that the target is a destination, the categories can include location and/or experience categories, substantially as outlined above.
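The iterative review loop of this generalised example can be sketched as follows. Here `refine` stands in for whatever user interaction supplies the refined categories, and all data is hypothetical.

```python
def identify_targets(reference_images, categories, refine):
    """Iteratively display images matching `categories`.

    `refine(displayed)` returns a new category set, or None once the
    user is satisfied; the final display set is returned as the
    identified targets of interest.
    """
    categories = set(categories)
    while True:
        displayed = [
            img for img in reference_images
            if categories <= set(img["categories"])
        ]
        refined = refine(displayed)
        if refined is None:
            return displayed
        categories = set(refined)


images = [
    {"id": "a", "categories": {"real estate", "pool"}},
    {"id": "b", "categories": {"real estate", "pool", "parking"}},
    {"id": "c", "categories": {"product", "shoes"}},
]

# Scripted stand-in for user refinement: narrow once, then stop.
steps = iter([{"real estate", "pool", "parking"}, None])
result = identify_targets(images, {"real estate"}, lambda shown: next(steps))
# Narrows from both properties down to the one with parking.
```

The same loop serves destinations, real estate or retail products unchanged; only the category vocabulary differs, which is the point of the generalisation above.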
[0176] Throughout this specification and the claims which follow, unless the context requires otherwise, the word "comprise", and variations such as "comprises" or "comprising", will be understood to imply the inclusion of a stated integer or group of integers or steps but not the exclusion of any other integer or group of integers.
[0177] Persons skilled in the art will appreciate that numerous variations and modifications will become apparent. All such variations and modifications which become apparent to persons skilled in the art should be considered to fall within the spirit and scope of the invention as broadly described above.
Claims
THE CLAIMS DEFINING THE INVENTION ARE AS FOLLOWS:
1) Apparatus for use in identifying destinations of interest, the apparatus including one or more electronic processing devices that:
a) determine at least one selected category, the selected category including at least one of:
i) a location category; and,
ii) an experience category;
b) select display images from a plurality of reference images, wherein each of the reference images are associated with metadata indicative of at least one of a reference image location and a reference image experience and wherein the display images are selected at least partially in accordance with the at least one selected category; and,
c) cause at least some of the display images to be displayed to the user to allow the user to:
i) review the display images; and,
ii) refine the at least one selected category allowing the user to iteratively review display images to thereby identify destinations of interest.
2) Apparatus according to claim 1, wherein the one or more electronic processing devices select the display images in accordance with a category structure defining relationships between location categories and between experience categories.
3) Apparatus according to claim 2, wherein the one or more electronic processing devices select display images having at least one of a reference image location and reference image experience including at least one of a location category and experience category within the selected category.
4) Apparatus according to any one of the claims 1 to 3, wherein, in response to display of at least some of the display images, the one or more electronic processing devices:
a) determine user interaction with at least one of the images; and,
b) at least partially refine the at least one selected category in accordance with the user interaction.
5) Apparatus according to any one of the claims 1 to 3, wherein, in response to display of at least some of the display images, the one or more electronic processing devices:
a) determine at least one selected category in accordance with user input commands; and,
b) select display images from the plurality of reference images at least partially in accordance with the at least one further selected category.
6) Apparatus according to any one of the claims 1 to 5, wherein the one or more electronic processing devices:
a) cause a list of categories to be displayed to a user; and,
b) determine at least one selected category in accordance with user interaction with the list of categories.
7) Apparatus according to claim 6, wherein the list of categories includes a list of location categories and a list of experience categories.
8) Apparatus according to claim 6 or claim 7, wherein the list of categories is displayed in accordance with at least one of:
a) a number of images in each category;
b) a popularity of each category; and,
c) a category structure.
9) Apparatus according to any one of the claims 1 to 8, wherein the one or more electronic processing devices cause a user interface to be displayed to a user on a client device, the user interface including:
a) a number of display images; and,
b) at least one toolbar including a list of categories.
10) Apparatus according to any one of the claims 1 to 9, wherein the one or more electronic processing devices perform at least one action associated with one or more of the display images.
11) Apparatus according to claim 10, wherein the one or more electronic processing devices determine at least one action associated with a respective display image in accordance with at least one of:
a) user input commands; and,
b) selection of an icon displayed coincidently with the respective display image.
12) Apparatus according to claim 10 or claim 11, wherein the at least one action includes at least one of:
a) sharing at least one display image with a recipient;
b) recording an indication of at least one display image as a favourite image;
c) displaying additional information relating to at least one display image; and,
d) updating an itinerary at least partially in accordance with at least one display image.
13) Apparatus according to claim 12, wherein the one or more electronic processing devices share at least one display image by:
a) determining an image identifier associated with the at least one display image; and,
b) transferring an indication of the image identifier to a destination.
14) Apparatus according to claim 12, wherein the one or more electronic processing devices flag at least one display image as a favourite image by:
a) determining an image identifier associated with the at least one display image; and,
b) recording an indication of the image identifier as part of user data associated with the user.
15) Apparatus according to any one of the claims 1 to 14, wherein the one or more electronic processing devices:
a) select display images by searching a metadata store including metadata for the plurality of reference images;
b) retrieve image identifiers for the display images from the metadata store; and,
c) use the image identifiers to retrieve display images from an image store.
16) Apparatus according to any one of the claims 1 to 15, wherein the one or more electronic processing devices include a metadata server that:
a) receives an indication of at least one selected category;
b) selects the display images;
c) determines an image identifier associated with each display image; and
d) provides an indication of the image identifier.
17) Apparatus according to any one of the claims 1 to 16, wherein the one or more electronic processing devices include an image server that:
a) receives an indication of at least one image identifier;
b) uses the at least one image identifier to retrieve at least one image from an image store; and
c) provides the images.
18) Apparatus according to any one of the claims 1 to 17, wherein:
a) a metadata server:
i) receives an indication of at least one selected category from a client device via a communications network; and,
ii) provides an indication of the image identifier to the client device via the communications network; and,
b) an image server:
i) receives the indication of at least one image identifier from the client device via the communications network; and,
ii) provides the image to the client device via the communications network.
19) Apparatus according to any one of the claims 1 to 18, wherein the one or more electronic processing devices include an account server that performs actions at least in part using image identifiers received from a client device via a communications network.
20) Apparatus according to any one of the claims 1 to 19, wherein the one or more electronic processing devices include an account server that:
a) receives an action request associated with at least one display image from a client device via a communications network;
b) determines the image identifier of the at least one display image;
c) determines at least one of a location category and an experience category using the image identifier; and,
d) performs an action at least in part using the at least one of a location category and an experience category.
21) Apparatus according to claim 20, wherein the account server:
a) transfers the image identifier to a metadata server; and,
b) receives an indication of the at least one of a location category and an experience category from the metadata server.
22) Apparatus according to claim 20 or claim 21, wherein the account server:
a) determines selected display images in accordance with user selections made via a client device;
b) determines locations and experiences associated with selected display images; and,
c) uses the locations and experiences to at least partially prepare a trip itinerary.
23) Apparatus according to claim 22, wherein the selected display images are at least in part used to book a trip.
24) Apparatus according to any one of the claims 1 to 23, wherein the one or more electronic processing devices:
a) select display media from a plurality of reference media, wherein each of the reference media are associated with metadata indicative of at least one of a reference media location and a reference media experience and wherein the display media are selected at least partially in accordance with the at least one selected category; and,
b) cause at least some of the display media to be displayed to the user.
25) A method for use in reviewing images of destinations, the method including, in one or more electronic processing devices:
a) determining at least one selected category, the selected category including at least one of:
i) a location category; and,
ii) an experience category;
b) selecting display images from a plurality of reference images, wherein each of the reference images are associated with metadata indicative of a reference image location and a reference image experience and wherein the display images are selected at least partially in accordance with the at least one selected category; and,
c) causing at least some of the display images to be displayed to the user to allow the user to:
i) review the display images; and,
ii) refine the at least one selected category allowing the user to iteratively review display images to thereby identify destinations of interest.
26) Apparatus for use in identifying targets of interest, the apparatus including one or more electronic processing devices that:
a) determine at least one selected category;
b) select display images from a plurality of reference images, wherein each of the reference images are associated with metadata indicative of at least one of a reference image category and wherein the display images are selected at least partially in accordance with the at least one selected category; and,
c) cause at least some of the display images to be displayed to the user to allow the user to:
i) review the display images; and,
ii) refine the at least one selected category allowing the user to iteratively review display images to thereby identify targets of interest.
27) Apparatus according to claim 26, wherein the target includes at least one of:
a) real estate;
b) products; and,
c) destinations.
28) Apparatus according to claim 26, wherein the categories include at least one of:
a) locations;
b) types;
c) features; and,
d) experiences.
29) A method for use in identifying targets of interest, the method including, in one or more electronic processing devices:
a) determining at least one selected category;
b) selecting display images from a plurality of reference images, wherein each of the reference images are associated with metadata indicative of at least one of a reference image category and wherein the display images are selected at least partially in accordance with the at least one selected category; and,
c) causing at least some of the display images to be displayed to the user to allow the user to:
i) review the display images; and,
ii) refine the at least one selected category allowing the user to iteratively review display images to thereby identify targets of interest.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2014904097 | 2014-10-14 | ||
AU2014904097A AU2014904097A0 (en) | 2014-10-14 | Identifying method and apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016058043A1 (en) | 2016-04-21 |
Family
ID=55745877
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/AU2015/050624 WO2016058043A1 (en) | 2014-10-14 | 2015-10-13 | Identifying method and apparatus |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2016058043A1 (en) |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020091535A1 (en) * | 2001-01-08 | 2002-07-11 | Getinaction, Ltd | System and method for selecting a vacation destination and accommodation |
US20120089597A1 (en) * | 2010-10-12 | 2012-04-12 | Anna Visioli | System and Method for Searching Real Estate Listings Using Imagery |
US20130069990A1 (en) * | 2011-09-21 | 2013-03-21 | Horsetooth Ventures, LLC | Interactive Image Display and Selection System |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9734167B2 (en) | 2011-09-21 | 2017-08-15 | Horsetooth Ventures, LLC | Interactive image display and selection system |
US10459967B2 (en) | 2011-09-21 | 2019-10-29 | Horsetooth Ventures, LLC | Interactive image display and selection system |
US11068532B2 (en) | 2011-09-21 | 2021-07-20 | Horsetooth Ventures, LLC | Interactive image display and selection system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15850186 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 15850186 Country of ref document: EP Kind code of ref document: A1 |