WO2014190374A1 - Method, apparatus and system for displaying data - Google Patents

Method, apparatus and system for displaying data

Info

Publication number
WO2014190374A1
Authority
WO
WIPO (PCT)
Prior art keywords
item
image
images
data
attributes
Prior art date
Application number
PCT/AU2014/000536
Other languages
English (en)
Inventor
Matthew Lloyd Byrne
Michael Alexander ROBINS
Original Assignee
Curicon Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2013901959A external-priority patent/AU2013901959A0/en
Application filed by Curicon Pty Ltd filed Critical Curicon Pty Ltd
Priority to AU2014273829A priority Critical patent/AU2014273829A1/en
Publication of WO2014190374A1 publication Critical patent/WO2014190374A1/fr


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0623Item investigation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]

Definitions

  • the present invention relates generally to image processing and, in particular, to a method and apparatus for displaying data for use in determining attributes of an item, and to a computer program product including a computer readable medium having recorded thereon a computer program for displaying data for use in determining attributes of an item.
  • e-commerce websites such as eBay™
  • Some e-commerce websites have huge market revenues and provide both buyers and sellers with the convenience of shopping online.
  • When viewing an item listed on an e-commerce website, individuals typically fall into one of two groups.
  • the first group have an intimate knowledge of the listed item.
  • the first group may be able to identify the item and determine attributes of the item based on viewing logos and brand colouring displayed in an image of the item.
  • the second group may be totally unfamiliar with the item.
  • In order to aid buyers in distinguishing items, as well as providing an image of an item being sold, a seller is typically required to provide data representing attributes of the item for use in providing a description of the item.
  • the data representing the attributes of the item may be referred to as "attribute data".
  • the description may include data representing item attributes such as the name of the item, a category, a model number, condition of the item, brand, style and features of the item.
  • the description may include data representing other item attributes such as a recommended retail price, an average price for the item, origin of the item, year of release of the item, name of an artist and an indication of the rarity of the item.
  • the description may include data representing other item attributes such as a volume number, title and issue number.
  • the seller is typically required to enter the data representing the item attributes into an electronic "listing form" comprising one or more data fields.
  • the listing form may include a data field for each of the item attributes discussed above, such as a title field, a category field, a model number field, a brand field etc.
  • a potential buyer using an e-commerce website currently faces difficulty in searching for an item to purchase, or in obtaining a value for the item from an e-commerce website, as the buyer may not be able to identify the item, describe any of its attributes, or know how to define the item for a text-based search.
  • An image of the item is used to retrieve data from an image database.
  • the image database contains data representing images classified based on visual features and other attributes.
  • a method of displaying data for use in determining attributes of an item comprising:
  • a system for displaying data for use in determining attributes of an item comprising:
  • a memory for storing data and a computer program
  • a processor coupled to the memory for executing the computer program, the computer program comprising instructions for:
  • an apparatus for displaying data for use in determining attributes of an item comprising:
  • a computer readable medium comprising a computer program stored thereon for displaying data for use in determining attributes of an item, said program comprising:
  • code for receiving one or more images of the item; and code for analysing at least a selected one of the received images to determine one or more attributes of the selected image
  • a computer program product including a computer readable medium having recorded thereon a computer program for implementing the method described above.
  • a method of adding data for an item to an e-commerce website comprising:
  • a system for adding data for an item to an e-commerce website comprising:
  • a memory for storing data and a computer program
  • a processor coupled to the memory for executing the computer program, the computer program comprising instructions for:
  • a computer readable medium comprising a computer program stored thereon for adding data for an item to an e-commerce website, said program comprising:
  • an apparatus for adding data for an item to an e-commerce website comprising:
  • Fig. 1 is a flow diagram of a method of listing an item according to the present disclosure
  • Fig. 3 is a schematic block diagram of a computer module upon which a server computer can be practiced
  • Fig. 4 is a flow diagram showing a method of selecting an image
  • Fig. 5 is a flow diagram showing a method of performing geometric operations on an image
  • Fig. 6 is a flow diagram showing a method of isolating an object in an image
  • Fig. 7 is a flow diagram showing a method of determining feature descriptors for an image
  • Fig. 8 is a flow diagram showing a method of comparing images.
  • Fig. 9 shows a user interface displaying a representation of a downloaded image.
  • a method 100 of retrieving data for use in determining attributes of an item is described below with reference to Figs. 1 to 8.
  • the retrieved data represents the attributes of the item.
  • the attributes represented by the retrieved data may include, for example, a name of the item, a category, a model number, condition of the item, brand, style and/or features of the item.
  • the attributes represented by the retrieved data may also include a recommended retail price, an average price for the item, origin of the item, year of release of the item, name of an artist and an indication of the rarity of the item.
  • the attributes represented by the retrieved data may include a volume number, title and issue number.
  • the data retrieved in accordance with the method 100 may be used to identify the item, without an intimate knowledge of the item being required.
  • the retrieved data may be added to a listing of the item on an e-commerce website, for example, where the item is being listed for sale on the e-commerce website.
  • the retrieved data may also be used for cataloguing the item on the e-commerce website. For example, a seller may wish to catalogue their item or items on the e-commerce website for use in possible future sale or sales.
  • the cataloguing may also be used for keeping track of a seller's items and/or to determine the value of the items. Such cataloguing has conventionally been problematic for sellers due to the complexity of identifying each item, determining the attributes of the item and providing required attribute data for each item.
  • the data retrieved for an item may also be used for displaying website locations and/or physical locations (e.g., an address of a traditional retail store) from which the item may be purchased.
  • the item may be identified and the attribute data retrieved based on one or more images of the item.
  • Visual features and attributes are determined based on an analysis of a selected one of the images of the item.
  • the determined visual features and attributes are used to compare the image against data stored in a database containing representations of images of items.
  • the database containing the image representations may be referred to as the "image database".
  • the determined visual features and attributes of the selected image may be represented by data in the form of feature descriptors.
  • Each of the images in the image database is classified based on one or more visual features and attributes of the associated database image, and an item represented in the associated image.
  • the visual features and attributes of each of the database images may be represented by data (e.g., metadata and/or feature descriptors) which may be stored in the image database to represent each database image.
  • each database image is stored in the image database as data (e.g., metadata, feature descriptors) representing the database image.
  • actual image data (e.g., a Joint Photographic Experts Group (JPEG) image file, an Exchangeable Image File Format (EXIF) file, a Portable Network Graphics (PNG) file or a Graphics Interchange Format (GIF) file) need not itself be stored in the image database.
  • an identifier representing the item represented by each database image may be stored in the image database.
  • the data stored in the image database for each database image may represent the nature and properties of an item represented by each database image.
  • the data stored in the image database for each database image may include, but is not limited to, data representing attributes of the item including a brand of the item, a title for the item, an average price of the item, conditional pricing of the item (e.g., a price set when certain physical conditions are met), release date of the item, number of copies or number of the items that are available for sale, popularity of the item, trending nature of the item, historical data associated with the item, category information for the item and a model number for the item.
  • data is retrieved from the image database based on a comparison of an image of an item and the database images.
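The retrieval described above can be sketched in a few lines. This is an illustration only: the record layout, field names and the Euclidean nearest-neighbour matching are assumptions for the sketch, not the patent's actual comparison method (which is described with reference to Fig. 8).

```python
# Hypothetical sketch: each database image is represented by an item
# identifier, feature descriptors and attribute metadata -- no raw pixel
# data (JPEG, PNG, etc.) is stored. Matching by squared Euclidean
# distance between descriptors is an illustrative choice.
from dataclasses import dataclass, field

@dataclass
class DatabaseImageRecord:
    item_id: str                 # identifier of the item shown in the image
    descriptors: list            # e.g. flattened HSV-histogram values
    metadata: dict = field(default_factory=dict)  # brand, title, price, ...

def nearest_record(records, query_descriptors):
    """Return the record whose descriptors are closest to the query's."""
    def distance(rec):
        return sum((a - b) ** 2 for a, b in zip(rec.descriptors, query_descriptors))
    return min(records, key=distance)

records = [
    DatabaseImageRecord("comic-0042", [0.9, 0.0, 0.1], {"title": "Amazing Spider-Man #1"}),
    DatabaseImageRecord("coin-0007", [0.1, 0.8, 0.1], {"title": "1930 Penny"}),
]
match = nearest_record(records, [0.85, 0.05, 0.10])
print(match.item_id)  # comic-0042
```

The attribute metadata carried on the matched record is then available for display or for populating a listing, as described below.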
  • One or more data fields of an electronic listing form may be populated using the retrieved data in order to provide a description of the item for a listing of the item on an e-commerce website or the like.
  • the data fields of such an electronic listing form may be populated for use in cataloguing the item (e.g., the retrieved data may be used to fill a catalogue entry).
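Populating the data fields of a listing form (or a catalogue entry) from the retrieved attribute data can be sketched as follows. The field names and the helper function are hypothetical, chosen to mirror the example attributes mentioned in the description.

```python
# Illustrative sketch (field names are assumptions, not from the patent):
# populate an electronic listing form's data fields from attribute data
# retrieved for an item, leaving unmatched fields blank.

LISTING_FIELDS = ["title", "category", "model_number", "brand", "condition"]

def populate_listing_form(retrieved: dict) -> dict:
    return {f: retrieved.get(f, "") for f in LISTING_FIELDS}

retrieved = {"title": "Amazing Spider-Man #1", "category": "Comics", "brand": "Marvel"}
form = populate_listing_form(retrieved)
print(form["title"])         # Amazing Spider-Man #1
print(form["model_number"])  # blank: this attribute was not retrieved
```

The same mapping serves both uses named above: filling a listing form for sale, or filling a catalogue entry.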
  • FIGs. 2A and 2B depict a general-purpose computer system 200, upon which the various arrangements described can be practiced.
  • the computer system 200 includes: a computer module 201; input devices such as a keyboard 202, a mouse pointer device 203, a scanner 226, a camera 227, and a microphone 280; and output devices including a printer 215, a display device 214 and loudspeakers 217.
  • the camera 227 may be a webcam, for example.
  • An external Modulator-Demodulator (Modem) transceiver device 216 may be used by the computer module 201 for communicating to and from a communications network 220 via a connection 221.
  • the communications network 220 may be a wide-area network (WAN), such as the Internet, a cellular telecommunications network, or a private WAN.
  • the modem 216 may be a traditional "dial-up" modem.
  • the modem 216 may be a broadband modem.
  • connection 221 may be an asymmetric digital subscriber line (ADSL), a very-high-bit-rate digital subscriber line (VDSL) or a symmetric digital subscriber line (SDSL) connection.
  • a wireless modem may also be used for wireless connection to the communications network 220.
  • a server computer module 300 is connected to the network 220.
  • the computer module 201 may communicate with the server computer module 300 via the communications network 220.
  • the server computer module 300 will be further described below with reference to Fig. 3.
  • the computer module 201 typically includes at least one processor unit 205.
  • the computer module 201 may comprise a graphical processing unit (GPU).
  • the processor unit 205 may comprise one or more GPUs.
  • the computer module 201 typically includes a memory unit 206.
  • the memory unit 206 may have semiconductor random access memory (RAM) and semiconductor read only memory (ROM).
  • the computer module 201 also includes a number of input/output (I/O) interfaces including: an audio-video interface 207 that couples to the video display 214, loudspeakers 217 and microphone 280; an I/O interface 213 that couples to the keyboard 202, mouse 203, scanner 226, camera 227 and optionally a joystick or other human interface device (not illustrated); and an interface 208 for the external modem 216 and printer 215.
  • the modem 216 may be incorporated within the computer module 201, for example within the interface 208.
  • the computer module 201 also has a local network interface 211, which permits coupling of the computer system 200 via a connection 223 to a local-area communications network 222, known as a Local Area Network (LAN).
  • the local communications network 222 may also couple to the wide network 220 via a connection 224, which would typically include a so-called "firewall" device or device of similar functionality.
  • the local network interface 211 may comprise an Ethernet circuit card, a Bluetooth® wireless arrangement or an IEEE 802.11 wireless arrangement; however, numerous other types of interfaces may be practiced for the interface 211.
  • the I/O interfaces 208 and 213 may afford either or both of serial and parallel connectivity, the former typically being implemented according to the Universal Serial Bus (USB) standards and having corresponding USB connectors (not illustrated).
  • Storage devices 209 are provided and typically include a hard disk drive (HDD) 210. Other storage devices such as a floppy disk drive and a magnetic tape drive (not illustrated) may also be used. In one arrangement, the storage devices 209 may include a solid state drive.
  • An optical disk drive 212 is typically provided to act as a non-volatile source of data.
  • Portable memory devices, such as optical disks (e.g., CD-ROM, DVD, Blu-ray Disc™), USB-RAM, portable external hard drives, and floppy disks, for example, may be used as appropriate sources of data to the system 200.
  • the components 205 to 213 of the computer module 201 typically communicate via an interconnected bus 204 and in a manner that results in a conventional mode of operation of the computer system 200 known to those in the relevant art.
  • the processor 205 is coupled to the system bus 204 using a connection 218.
  • the memory 206 and optical disk drive 212 are coupled to the system bus 204 by connections 219. Examples of computers on which the described arrangements can be practised include IBM-PCs and compatibles, Oracle™ Sparcstations, Apple Mac™, ARM™ or like computer systems. The described arrangements may also be practised on virtual computers or the like.
  • One or more steps of methods described below may be implemented using the computer system 200 wherein the processes of Figs. 1 and 4 to 7, to be described, may be implemented as one or more software application programs 233 executable within the computer system 200.
  • one or more of the steps of the described methods are effected by instructions 231 (see Fig. 2B) in the software 233 that are carried out within the computer system 200.
  • the software instructions 231 may be formed as one or more code modules, each for performing one or more particular tasks.
  • the software may also be divided into two separate parts, in which a first part and the corresponding code modules perform the described methods, and a second part and the corresponding code modules manage a user interface between the first part and the user.
  • the software may be stored in a computer readable medium, including the storage devices described below, for example.
  • the software 233 is typically stored in the HDD 210 or the memory 206. However, in one arrangement, the software 233 may be stored in a solid state drive.
  • the software is typically loaded into the computer system 200 from the computer readable medium, and then executed by the computer system 200.
  • the software 233 may be stored on an optically readable disk storage medium (e.g., CD-ROM) 225 that is read by the optical disk drive 212.
  • a computer readable medium having such software or computer program recorded on the computer readable medium is a computer program product.
  • the use of the computer program product in the computer system 200 preferably effects an advantageous apparatus for implementing the described methods.
  • the application programs 233 may be supplied to the user encoded on one or more CD-ROMs 225 and read via the corresponding drive 212, or alternatively may be read by the user from the networks 220 or 222. Still further, the software can also be loaded into the computer system 200 from other computer readable media.
  • Computer readable storage media refers to any non-transitory tangible storage medium that provides recorded instructions and/or data to the computer system 200 for execution and/or processing.
  • Examples of such storage media include floppy disks, magnetic tape, CD-ROM, DVD, Blu-ray™ Disc, a hard disk drive, a ROM or integrated circuit, USB memory, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external of the computer module 201.
  • Examples of transitory or non-tangible computer readable transmission media that may also participate in the provision of software, application programs, instructions and/or data to the computer module 201 include radio or infra-red transmission channels as well as a network connection to another computer or networked device, and the Internet or Intranets including e-mail transmissions and information recorded on Websites and the like.
  • graphical user interfaces (GUIs)
  • a user of the computer system 200 and the application may manipulate the interface in a functionally adaptable manner to provide controlling commands and/or input to the applications associated with the GUI(s).
  • Other forms of functionally adaptable user interfaces may also be implemented, such as an audio interface utilizing speech prompts output via the loudspeakers 217 and user voice commands input via the microphone 280.
  • Fig. 2B is a detailed schematic block diagram of the processor 205 and a "memory" 234.
  • the memory 234 represents a logical aggregation of all the memory modules (including the HDD 209 and semiconductor memory 206) that can be accessed by the computer module 201 in Fig. 2A.
  • a power-on self-test (POST) program 250 executes.
  • the POST program 250 is typically stored in a ROM 249 of the semiconductor memory 206 of Fig. 2A.
  • a hardware device such as the ROM 249 storing software is sometimes referred to as firmware.
  • the POST program 250 examines hardware within the computer module 201 to ensure proper functioning and typically checks the processor 205, the memory 234 (209, 206), and a basic input-output systems software (BIOS) module 251, also typically stored in the ROM 249, for correct operation.
  • BIOS 251 activates the hard disk drive 210 of Fig. 2A.
  • Activation of the hard disk drive 210 causes a bootstrap loader program 252 that is resident on the hard disk drive 210 to execute via the processor 205.
  • the operating system 253 is a system level application, executable by the processor 205, to fulfil various high level functions, including processor management, memory management, device management, storage management, software application interface, and generic user interface.
  • the operating system 253 manages the memory 234 (209, 206) to ensure that each process or application running on the computer module 201 has sufficient memory in which to execute without colliding with memory allocated to another process. Furthermore, the different types of memory available in the system 200 of Fig. 2A must be used properly so that each process can run effectively. Accordingly, the aggregated memory 234 is not intended to illustrate how particular segments of memory are allocated (unless otherwise stated), but rather to provide a general view of the memory accessible by the computer system 200 and how such is used. In one arrangement, the memory may include the memory of a video card device.
  • the processor 205 includes a number of functional modules including a control unit 239, an arithmetic logic unit (ALU) 240, and a local or internal memory 248, sometimes called a cache memory.
  • the cache memory 248 typically includes a number of storage registers 244-246 in a register section.
  • One or more internal buses 241 functionally interconnect these functional modules.
  • the processor 205 typically also has one or more interfaces 242 for communicating with external devices via the system bus 204, using a connection 218.
  • the memory 234 is coupled to the bus 204 using a connection 219.
  • the application program 233 includes a sequence of instructions 231 that may include conditional branch and loop instructions.
  • the program 233 may also include data 232 which is used in execution of the program 233.
  • the instructions 231 and the data 232 are stored in memory locations 228, 229, 230 and 235, 236, 237, respectively.
  • a particular instruction may be stored in a single memory location as depicted by the instruction shown in the memory location 230.
  • an instruction may be segmented into a number of parts each of which is stored in a separate memory location, as depicted by the instruction segments shown in the memory locations 228 and 229.
  • the processor 205 is given a set of instructions which are executed therein. In one arrangement, instructions may be executed in parallel where some instructions are executed on a different processor (e.g., by executing on non-CPU cores where the computer module 201 comprises one or more GPUs). As described above, the processor 205 may comprise one or more GPUs. The processor 205 waits for a subsequent input, to which the processor 205 reacts by executing another set of instructions.
  • Each input may be provided from one or more of a number of sources, including data generated by one or more of the input devices 202, 203, data received from an external source across one of the networks 220, 222, data retrieved from one of the storage devices 206, 209 or data retrieved from a storage medium 225 inserted into the corresponding reader 212, all depicted in Fig. 2A.
  • the execution of a set of the instructions may in some cases result in output of data. Execution may also involve storing data or variables to the memory 234.
  • the disclosed arrangements use input variables 254, which are stored in the memory 234 in corresponding memory locations 255, 256, 257.
  • the disclosed arrangements produce output variables 261, which are stored in the memory 234 in corresponding memory locations 262, 263, 264.
  • Intermediate variables 258 may be stored in memory locations 259, 260, 266 and 267.
  • each fetch, decode, and execute cycle comprises:
  • a fetch operation, which fetches or reads an instruction 231 from a memory location 228, 229, 230;
  • a decode operation in which the control unit 239 determines which instruction has been fetched; and
  • an execute operation in which the control unit 239 and/or the ALU 240 execute the instruction.
  • a further fetch, decode, and execute cycle for the next instruction may be executed.
  • a store cycle may be performed by which the control unit 239 stores or writes a value to a memory location 232.
  • One or more steps or sub-processes in the processes of Figs. 1 and 4 to 8 is associated with one or more segments of the program 233 and is performed by the register section 244, 245, 247, the ALU 240, and the control unit 239 in the processor 205 working together to perform the fetch, decode, and execute cycles for every instruction in the instruction set for the noted segments of the program 233.
  • the described methods may alternatively be implemented in dedicated hardware such as one or more integrated circuits performing the functions or sub functions of the described methods.
  • dedicated hardware may include graphic processors, digital signal processors, or one or more microprocessors and associated memories.
  • the server computer module 300 typically includes at least one processor unit 305. Again, in one arrangement, the computer module 300 may comprise a GPU. In one arrangement, the processor 305 may comprise one or more GPUs.
  • the computer module 300 typically includes a memory unit 306.
  • the memory unit 306 may have semiconductor random access memory (RAM) and semiconductor read only memory (ROM).
  • the computer module 300 also includes a number of input/output (I/O) interfaces including: an audio-video interface 307.
  • the computer module 300 also includes an interface 308 for an external modem transceiver device 316 for communicating to and from the communications network 220 via a connection 321.
  • storage devices 309 are provided and typically include a hard disk drive (HDD) 310.
  • Other storage devices such as a floppy disk drive and a magnetic tape drive (not illustrated) may also be used.
  • the storage devices 309 may include a solid state drive.
  • An optical disk drive 312 is typically provided to act as a non-volatile source of data.
  • Portable memory devices, such as optical disks (e.g., CD-ROM, DVD, Blu-ray Disc™), USB-RAM, portable external hard drives, and floppy disks, for example, may be used as appropriate sources of data to the system 300.
  • the components 305 to 313 of the computer module 300 typically communicate via an interconnected bus 304 and in a manner that results in a conventional mode of operation of the computer module 300 known to those in the relevant art.
  • Examples of computers on which the server computer module 300 can be practised include IBM-PCs and compatibles, Sun Sparcstations, Apple Mac™, ARM™ or like computer systems. As described above, the described arrangements may also be practised on virtual computers or the like.
  • one or more steps of the described methods may be implemented as one or more software application programs executable within a smart-phone, tablet device or similar portable computing device connected to the network 220. Accordingly, the processing performed to execute one or more steps of the described methods may be distributed across the computer module 201, the server computer module 300 and/or a portable computing device (e.g., a smart-phone, tablet device or the like) connected to the network 220. For example, steps 105 and 107 of the method 100 described below may be executed within a smart-phone, tablet device or similar portable computing device.
  • the method 100 will be described below by way of example where data is retrieved and displayed for determining attributes of a Spider-man comic.
  • the retrieved data and determined attributes may be used to identify the Spider-man comic and may be added to a listing of the Spider-man comic on an e-commerce website.
  • the retrieved data may be added as a catalogue entry on the e-commerce website in order to catalogue the Spider-man comic in a collection of such comics.
  • the method 100 may be used to retrieve and display data, and to identify any item, such as for example, antique and pop culture collectables (e.g., coins, stamps, magazines, comics, books, furniture), manufactured products, or any other consumer item that a user may wish to identify, list and/or catalogue on an e-commerce website.
  • the method 100 may also be used with other types of items such as packaged food. For example, a user may wish to identify foreign food items which are labelled in a foreign language.
  • the method 100 may be used for border control such as by a customs agent or the like wishing to control the types of foods being brought into a region or country.
  • the method 100 begins at step 101, where the software program 333, under execution of the processor 305, is used for receiving one or more images of the item.
  • the user may capture a series of video images of the item using the computer module 201 and the camera 227.
  • only a single image of the item may be captured by the user.
  • the processor 205 uploads the images to the server computer module 300 and the images may be stored within the storage devices 309 of the server computer module 300.
  • the camera 227 may be in the form of a webcam and the image(s) may be captured using the webcam.
  • the video images of the item may be captured using a smart-phone, tablet device or similar device and then be uploaded to the computer module 201 or server computer module 300 via the network 220.
  • the video images captured at step 101 may be stored in the storage device 209 before being uploaded to the server computer module 300.
  • the video images may be captured over a period of time prior to execution of the method 100 with the captured images being stored in the storage device 209 prior to being uploaded to the computer module 300.
  • the resolution of a video image or still image captured by the camera 227 or other device may be equal to or exceed 640x480 pixels in size, with a quality of two or more megapixels. If the captured images are of sufficient quality or size, the images may be down-sampled or compressed (e.g., using JPEG compression) prior to the images being uploaded to the server computer module 300 in order to minimise processing time. Such down-sampling or compression may also be performed to minimise transmission time of the images from the computer module 201 to the server computer module 300.
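The size checks above can be made concrete with a small sketch. The 640x480 and two-megapixel thresholds come from the description; the upload budget and helper names are assumptions added for illustration.

```python
# Illustrative sketch: check a captured image against the minimum
# resolution/quality stated above, and compute a down-sampling factor
# that brings an oversized image within an assumed upload budget.

MIN_WIDTH, MIN_HEIGHT = 640, 480
MIN_MEGAPIXELS = 2.0
UPLOAD_MAX_PIXELS = 2_000_000  # assumed budget, not from the patent

def meets_minimum(width: int, height: int) -> bool:
    megapixels = width * height / 1_000_000
    return width >= MIN_WIDTH and height >= MIN_HEIGHT and megapixels >= MIN_MEGAPIXELS

def downsample_factor(width: int, height: int) -> float:
    """Scale factor (applied to each dimension) that fits the budget."""
    pixels = width * height
    if pixels <= UPLOAD_MAX_PIXELS:
        return 1.0
    return (UPLOAD_MAX_PIXELS / pixels) ** 0.5

print(meets_minimum(1920, 1080))  # True: about 2.07 megapixels
print(meets_minimum(640, 480))    # False: only about 0.31 megapixels
```

Down-sampling by the returned factor (and then JPEG-compressing) before upload reduces both processing and transmission time, as noted above.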
  • the software program 333, under execution of the processor 305, selects one of the images received at step 101 for further analysis.
  • the method 100 is used for analysing at least the image selected at step 103 to determine one or more attributes of the selected image.
  • a method 400 of selecting an image, as executed at step 103, will be described below with reference to Fig. 4.
  • geometric operations are performed on the selected image (i.e., the image selected at step 103) by the software program 333.
  • the geometric operations may be used for analysing at least the image selected at step 103 to determine one or more attributes of the selected image, as described in detail below.
  • a method 500 of performing geometric operations on an image, as executed at step 105, will be described in detail below with reference to Fig. 5.
  • the processed image may be stored within the storage module 309 of the computer module 300.
  • the method 500 may be implemented as one or more software code modules of the software application program 233 resident in the hard disk drive 210 and being controlled in its execution by the processor 205.
  • background isolation may be performed on the selected image to isolate a graphic object, representing the item in the selected image, from the background of the selected image.
  • a bilateral blur filter may be applied to the selected image. The filter may be used to blur the image.
  • the selected image may also be converted to grayscale and an adaptive threshold applied to the image.
  • Edge detection may then be performed on the image to detect edges within the image in order to isolate the object.
  • the software program 333 under execution of the processor 305, determines feature descriptors for the selected image.
  • the feature descriptors may be based on one or more hue, saturation, and value (HSV) histograms for the image selected at step 103.
  • the feature descriptors may also be based on a reduced HSV histogram determined for the selected image. Reduced HSV histograms may represent a region(s) of space in the selected image, rather than the full image.
  • an HSV histogram of a region in a top right corner of the selected image is determined.
  • An HSV histogram of a region in a top left corner of the selected image may also be determined.
  • the histogram of the top right corner and the histogram of the top left corner of the selected image may be stored as separate feature descriptors.
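Purely as an illustrative sketch, corner-region hue histograms of the kind described above might be computed as follows. The four-bin histogram, the half-width corner regions, and the list-of-rows image representation are simplifications assumed for the example (a full HSV histogram would also bin saturation and value):

```python
# Illustrative sketch: coarse hue histograms for the top-left and top-right
# regions of an image, stored as separate feature descriptors. The image is a
# list of rows of (h, s, v) tuples with h in [0, 360).

def hue_histogram(region, bins=4):
    """Count region pixels falling into each of `bins` equal hue ranges."""
    counts = [0] * bins
    for row in region:
        for (h, s, v) in row:
            counts[int(h) * bins // 360] += 1
    return counts

def corner_regions(image, frac=2):
    """Return the top-left and top-right (1/frac height, 1/frac width) regions."""
    rows = len(image) // frac
    cols = len(image[0]) // frac
    top = image[:rows]
    top_left = [row[:cols] for row in top]
    top_right = [row[-cols:] for row in top]
    return top_left, top_right
```

The two returned histograms may then be stored as separate feature descriptors, as the description indicates.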
  • the feature descriptors may also be based on geometric based features.
  • the feature descriptors determined at step 109 may be stored in the storage module 309 of the server computer module 300 for use in comparing the selected image to data (e.g., metadata, feature descriptors) representing one or more further images stored in a database of images (i.e., the "image database") configured within the hard disk drive 310.
  • the software application program 333 under execution of the processor 305, compares the selected image to data representing one or more images stored within a database (i.e., the "image database") configured within the hard disk drive 310.
  • the image database may be stored within RAM configured within the memory unit 306.
  • the method 100 is configured to perform the comparison as efficiently as possible by comparing the selected image to as few of the database images as possible.
  • images may be added to a candidate list based on similarity. In this instance, the selected image is compared to the images in the candidate list instead of comparing the image to all of the images in the image database.
  • the software application program 333 analyses the selected image (i.e., the image selected at step 103) to determine if the image belongs to one of a plurality of predetermined image clusters.
  • Each image in such a predetermined image cluster shares visually similar attributes.
  • each of the images may be placed in the same cluster based on shared colours in the images.
  • each of the images may be placed in the same cluster based on similarity in lettering of the series title.
  • each image that has similar general feature descriptors may be placed in a particular cluster. If the software application 333 determines that the image selected at step 103 shares similar feature descriptors to the feature descriptors of a particular cluster, then at step 111 the software application 333 compares the selected image only to the images of the particular cluster.
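The cluster-based narrowing described above may be sketched as follows, under the assumptions that general feature descriptors are plain numeric vectors and that each predetermined cluster is represented by a centre vector; the cluster names and centre values are hypothetical:

```python
# Illustrative sketch: assign the selected image's descriptor to the closest
# predetermined cluster, so that the comparison step only considers images in
# that cluster rather than the whole image database.

def squared_distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def nearest_cluster(descriptor, cluster_centres):
    """Return the identifier of the cluster whose centre is closest."""
    return min(cluster_centres,
               key=lambda cid: squared_distance(descriptor, cluster_centres[cid]))

# Hypothetical cluster centres, e.g. grouping covers by dominant colour.
clusters = {
    "red_covers": [0.9, 0.1, 0.1],
    "blue_covers": [0.1, 0.1, 0.9],
}
```

A selected image whose descriptor lies nearest the "red_covers" centre would then be compared only against the images of that cluster.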
  • a method 800 of comparing images, as executed at step 111, will be described in detail below with reference to Fig. 8.
  • the method 800 may be used for comparing attributes determined for the selected image to data (e.g., metadata) stored in the image database representing one or more images.
  • the images of the item may be received by the software application program 333 at step 101 already in the form of data (e.g., metadata and/or feature descriptors) representing the images rather than actual image data (e.g., a joint photo-graphics expert group (JPEG) image file).
  • the method 100 may proceed directly to step 111 of the method 100 as described above.
  • if the software application program 333 identifies an image in a candidate list (i.e., comprising one or more database images) matching the image selected at step 103, based on the comparison, then the method 100 proceeds to step 115.
  • at step 119, images may be added to a candidate list based on similarity.
  • the software application program 333 may determine if the selected image matched an image in one or more particular image clusters.
  • the identifier may be downloaded from the image database to the computer module 201.
  • the computer module 201 may then retrieve the image 903, from an external source such as another server connected to the network 220, based on the downloaded identifier, before displaying the retrieved image 903.
  • the image data may be downloaded from the image database at step 115 to the computer module 201 for displaying the image 903 on the display 214.
  • the software application program 333 under execution of the processor 305, is used for retrieving the data associated with the matched image in the image database. Accordingly, the data is retrieved at step 117 depending on the comparison performed at step 111.
  • the data retrieved at step 117 may include the title of the comic (i.e., the Amazing Spider-man, Spider-man Vs. The Chameleon).
  • the data may also include an average price (i.e., $1500) of the comic, the release date of the comic (i.e., March 1963) and the number of copies of the comic that are available for sale (i.e., 3 available).
  • the data retrieved from the database at step 117 may also include a link or links directing the user to a uniform resource locator (URL) for a website or websites where the item may be purchased. Such links enable the user to buy the item efficiently.
  • the data retrieved at step 117 may also include a link or links to website locations from which to purchase the Amazing Spider-man comic represented by the matched image.
  • the data retrieved from the database at step 117 may describe a physical location (e.g., a street address) where the item may be purchased.
  • the data retrieved from the database at step 117 may include, but is not limited to, data representing other attributes of the item such as popularity of the item, trending nature of the item, historical data associated with the item, category information for the item, a model number for the item and a brand of the item, as well as any other of the attributes described above, or any other suitable attributes.
  • the data in the description fields 905, 906, 907 and 908 may be used in populating the electronic form for listing the comic on the e-commerce website.
  • the retrieved data may be "cherry- picked" by the user (e.g., the comic issue may not match the comic to be listed but series data may match the comic to be listed).
  • the cherry-picked data may be used for adding a description to a listing of the comic on an e-commerce website.
  • the description fields of the electronic form may be automatically filled based on the attribute data retrieved at step 117.
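As an illustrative sketch of populating the description fields from the retrieved attribute data, and of the "cherry-picking" behaviour mentioned above, consider the following; the field names and the example values (taken from the Spider-man example in the description) are assumptions for the sketch:

```python
# Illustrative sketch: copy retrieved attribute data into a listing form,
# allowing the user to cherry-pick which fields carry over (e.g., series data
# may match the comic to be listed even when the issue data does not).

retrieved = {
    "title": "The Amazing Spider-man, Spider-man Vs. The Chameleon",
    "average_price": 1500,
    "release_date": "March 1963",
    "copies_available": 3,
}

def populate_form(retrieved_data, selected_fields):
    """Copy only the cherry-picked fields into the listing form."""
    return {field: retrieved_data[field]
            for field in selected_fields
            if field in retrieved_data}
```

A user cherry-picking only the title and release date would obtain a form containing just those two fields, leaving the remaining fields to be filled manually.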
  • the user interface 900 including the images 901 and 903 and the description, as seen in Fig. 9, may be used in an e-commerce website such as EBayTM.
  • the user interface 900 of Fig. 9 also includes icons 909 and 910.
  • the icon 909, titled "one click sell", may be selected by a user to initiate a sale of the item represented in the images 901 and 903 (i.e., the Spider-man comic).
  • the icon 910, which is titled "Buy" in the example of Fig. 9, may be selected in order to redirect the user to another uniform resource locator (URL) where the item may be purchased.
  • the data (e.g., metadata, feature descriptors) stored in the image database may link to a table in the image database.
  • the table may comprise data for aggregated items available in traditional retail stores or online stores.
  • the user interface 900 or a similar means may be used to provide purchase options, including pricing and shipping information, to a user.
  • the purchase options may be stored in the image database or may be aggregated from external sources such as the traditional retail stores or online stores. As an example, the purchase options may include a price estimate for the item.
  • the retrieved data used to populate the description fields (e.g., 905, 906, 907 and 908) of the user interface 900 may persist to a further user interface screen, upon the one click sell icon 909 being selected, for example.
  • the further user interface screen may be a "selling" screen which lists the data from the description fields in further data fields. Accordingly, the data from the description fields is used to populate the data fields of the selling screen. Other data may also be used to populate one or more description fields of the selling screen.
  • the retrieved data used to populate the description fields (e.g., 905, 906, 907 and 908) of the user interface 900 may persist to a further user interface screen, upon an "add to catalogue" icon 911 being selected, for example.
  • the further user interface screen may be a "catalogue" screen (used by a user to keep track of their items) which lists the data from the description fields in further data fields. Accordingly, the data from the description fields is used to populate the data fields of the catalogue screen. Other data may also be used to populate one or more description fields of the catalogue screen.
  • the data in the data fields of the catalogue screen and the image 901 may be stored in a database as a catalogue of comics.
  • the software application program 333, under execution of the processor 305, performs a more exhaustive comparison by comparing the image selected at step 103 to the data for all of the database images stored within the image database configured within the hard disk drive 310. Then at step 121, if the software application program 333 identifies an image in the image database matching the image selected at step 103, based on the comparison performed at step 119, then the method 100 proceeds to step 115. Otherwise, the method 100 concludes.
  • a screen is downloaded by the software application to the computer module 201 for display on the display 214 notifying the user that no matching image has been determined in the image database.
  • images associated with visually or categorically similar items may be displayed.
  • the software application program 333 may prompt the user to enter attribute data for the item for storage in the image database configured within the hard disk drive 310.
  • the image selected at step 103 may also be stored in the image database. Accordingly, the image database may be periodically populated with attribute data for further images when no matching image is determined at step 119.
  • the method 400 of selecting an image for further processing, as executed at step 103, will now be described with reference to Fig. 4.
  • the method 400 is implemented as one or more software code modules of the software application program 333 resident in the hard disk drive 310 and being controlled in its execution by the processor 305.
  • each step of the method 400 may be implemented as a separate software code module.
  • the method 400 may alternatively be implemented as one or more software code modules of the software application program 233 resident in the hard disk drive 210 and being controlled in its execution by the processor 205.
  • one or more steps of the method 400 may be executed within a smart-phone, tablet device or similar portable computing device connected to the network 220.
  • the method 400 begins at step 401 , where the images captured by the user are accessed by the software application program 333 from the storage devices 309 and one of the images is selected for further processing.
  • the selected image is typically a colour image.
  • following step 403, the method 400 proceeds to step 405, where the software program 333 performs edge detection on the selected image to locate edges of objects within the image.
  • edge detection may be performed on the selected image.
  • the software program 333 under execution of the processor 305, identifies contours in the image based on any detected edges within the image. Then at step 409, if the number of contours is less than a threshold number of contours, then the method 400 proceeds to step 413.
  • the threshold number of contours is adaptive and is predetermined for a first image received at step 101 of the method 100. Subsequently, the threshold number may be determined based on the average number of contours in the images for a predetermined number of contiguous images (e.g., thirty images). A decrease in the number of contours in a current image compared to a previously received image (i.e., as represented by the threshold) suggests blurring and/or loss of focus.
  • the current image is discarded. Otherwise, the method 400 proceeds to step 411 where the current image is selected for further analysis.
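An adaptive contour-count threshold of the kind described above might be sketched as follows. Counting only accepted frames toward the rolling average, and the particular seed threshold value, are design assumptions of the sketch rather than details fixed by the description:

```python
# Illustrative sketch: keep a rolling average of contour counts over the last
# N frames (the description mentions thirty). A frame whose contour count
# falls below the running threshold suggests blurring or loss of focus and is
# discarded; accepted frames update the average.
from collections import deque

class ContourThreshold:
    def __init__(self, seed_threshold, window=30):
        self.counts = deque(maxlen=window)   # contour counts of kept frames
        self.seed = seed_threshold           # predetermined threshold for the first image

    @property
    def threshold(self):
        # Use the predetermined seed until at least one frame has been kept.
        if not self.counts:
            return self.seed
        return sum(self.counts) / len(self.counts)

    def accept(self, contour_count):
        """True if the frame is sharp enough to keep for further analysis."""
        keep = contour_count >= self.threshold
        if keep:
            self.counts.append(contour_count)
        return keep
```

For example, with a seed threshold of 50 contours, a frame with 40 contours is discarded while a frame with 60 contours is kept and raises the running threshold.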
  • the shape of the current image is used to mask the original colour image as selected at step 401.
  • the selected colour image shows the item and is used in the following steps 105 to 111 throughout the method 100 while the grayscale image determined at step 403 may be discarded.
  • the selected image may be stored within the storage devices 309. Alternatively, the selected image may be stored in the memory or storage devices 209 of the computer module 201 and/or a smart-phone or the like before being transmitted to the computer module 300.
  • the method 500 of performing geometric operations on an image, as executed at step 105, will be described in detail below with reference to Fig. 5.
  • the method 500 may be implemented as one or more software code modules of the software application program 333 resident in the hard disk drive 310 and being controlled in its execution by the processor 305.
  • each step of the method 500 may be implemented as a separate software code module.
  • the method 500 may be implemented as one or more software code modules of the software application program 233 resident in the hard disk drive 210 and being controlled in its execution by the processor 205.
  • one or more steps of the method 500 may be executed within a smart-phone, tablet device or similar portable computing device connected to the network 220.
  • the method 500 begins at step 501, where the selected image (i.e., the image selected at step 103) is accessed by the software application program 333 from the storage devices 309 of the server computer module 300. Then at the next step 503, the software application program 333 determines the colour space (e.g., RGB, RGBA) of the selected image. The colour space may be stored within the memory 306 of the server computer module 300. At the next step 505, the image is scaled according to an original resolution of the selected image accessed at step 501. The selected image may also be rotated at step 505. The orientation of the selected image may be determined based on visual features within the selected image.
  • the method 500 concludes at the next step 507, where the scaled image is down-sampled (or compressed) if the selected image is of sufficient quality or size.
  • JPEG compression may be performed on the scaled image.
  • the down-sampling provides an advantage during the transmission of the image.
  • the image processed in accordance with the method 500 may be stored within the storage devices 309.
  • the method 600 of isolating an object in an image, as executed at step 107, will be described in detail below with reference to Fig. 6.
  • the method 600 may be implemented as one or more software code modules of the software application program 333 resident in the hard disk drive 310 and being controlled in its execution by the processor 305. In one arrangement, each step of the method 600 may be implemented as a separate software code module.
  • the method 600 may be implemented as one or more software code modules of the software application program 233 resident in the hard disk drive 210 and being controlled in its execution by the processor 205. In another arrangement, one or more steps of the method 600 may be executed within a smart-phone, tablet device or similar portable computing device connected to the network 220.
  • the method 600 begins at step 601 , where the image processed in accordance with the method 500 is accessed from the storage devices 309 of the server computer module 300. Then at the next step 603, the software program 333 under execution of the processor 305 applies a bilateral blur filter to the accessed image.
  • the filter blurs the image generally while preserving the integrity of edges in the image.
  • peripheral regions within the accessed image may be subject to additional blur filters such as a Gaussian blur filter.
  • the blurred image is converted to a grayscale image.
  • the image is converted to an eight (8) bit grayscale image.
  • the grayscale image may be stored in the storage devices 309 of the server computer module 300.
  • the software program 333 under execution of the processor 305, applies an adaptive threshold to the grayscale image.
  • the adaptive threshold may be applied using a transformation according to Equation (1) as follows:
  • T(x,y) represents the threshold for each pixel in the image.
  • the threshold T(x,y) may be determined as a weighted mean of a block of pixels surrounding the pixel (x,y), minus a constant offset. Each pixel in the surrounding block is weighted differently.
  • inversion may also be performed at step 607.
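An adaptive threshold in the spirit of Equation (1) may be sketched in Python as follows. For brevity the sketch uses a uniform (unweighted) mean over a 3x3 block and a fixed offset C, whereas the description weights each pixel in the surrounding block differently; those simplifications, and the parameter values, are assumptions of the sketch:

```python
# Illustrative sketch: binarise a grayscale image (a list of rows of 0-255
# integers) against a local threshold T(x, y) computed as the mean of the
# surrounding block minus a constant offset.

def adaptive_threshold(gray, block=3, C=2):
    """Return a binary image: 255 where the pixel exceeds T(x, y), else 0."""
    h, w = len(gray), len(gray[0])
    half = block // 2
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            # Gather the block around (x, y), clipped at the image borders.
            neighbours = [
                gray[j][i]
                for j in range(max(0, y - half), min(h, y + half + 1))
                for i in range(max(0, x - half), min(w, x + half + 1))
            ]
            t = sum(neighbours) / len(neighbours) - C   # T(x, y)
            row.append(255 if gray[y][x] > t else 0)
        out.append(row)
    return out
```

Because the threshold adapts to the local block, a dark pixel adjacent to a bright region is suppressed even though a global threshold might have kept it.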
  • the method 600 continues to the next step 609, where the software program 333 performs Canny edge detection on the threshold image output from step 607 to detect edges within the threshold image. Then at step 611 , based on any detected edges, contours are estimated for each portion of an object corresponding to the detected edges.
  • the contours are clustered based on proximity to neighbouring contours.
  • a sum of the contours is determined to produce an n-sided polygon.
  • the n-sided polygon 'covers' the object and provides separation between a foreground area and background area in the threshold image.
  • the method 700 of determining feature descriptors for an image, as executed at step 109, will be described in detail below with reference to Fig. 7.
  • the method 700 is implemented as one or more software code modules of the software application program 333 resident in the hard disk drive 310 and being controlled in its execution by the processor 305.
  • each step of the method 700 may be implemented as a separate software code module.
  • one or more steps of the method 700 may be executed within a smart-phone, tablet device or similar portable computing device connected to the network 220.
  • the method 700 begins at step 701 , where the image selected at step 103 is accessed by the software application program 333 from the storage devices 309 of the server computer module 300. Then at the next step 703, the software application 333, under execution of the processor 305, determines attribute data for the selected image.
  • the determined attribute data may include one or more HSV histograms for the image. In one arrangement, three histograms are determined for the selected image, in the form of one histogram for each channel of the image.
  • the attribute data determined at step 703 may also include a reduced HSV histogram.
  • the reduced HSV histogram may similarly be in the form of three histograms (i.e., one histogram for each channel of the image).
  • one or more histograms may be combined into a "fingerprint" single object based on a quantized image or quantized region.
  • the selected image may be quantized using a number of means, k, based on randomly placed centres on the data representing the image. Accuracy of such a quantization is determined based on a variable, epsilon.
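A minimal sketch of quantization with k randomly placed centres and an epsilon stopping accuracy is given below. One-dimensional values are used for brevity (real quantization would operate on colour vectors), and the deterministic seed is an assumption made so the example is reproducible:

```python
# Illustrative sketch: k-means quantization of 1-D values with k randomly
# placed centres, iterating until the centres move by less than epsilon (the
# accuracy variable mentioned in the description).
import random

def kmeans_1d(values, k, epsilon=0.01, seed=0):
    rng = random.Random(seed)          # deterministic for the example
    centres = rng.sample(values, k)    # randomly placed centres
    while True:
        # Assign each value to its nearest centre.
        groups = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centres[i]))
            groups[nearest].append(v)
        # Move each centre to the mean of its assigned values.
        new_centres = [sum(g) / len(g) if g else centres[i]
                       for i, g in enumerate(groups)]
        if max(abs(a - b) for a, b in zip(centres, new_centres)) < epsilon:
            return sorted(new_centres)
        centres = new_centres
```

Two well-separated groups of values converge to one centre per group regardless of which initial centres are sampled.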
  • the attribute data determined at step 703 may also include data representing geometric based features, such as corners within the image, areas which rapidly change brightness (e.g., lines) and textural defects.
  • the geometric features may be stored within the hard disk drive 310 as keypoints within the image. Each keypoint may contain attributes including coordinates, size, angle, response (strength) and octave associated with the keypoint.
  • the attributes of the geometric features may also include a class identifier representing an image cluster to which the image belongs.
  • the method 700 continues at the next step 705, where feature descriptors are determined based on the attribute data determined at step 703.
  • the feature descriptors may be determined at step 705 using any suitable method.
  • the feature descriptors are determined at step 705 using the ORB algorithm as described in the article by Rublee, E., Rabaud, V., Konolige, K., & Bradski, G. (2011, November), entitled "ORB: an efficient alternative to SIFT or SURF", In Computer Vision (ICCV), 2011 IEEE International Conference on (pp. 2564-2571). IEEE, hereby incorporated by reference in its entirety as if fully set forth herein.
  • the feature descriptors may be determined based on keypoints determined at step 703.
  • Matching features in a selected image to features in a template image may be difficult.
  • the feature matching may be difficult since, for a feature in the selected image to match a feature in the template image, the features must be in exactly the same position and orientation in both images.
  • the feature descriptors are determined at step 705 to allow for a greater degree of flexibility when comparing images.
  • the determined feature descriptors may be scale invariant, so that a feature in the selected image will match another feature in an image stored in the image database even if the feature in the database image is larger or smaller than the feature in the selected image.
  • the determined feature descriptors may also be rotation invariant, so that a feature in the selected image will match another feature in the database image even if the feature in the database image has been rotated from a stored rotation of the feature in the selected image.
  • the determined feature descriptors are used for performing feature matching as the features in the selected image are unlikely to perfectly match the features of the database images.
  • the method 700 concludes at the next step 707, where the feature descriptors are stored in the database configured within the hard disk drive 310, for use in matching the image.
  • the method 800 of comparing images, as executed at step 111, will be described in detail below with reference to Fig. 8.
  • the method 800 is implemented as one or more software code modules of the software application program 333 resident in the hard disk drive 310 and being controlled in its execution by the processor 305.
  • each step of the method 800 may be implemented as a separate software code module.
  • one or more steps of the method 800 may be executed within a smart-phone, tablet device or similar portable computing device connected to the network 220.
  • the method 800 begins at step 801, where the image selected at step 103 is accessed by the software application program 333 from the storage devices 309 of the server computer module 300.
  • the accessed selected image may be stored in the RAM memory 306.
  • the software application program 333 also accesses the attributes of the selected image, such as the HSV histograms and feature descriptors, determined in accordance with the method 700. Again, the accessed attributes of the selected image may be stored in the RAM memory 306.
  • the following steps of the method 800 are used for comparing the attributes determined for the selected image (e.g., as determined in the methods 400, 500 and 600) to data (e.g., metadata, feature descriptors) representing one or more further images stored within the image database configured within the hard disk drive 310 or memory 306.
  • steps 802 and 804 are performed in parallel. Similarly, the other steps of the method 800 may be performed in parallel. In another arrangement, steps 802 and 804, and the other steps of the method 800 may be performed sequentially.
  • the software program 333 under execution of the processor 305, performs a histogram comparison to compare the three (3) different channels (i.e., HSV) of the selected image as determined at step 703 to data representing the channels of one or more images in the image database.
  • Data for the database images used at step 802 may be stored in the RAM memory 306 prior to execution of the step 802.
  • any database image having histograms with a cumulative "similarity score" that exceeds a predetermined threshold is added to a "candidate list" stored in the RAM memory 306.
  • the similarity score indicates how similar the selected image is to one of the database images.
  • the similarity score indicates how similar colours in a quantized selected image are to colours in a quantized database image.
  • the similarity score may be 0.8.
  • the similarity score may be 1.0.
  • the similarity score may be negative.
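The cumulative similarity score and candidate list described above may be sketched as follows. The use of normalised histogram intersection as the per-channel comparison, and the 2.4 cumulative threshold, are assumptions of the sketch; the description does not fix the comparison metric:

```python
# Illustrative sketch: score each database image against the selected image
# with a per-channel histogram intersection, sum the three HSV channel
# scores, and add any image whose cumulative score exceeds a predetermined
# threshold to the candidate list.

def intersection(h1, h2):
    """Normalised histogram intersection: 1.0 identical, 0.0 disjoint."""
    return sum(min(a, b) for a, b in zip(h1, h2)) / float(sum(h1))

def build_candidate_list(selected, database, threshold=2.4):
    """Return names of database images whose cumulative score exceeds threshold.

    `selected` is a list of three channel histograms; `database` maps image
    names to their three channel histograms.
    """
    candidates = []
    for name, channels in database.items():
        score = sum(intersection(s, d) for s, d in zip(selected, channels))
        if score > threshold:
            candidates.append(name)
    return candidates
```

With three channels, the cumulative score ranges from 0.0 to 3.0, so a threshold of 2.4 demands strong agreement on all three channels.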
  • the software program 333 under execution of the processor 305, determines a generalised feature descriptor (e.g., a GIST descriptor) of the selected image.
  • the generalised feature descriptor may be determined at step 804 using any suitable method.
  • the generalised feature descriptor is determined at step 804 using the method described in the article by Oliva, Aude, and Antonio Torralba, entitled "Modeling the shape of the scene: A holistic representation of the spatial envelope", International Journal of Computer Vision 42, no. 3 (2001): 145-175, hereby incorporated by reference in its entirety as if fully set forth herein.
  • the generalised feature descriptor may be determined based on a computational model of the recognition of real world scenes that bypasses segmentation and processing of individual objects or regions.
  • the generalised feature descriptor may be determined based on a very low dimensional representation of a scene in the selected image, that is referred to as a Spatial Envelope, using a set of perceptual dimensions (e.g., naturalness, openness, roughness, expansion, ruggedness) that represent the dominant spatial structure of a scene.
  • the generalised feature descriptor determined at step 804 is compared to feature descriptors of one or more images in the image database. Data for the database images used at step 804 may be stored in the RAM memory 306 prior to execution of the step 804.
  • at step 805, any database image found to be similar to the selected image at step 804 is added to a "candidate list" stored in the RAM memory 306.
  • the software program 333, under execution of the processor 305, may be used in the method 800 for performing optical character recognition (OCR) on the selected image to detect any text on the object corresponding to the item as determined in the method 400.
  • text may be detected on the cover of the item.
  • Any suitable OCR method may be used.
  • the OCR library "Tesseract" is used to scan the object in the selected image to detect text. Then, any detected text that matches a word in an ordinary dictionary in addition to any custom defined words (e.g., Spider-man) is used to perform a full fuzzy string search on the data representing the images in the image database.
  • Any image in the image database having associated data (e.g., metadata, feature descriptors) that matches the detected text is added to the candidate list.
  • only images in the image database having associated data which matches the detected text according to a predetermined threshold are added to the candidate list.
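The fuzzy string search over detected text might be sketched with an edit-distance ratio as follows. The Levenshtein-ratio formulation and the 0.8 threshold are assumptions of the sketch; the description mentions fuzzy matching (e.g., via the Tesseract OCR output) without fixing a particular metric:

```python
# Illustrative sketch: fuzzy-match OCR'd text against database titles using a
# Levenshtein edit-distance ratio, keeping titles above a threshold.

def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def fuzzy_matches(detected_text, titles, threshold=0.8):
    """Titles whose similarity ratio to the detected text exceeds threshold."""
    results = []
    for title in titles:
        longest = max(len(detected_text), len(title))
        ratio = 1 - edit_distance(detected_text.lower(), title.lower()) / longest
        if ratio > threshold:
            results.append(title)
    return results
```

For example, OCR output "Spiderman" matches the stored title "Spider-man" (one missing hyphen) but not "Superman".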
  • a comic window may also be detected on the selected image.
  • a comic window is typically a white region, measuring approximately 4x2cm, positioned on a comic.
  • the comic window is positioned on the upper half of a comic.
  • the comic window may alternatively be positioned anywhere on a comic such as on the upper left hand side of a comic.
  • the software program 333 may be used in the method 800 for detecting a barcode region in the selected image.
  • a barcode region may be detected by scanning the selected image and applying adaptive thresholding followed by edge detection on the selected image. Any edges detected in the selected image having similar dimensions to a barcode are isolated if the edges form a rectangular region.
  • the detected rectangular region is determined to be a barcode if the rectangular region has a high spatial frequency in the vertical or horizontal domain and is positioned on the edges of an object representing an item.
  • the barcode region may be detected by applying a Sobel operator to the gray scale image (determined as described above) using a median blur filter.
  • any barcodes detected are typically Universal Product Code (e.g., UPC_A) barcodes.
  • the barcodes may be European Article Number (e.g., EAN13/ISBN) barcodes or matrix barcodes (e.g., Quick Response code, Data Matrix code).
  • the barcode region may be read using a suitable barcode reading method to determine a code represented by the barcode.
  • the detected barcode region may be upsampled before being read.
  • the code represented by the barcode may be compared to the barcodes associated with one or more images in the image database using fuzzy string matching. Any image in the image database having an associated barcode that matches the barcode determined as described above is added to the candidate list.
  • fuzzy string matching may be used to determine a series and/or a publisher for the item. Any determined series or publisher details for the item may be compared to data associated with the images stored in the image database to generate the candidate list.
  • the software application program 333, under execution of the processor 305, may be used in the method 800 to perform any suitable face detection method on the selected image.
  • for example, a Haar cascade classifier may be used on the selected image.
  • the software program 333 may use eigenfaces stored in a character database, configured within the RAM memory 306, to determine and identify any faces in the selected image. Then, details of any faces determined using the face detection method are compared to data (e.g., metadata, feature descriptors) associated with one or more images in the image database. Any image in the image database having an associated face that matches any faces determined as described above is added to the candidate list.
  • optical character recognition, barcode detection and face detection described above may be performed in parallel or sequentially with the other steps 802, 803, 804 and 805 of the method 800 described above.
  • the method 800 continues at the next step 806 following either of steps 803 and 805, where the selected image is compared to each of the images in the candidate list to determine any image in the candidate list that matches the selected image.
  • the feature descriptors of the selected image are compared to the feature descriptors of each of the images in the candidate list using any suitable method. In one arrangement, the feature descriptors are compared using a k-nearest neighbour algorithm.
  • any matching image in the image database is displayed on the display 214 at step 115. Further, the database image is displayed together with data (e.g., metadata, feature descriptors) associated with the matched image, where the data is retrieved from the image database at step 7.
  • the method 100 allows a seller or buyer to determine attributes of an item in order to identify the item, without the seller or buyer being required to have an intimate knowledge of the item.
  • the method 100 allows the seller or buyer to determine attributes of the item, such as the name, age, origin or value of the item, in order to list the item on an e-commerce website for sale.
  • the method 100 also allows the seller or buyer to list the item on an e-commerce website in a catalogue of items belonging to the seller or buyer.
  • the method 100 also allows the buyer to locate where the item may be purchased.
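The Sobel-plus-median-blur step described above can be sketched in pure Python. This is an illustrative stand-in for an OpenCV-style pipeline, not the implementation from the specification: the 3×3 median filter, the grayscale list-of-rows image representation, and the gradient threshold of 100 are all assumptions chosen for the example. A region of tightly packed vertical bars produces a high mean horizontal-gradient magnitude, which is the "high spatial frequency" cue used to flag a candidate barcode region.

```python
from statistics import median

# 3x3 horizontal Sobel kernel: responds strongly to vertical edges,
# i.e. to the bars of a one-dimensional barcode.
SOBEL_X = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]

def median_blur(img):
    """3x3 median filter over a grayscale image (list of pixel rows).
    Border pixels are left unchanged for simplicity."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = median(img[y + dy][x + dx]
                               for dy in (-1, 0, 1) for dx in (-1, 0, 1))
    return out

def mean_sobel_x(img):
    """Mean absolute horizontal Sobel response over the interior pixels."""
    h, w = len(img), len(img[0])
    total = count = 0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(SOBEL_X[dy + 1][dx + 1] * img[y + dy][x + dx]
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            total += abs(gx)
            count += 1
    return total / count

def looks_like_barcode(region, threshold=100.0):
    """Flag a region whose horizontal spatial frequency is high."""
    return mean_sobel_x(median_blur(region)) > threshold
```

A region of two-pixel-wide alternating stripes scores far above the threshold, while a flat region scores zero.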
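Once a barcode region has been read, the decoded digit string can be sanity-checked before it is compared against the image database. The check-digit rule below is the standard EAN-13 algorithm (a UPC-A code is an EAN-13 with a leading zero); it is a property of the barcode formats named above, not anything specific to this application.

```python
def ean13_check_digit(first12: str) -> int:
    """Check digit for the first 12 digits of an EAN-13 code:
    digits in odd positions weigh 1, digits in even positions weigh 3."""
    total = sum(int(d) * (1 if i % 2 == 0 else 3)
                for i, d in enumerate(first12))
    return (10 - total % 10) % 10

def is_valid_ean13(code: str) -> bool:
    """True if `code` is 13 digits and its final check digit is consistent."""
    return (len(code) == 13 and code.isdigit()
            and ean13_check_digit(code[:12]) == int(code[12]))
```

For example, the ISBN-13 9780306406157 validates, while the same string with its last digit altered does not.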
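The fuzzy string matching used to compare a decoded code against the barcodes associated with images in the database can be illustrated with Python's standard difflib. The 0.9 similarity threshold is an assumption made for this example, chosen so that a single misread digit in a 13-digit code still matches (12 of 13 characters gives a ratio of about 0.92).

```python
import difflib

def fuzzy_match(code, candidates, threshold=0.9):
    """Return (candidate, ratio) pairs whose similarity to `code` meets
    the threshold, best first.  A tolerant comparison like this survives
    a single misread digit in an otherwise correct barcode."""
    scored = [(c, difflib.SequenceMatcher(None, code, c).ratio())
              for c in candidates]
    return sorted(((c, r) for c, r in scored if r >= threshold),
                  key=lambda cr: cr[1], reverse=True)
```

Any database barcode surviving the threshold would have its image added to the candidate list; an exact match scores a ratio of 1.0.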
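The descriptor comparison at step 806 can be sketched as a brute-force k-nearest-neighbour matcher. The ratio test below (accepting a match only when the nearest candidate descriptor is clearly closer than the second nearest) is a common heuristic assumed for illustration; the specification states only that a k-nearest-neighbour algorithm is used. Descriptors are represented here as plain tuples of floats.

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

def knn_match_count(query_desc, cand_desc, ratio=0.75):
    """Count query descriptors whose nearest candidate descriptor is
    clearly closer than the second nearest (a ratio test)."""
    good = 0
    for q in query_desc:
        d = sorted(dist(q, c) for c in cand_desc)
        if len(d) >= 2 and d[0] < ratio * d[1]:
            good += 1
    return good

def best_candidate(query_desc, candidates):
    """candidates maps an image name to its descriptor list; the image
    with the most good matches is taken as the best match."""
    return max(candidates,
               key=lambda name: knn_match_count(query_desc, candidates[name]))
```

In practice the candidate list would hold feature descriptors retrieved from the image database, and the best-matching image would then be displayed with its associated data.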

Abstract

Disclosed is a method of displaying data for determining attributes of an item. One or more images of the item are received. At least one image selected from the received images is analysed to determine one or more attributes of the selected image. The determined attributes are compared to data associated with one or more further images. The data associated with at least one of the further images is retrieved based on the comparison. The retrieved data is displayed for determining the attributes of the item.
PCT/AU2014/000536 2013-05-31 2014-05-21 Procédé, appareil et système permettant d'afficher des données WO2014190374A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2014273829A AU2014273829A1 (en) 2013-05-31 2014-05-21 Method, apparatus and system for displaying data

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2013901959 2013-05-31
AU2013901959A AU2013901959A0 (en) 2013-05-31 Method, apparatus and system for displaying data

Publications (1)

Publication Number Publication Date
WO2014190374A1 (fr)

Family

ID=51987765

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2014/000536 WO2014190374A1 (fr) 2013-05-31 2014-05-21 Procédé, appareil et système permettant d'afficher des données

Country Status (2)

Country Link
AU (1) AU2014273829A1 (fr)
WO (1) WO2014190374A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020087426A1 (en) * 2000-12-28 2002-07-04 Fujitsu Limited Online shopping method and system
US20090304267A1 (en) * 2008-03-05 2009-12-10 John Tapley Identification of items depicted in images
US20120128239A1 (en) * 2010-11-18 2012-05-24 Ebay Inc. Image quality assessment to merchandise an item
US20120183185A1 (en) * 2008-10-02 2012-07-19 International Business Machines Corporation Product identification using image analysis and user interaction

Also Published As

Publication number Publication date
AU2014273829A1 (en) 2015-12-10

Similar Documents

Publication Publication Date Title
US9336459B2 (en) Interactive content generation
US11093748B2 (en) Visual feedback of process state
US20170278289A1 (en) Apparatus, systems, and methods for integrating digital media content into other digital media content
Kim et al. Spatiotemporal saliency detection and its applications in static and dynamic scenes
US20190362193A1 (en) Eyeglass positioning method, apparatus and storage medium
US20150030239A1 (en) Training classifiers for deblurring images
US20210141826A1 (en) Shape-based graphics search
US10134149B2 (en) Image processing
Joshi OpenCV with Python by example
Hossein-Nejad et al. Clustered redundant keypoint elimination method for image mosaicing using a new Gaussian-weighted blending algorithm
Kapur et al. Mastering opencv android application programming
CN111695971B (zh) Article recommendation method, apparatus and device, and computer storage medium
CN107423739B (zh) Image feature extraction method and apparatus
US20210166028A1 (en) Automated product recognition, analysis and management
CN115601586A (zh) Label information acquisition method and apparatus, electronic device, and computer storage medium
CN110659923A (zh) Information display method and apparatus for a user terminal
Hung et al. Phase fourier reconstruction for anomaly detection on metal surface using salient irregularity
WO2014190374A1 (fr) Procédé, appareil et système permettant d'afficher des données
Novozámský et al. Extended IMD2020: a large‐scale annotated dataset tailored for detecting manipulated images
Na et al. Unconstrained object segmentation using grabcut based on automatic generation of initial boundary
CN116486209B (zh) New product recognition method and apparatus, terminal device, and storage medium
CN114529912A (zh) Graphical verification code recognition method and apparatus, electronic device, and readable storage medium
Kong et al. Interactive deformation‐driven person silhouette image synthesis
Latha et al. Advanced Denoising Model for QR Code Images Using Hough Transformation and Convolutional Neural Networks
Medhi et al. A text recognition augmented deep learning approach for logo identification

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14804699; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2014273829; Country of ref document: AU; Date of ref document: 20140521; Kind code of ref document: A)
122 Ep: pct application non-entry in european phase (Ref document number: 14804699; Country of ref document: EP; Kind code of ref document: A1)