WO2008015571A2 - Simulation-assisted search - Google Patents
Simulation-assisted search
- Publication number
- WO2008015571A2 (PCT/IB2007/003047)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- simulation
- search
- computer program
- program product
- visual attributes
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/53—Querying
- G06F16/532—Query formulation, e.g. graphical querying
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/951—Indexing; Web crawling techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/953—Querying, e.g. by the use of web search engines
- G06F16/9538—Presentation of query results
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
Definitions
- the present invention relates to methods and systems for creating queries for search engines.
- search engines have become increasingly important and increasingly sophisticated tools for locating online content.
- Advances in search engine technology have expanded the scope of indexed content, increased the speed of searches, added flexibility to the syntax of user queries, and improved the relevance of search results.
- search remains generally tied to the use of textual input, either through the entry of keywords or through menu-driven specification of search parameters.
- a visually-oriented search system guides a search with non-verbal inputs. Instead of specifying discrete attributes (words) as input to a search engine, a user may create a visual model of a desired end result and apply the model as a generalized input from which discrete attributes are extracted for submission to conventional search engines.
- the search may be enhanced with a simulation of the visually-created query, and the simulation may be transformed into a query suitable for distribution to one or more search engines.
- the query may be refined using domain-specific rules, vocabulary, expert systems, and the like. Search results may be browsed by a user, or employed to further refine subsequent searches.
- FIG. 1 shows a conceptual block diagram of a visually-oriented search
- FIG. 2 shows entities that may participate in a visually-oriented search system
- FIG. 3 shows a user interface for a visually-oriented search system
- FIG. 4 shows a user interface for a visually-oriented search system
- FIG. 5 shows a user interface for a visually-oriented search system
- FIG. 6 shows a user interface for a visually-oriented search system
- FIG. 7 shows a user interface for a visually-oriented search system
- FIG. 8 shows a high-level flow chart of a process for simulation-assisted search.
- the systems described herein may assist a user in building a desired model by providing visualization and domain-specific expert systems.
- the user may adjust the model visually and interactively using visual elements selected from a palette of options displayed within a user interface. This removes or diminishes the need for the user to be a domain expert, or to be familiar with the vocabulary used to describe various aspects of an item or type of item. This may be particularly useful where, for example, a user sees a new style of clothing or feature and would like to search for clothes having that feature without knowing any of the popular or trade names for the feature.
- a set of discrete, searchable, domain-specific attributes may be extracted from the simulation model (or from the visual attribute selections used to create the model).
- a search can then be performed directly using the extracted search attributes, or a query such as a textual search string may be generated for distribution to various search engines.
- the search string may also be expanded through the use of domain-specific knowledge as applied, for example, through an expert system.
- the search may explicitly or implicitly target results tagged with corresponding descriptions or metadata.
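- To make the flow above concrete, the following minimal Python sketch shows one possible (hypothetical) way of flattening explicit questionnaire selections into attribute-value pairs and joining them into a plain keyword query; the function and field names are illustrative and not taken from the patent.

```python
def extract_attributes(selections):
    """Flatten explicit questionnaire selections into attribute-value pairs."""
    return [(attr, value) for attr, value in selections.items() if value]

def build_query(pairs):
    """Join attribute values into a simple keyword string for a search engine."""
    return " ".join(value for _, value in pairs)

# Example selections a user might make for footwear (illustrative only).
selections = {"object": "shoe", "heel": "high heel", "color": "red", "strap": None}
pairs = extract_attributes(selections)   # [('object', 'shoe'), ('heel', 'high heel'), ('color', 'red')]
print(build_query(pairs))                # shoe high heel red
```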
- Figure 1 shows a conceptual block diagram of a visually-oriented search.
- the system 100 may include a user interface 110 that provides a questionnaire 112 and a three-dimensional model simulation 114 that applies results of the questionnaire.
- the system 100 may provide processing for search attribute extraction 132, search string generation 134, and a search engine 136.
- domain-specific knowledge 120 may be deployed generally through the system to support various search functions.
- the domain-specific knowledge 120 may be implemented, for example, as rules and expert systems 122, a database of suitable three-dimensional sub-entities 124, and semantic data 126 such as synonyms, word mappings, exclusions, and so forth.
- the user interface 110 may be, for example, any computer user interface suitable for presentation on a client device such as a personal computer, laptop computer, cellular phone, personal digital assistant, public kiosk, and so forth.
- the user interface 110 may employ Web technologies such as HTML, Java, JavaScript, J2ME, J2SE, J2EE, Flash Media, AJAX, and any other technologies for local and/or remote processing and presentation of a user interface, as well as any proprietary technology suitable for use with the systems described herein.
- the questionnaire 112 generally operates to receive user input concerning visual attributes.
- visual attributes may be understood as types that, together with specific values, form attribute-value pairs (such as a visual attribute of "color" with a value of "red").
- a visual attribute may also or instead be understood as a type and a value that together serve as an attribute-value pair to describe some visual aspect of a physical object such as an article of clothing.
- certain values may weakly or strongly imply a particular attribute type (such as "high heel” suggesting a heel type) such as to render an explicit attribute type unnecessary.
- all such meanings are intended to fall within the scope of the term "visual attribute,” unless a more specific meaning is provided or otherwise clear from the context.
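- As a small illustration of the attribute-value notion (and of values that imply their own type), one might represent visual attributes roughly as follows; the data model is an assumption made for exposition, not a structure defined by the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class VisualAttribute:
    value: str                        # e.g. "red" or "high heel"
    attr_type: Optional[str] = None   # e.g. "color"; may be omitted when the value implies it

# Values that strongly imply an attribute type (illustrative examples).
IMPLIED_TYPES = {"high heel": "heel", "v-neck": "collar"}

def resolve_type(attr: VisualAttribute) -> str:
    return attr.attr_type or IMPLIED_TYPES.get(attr.value, "unknown")

print(resolve_type(VisualAttribute("red", "color")))  # color
print(resolve_type(VisualAttribute("high heel")))     # heel
```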
- the questionnaire 112 may present a menu of selections to a user within the user interface 110. This may include checkboxes, radio buttons, drop-down lists, or any other controls for receiving user input. Where visual features are being selected, such as a car shape (e.g., sedan, wagon, coupe, SUV, and so forth), a user may be presented with abstract graphical representations of the various features from which to select the desired feature. Other visual aspects may be amenable to different input means, such as sliders to select various bodily dimensions on a graphically displayed mannequin, or a continuous color palette from which to interactively select color. While any features, attributes, or other information may be specified in the questionnaire 112, three general areas of information are described below.
- the questionnaire 112 may acquire personal information.
- the system 100 may be applied to specify clothing, in which case, relevant personal information may include body type, body dimensions, body shape, height, weight, skin tone, gender, hair color, hair length, hair style, face shape, head shape, facial hair, muscularity, and so forth.
- personal information may be employed to create a personalized simulation or virtual model upon which clothing selections can be simulated.
- the system 100 may be applied to select appliances for a kitchen.
- personalization information may include an existing kitchen layout, furnishings, flooring, cabinetry, countertops, and so forth, all of which may be used to create a personalized model kitchen in which appliance selections can be simulated.
- the questionnaire 112 may acquire visual attributes of a product. In order to assist a user in selecting suitable visual attributes, a number of possible selections may be presented to the user. For example, for footwear this may include laces, soles, heels, materials, straps, toes, and so forth.
- the questionnaire 112 may also provide high-level guidance, such as by initially requesting a shoe type (e.g., athletic, formal, casual, outdoor), which may be further refined within a sub-type (e.g., for formal footwear, categories for men's and women's shoes, or professional and evening wear) to pre-parameterize visual features. This pre-parameterization may limit the availability of visual attribute selections according to current fashion.
- Visual attributes may be specified in a variety of ways within the questionnaire. For example, color may be specified in textual form by a user text entry, by selecting a color from a list of options, or by selecting a color or range of colors from a color palette. Using the techniques described below, the user's color selection may be translated into one or more keywords corresponding to conventional names, trade names, and/or vendor names for various colors and color schemes.
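- One hedged sketch of such a translation: map a palette color (an RGB triple) to nearby conventional names by nearest-neighbor lookup. The anchor colors and names below are examples only; a real system would draw them from domain-specific knowledge or vendor vocabularies.

```python
COLOR_NAMES = {
    (220, 20, 60): ["crimson", "scarlet"],
    (0, 0, 128): ["navy", "midnight blue"],
    (255, 215, 0): ["gold", "golden yellow"],
}

def nearest_color_keywords(rgb):
    """Return keyword names for the anchor color closest to the selected RGB value."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    anchor = min(COLOR_NAMES, key=lambda c: dist(c, rgb))
    return COLOR_NAMES[anchor]

print(nearest_color_keywords((210, 30, 70)))  # ['crimson', 'scarlet']
```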
- the questionnaire may also acquire non-visual attributes of a product.
- non-visual attributes such as engine type, miles per gallon, and so forth may be relevant for a user. Values for these attributes may be specified through the questionnaire and used as a basis for search in addition to visually specified information. It will also be appreciated that some information may be considered either visual or non-visual, depending on context.
- shirt size may be assumed to be non-visual information if it is assumed that a range of sizes will be available for any product.
- shirt size may be a highly relevant visual attribute if a user is selecting between relatively loose-fitting or tight-fitting clothing and would like to receive visual simulation of size alternatives.
- the three-dimensional model simulation 114 may visually display a simulation of an object along with user-specified visual attributes and/or any personalization data provided in the questionnaire 112.
- the simulation may be incrementally updated as a user makes selections within the questionnaire.
- the simulation may be, for example, a three-dimensional simulation (typically, though not necessarily rendered in two dimensions for display on a conventional computer display or the like), a two-dimensional simulation, an animated simulation, an auditory simulation, a mechanical simulation, a lighting simulation, or any other still or time-based simulation, as well as various combinations of any of the foregoing.
- a user may specify various simulation-specific aspects for the generation and display of the simulation.
- a user may select a motion type for the simulation such as standing, walking, running, sitting, and so forth.
- a user may specify a point of view, lighting, and so forth.
- a simulation may include any number of simulated physical objects.
- a number of clothing items may be concurrently simulated, such as a shirt and a pair of pants.
- other items such as accessories, other apparel, and the like may be included in a single simulation.
- a user may select socks, shoes, hats, handbags, knapsacks, belts, scarves, sunglasses, jewelry, and so forth.
- the simulation may be supplemented with search results from the search engine 136.
- manufacturers or retailers may maintain simulation-compliant data for products. Where this data is available, search results may be displayed in the user interface 110, and simulation-compliant results may be identified with an icon or the like in the search results. A user may select the icon to transfer the attributes of the search result directly to the simulation.
- the systems and methods described herein may be enhanced by adding simulation-compliant attributes to content retrieved by the search engine. These attributes may be added, for example, by retailers who are offering items for sale, or their corresponding wholesalers or manufacturers. The attributes may also, or instead, be created automatically through computerized examination of information that is available for such products.
- the simulation may provide a two-dimensional simulation such as an architectural floor plan or an industrial layout where component footprint (as well as any required buffer space surrounding components) is important.
- art layouts, vertical shelf space, or any other design or purchase decisions driven by substantially two-dimensional constraints can be usefully simulated in two-dimensions.
- non-spatial simulations may also be employed, such as auditory or tactile simulations that permit sensory simulation corresponding to some feature of the subject matter of a search. All such variations are intended to fall within the scope of this disclosure.
- a search attribute extraction module 132 may extract attributes from the simulation for searching. In one aspect, this may include an analysis of explicit user selections such as the visual attributes selected in the questionnaire 112. In another aspect, this may include visual analysis of the simulation result.
- a search string generation module 134 may convert the attributes into a search string suitable for presentation to a remote search engine 136. This may include converting the search attributes into a suitable syntax for submission to one or more search engines.
- the search engine may be any network-accessible search engine including wide scale public search engines such as Google, Yahoo, AltaVista, and the like.
- the search engine may also, or instead, include specialty search engines at retail sites hosted by general retailers or branded product companies.
- the search engine may also, or instead, include auction web sites, product selection sites, product configuration sites, product review sites, or any other electronic commerce site or other web site that responds to search requests.
- the search engine may also or instead include a local search engine created for use with the search systems described herein.
- the search engine may employ any suitable algorithms known in the art including textual search algorithms such as proximity searching, string matching, word stem searching, fuzzy logic, and so forth.
- a search engine may also, or instead, employ spatial searching based upon the simulation model using, e.g., feature vectors, neural networks, skeletal graph techniques, and so forth.
- a computer-generated search string may take full advantage of the query syntax of each search engine addressed, to the extent that features of the syntax are known. For example, different search engines provide different grammars and features relating to wild cards, word stems, word variants, Boolean operators, proximity searching, synonyms, exclusions, and so forth. While a human user would typically not know how to optimize a query for any particular search engine, the computer-generated search strings may be tailored to the features and syntax of each search engine.
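- A minimal sketch of engine-specific query formatting is shown below. The two grammars (quoted phrases with OR, parenthesized phrases with |) are invented for illustration and do not describe the syntax of any actual search engine.

```python
ENGINE_SYNTAX = {
    "engine_a": {"phrase": '"{}"', "or": " OR "},
    "engine_b": {"phrase": "({})", "or": " | "},
}

def format_query(terms, synonyms, engine):
    """Render attribute-derived terms in the grammar of a given engine."""
    syntax = ENGINE_SYNTAX[engine]
    parts = []
    for term in terms:
        variants = [term] + synonyms.get(term, [])
        rendered = [syntax["phrase"].format(v) if " " in v else v for v in variants]
        parts.append("(" + syntax["or"].join(rendered) + ")" if len(rendered) > 1 else rendered[0])
    return " ".join(parts)

print(format_query(["shirt", "leg-of-mutton sleeve"], {"shirt": ["blouse", "tee"]}, "engine_a"))
# (shirt OR blouse OR tee) "leg-of-mutton sleeve"
```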
- Domain-specific knowledge and content 120 may be used throughout the system described above.
- domain-specific knowledge may be employed in forming a questionnaire for a particular subject matter area, for generating simulations, for extracting search attributes from a simulation, generating search strings, and selecting suitable search engines.
- One useful form of domain-specific knowledge for some applications is a dictionary or taxonomy of keywords for visual attributes.
- Other domain-specific knowledge may relate to relationships among visual attributes. This may be implemented by ranking choices in the questionnaire 112 where, for example, certain cuffs and collars are usually but not exclusively used together for clothing.
- the system may employ a rules engine and/or expert systems 122, referred to herein interchangeably unless a more specific meaning is provided or otherwise clear from the context.
- an expert system incorporates subject-specific knowledge or analytical skills from human experts, which may be implemented as a set of rules for analyzing and acting on inputs.
- the rules engine may provide known expert system functionality using, for example, Prolog to parse rules and maintain an associated knowledge base.
- the rules may be based on context, such as personalization information provided above, and existing entities/sub-entities within the context, along with current selections of values for visual attributes, placement, configuration, and so forth.
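- The toy rule engine below suggests how such context-driven rules could be expressed; it is plain Python rather than the Prolog-style rule parsing mentioned above, and both rules are invented examples.

```python
RULES = [
    # If a polo collar is chosen and no sleeve yet, default to a short sleeve.
    (lambda ctx: ctx.get("collar") == "polo" and ctx.get("sleeve") is None,
     lambda ctx: ctx.setdefault("sleeve", "short sleeve")),
    # If an oven is placed in a kitchen with no hood, suggest adding one.
    (lambda ctx: ctx.get("object") == "oven" and "hood" not in ctx.get("items", []),
     lambda ctx: ctx.setdefault("suggestions", []).append("hood")),
]

def apply_rules(ctx):
    for condition, action in RULES:
        if condition(ctx):
            action(ctx)
    return ctx

print(apply_rules({"collar": "polo"}))
print(apply_rules({"object": "oven", "items": []}))
```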
- the context may define available sources for items, so that, for example, a kitchen outfitted with items from one retailer (e.g., Home Depot) may be compared to the same kitchen outfitted with items from a competing retailer (e.g., Lowe's).
- the systems and methods described herein may also support a simulation-based product comparison that visually simulates two or more products for visual comparison while providing a detailed comparison of other objective criteria such as price, delivery time, and so forth.
- the knowledge base 120 may generally establish descriptive data 124 for entities and sub-entities known within the system, and store associated attributes of each sub-entity and relationships among sub-entities.
- this may include physical object types (e.g., shirt, pants, dress), visual attributes (e.g., collar, sleeves, hemline), and values for visual attributes (e.g., for a collar, the values may include v-neck, crew neck, polo, etc.). More generally, it will be understood that a physical object may be described with reference to one or more visual attributes, each of which may have a variety of values, and that this taxonomy of properties may be represented in the descriptive data 124 of the knowledge base 120.
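- A fragment of such descriptive data might look like the dictionary below; the object types, attributes, and values are examples only and are not the patent's actual taxonomy.

```python
TAXONOMY = {
    "shirt": {
        "collar": ["v-neck", "crew neck", "polo"],
        "sleeve": ["short sleeve", "bishop sleeve", "kimono sleeve"],
    },
    "dress": {
        "hemline": ["knee-length", "ankle-length"],
        "neckline": ["sweetheart", "v-neck"],
    },
}

def allowed_values(object_type, attribute):
    """Look up which values the questionnaire should offer for a given attribute."""
    return TAXONOMY.get(object_type, {}).get(attribute, [])

print(allowed_values("shirt", "collar"))  # ['v-neck', 'crew neck', 'polo']
```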
- Semantic data 126 may be provided for construction of more thorough search strings.
- semantic data 126 may encode semantic content to augment or constrain search parameters extracted from the simulation.
- a dictionary, thesaurus, or the like may be employed to identify related or similar terms for searching. This may be based upon domain-specific knowledge for a search, or more narrowly upon search keywords.
- semantic data 126 may conform to any available standards for describing terms and relationships for network content.
- the "semantic web” refers broadly to a philosophy, design principles, and a variety of enabling technologies for describing content in a manner amenable to use and interpretation by software agents.
- Existing formal specifications for the semantic web include (among others) the Resource Description Framework ("RDF"), the RDF Schema, and the Web Ontology Language, all of which seek to formally describe terms and relationships within a knowledge domain.
- tools may be provided to vendors or other sources of product information to permit labeling consistent with semantic web principles.
- a tool may, for example, provide a predetermined ontology for product description. This may guide a vendor's selection of visual attribute types and values within a hierarchy of existing terms/concepts for a product.
- new content may be released with metadata that is pre-configured for efficient use with a visually-assisted search system.
- the product may be released for use with the search system, or any other semantic-web-compliant systems, simply by publishing a product image and the associated metadata to a network-accessible location.
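- As a rough, non-normative sketch, a vendor's published metadata might resemble the JSON record below; the property names and serialization are assumptions made for illustration rather than a formal RDF or OWL vocabulary.

```python
import json

product_metadata = {
    "@type": "Shirt",                         # hypothetical product class
    "name": "Example polo shirt",             # hypothetical product name
    "image": "https://example.com/shirt.png",
    "visualAttributes": {
        "collar": "polo",
        "sleeve": "short sleeve",
        "color": "red",
    },
}

# Publish the metadata next to the product image at a network-accessible location.
with open("shirt-metadata.json", "w") as fh:
    json.dump(product_metadata, fh, indent=2)
```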
- tools may be provided for preparing a simulation-compliant model of the product in the form of a software developer kit, web-based application programming interface, or the like.
- the details of such a system will necessarily depend on the particular product, simulation system, and other technical details.
- preparing a distributable version of a simulation software development tool kit (or a web-based application programming interface) is well within the level of ordinary skill in the relevant programming arts, and is intended to fall within the scope of the systems and methods described herein.
- network content may be tagged with descriptive metadata corresponding to, for example, a product name, a product type or class, product description(s), product visual attributes, and so forth that might be responsive to a search.
- Tags may include information such as make, model, color, price, finish, materials, part numbers or SKUs, sizes, features, characteristics, and narrative descriptions.
- tags may be more specifically tailored to content so that, for example, clothing may have readily discernible descriptive tags for visual attributes such as fabric, color, size, sleeve type, neckline type, and so forth.
- Other information that might be relevant to a purchasing decision may also be included such as brand, year of make, store location, care instructions, and retailer.
- semantically-oriented tags may be provided to capture subjective features such as style, item popularity, and so forth.
- retailers may coordinate with the search system 100 so that a common vocabulary is provided for searching, and the retailer may be provided with a tagging tool as generally described above.
- the tagging tool may assist in correctly tagging inventory with attribute values that correspond to searches developed using the visually-assisted and/or simulation assisted search systems described herein.
- the source of tags may be important, and the system may provide for a tagging structure that recognizes tag source as a parameter for searching and displaying results.
- a manufacturer may explicitly tag products with metadata including product names and the like that uniquely identify the manufacturer's products.
- the manufacturer may also provide descriptive content. This tagging may or may not be perceived as reliable by consumers, but the descriptive tags from a manufacturer may be clearly identified so that a user can independently determine what weight to afford these manufacturer-derived descriptions.
- Retailers may also separately tag products, again subject to various user interpretations of reliability. Objective reviewers may be afforded a different tagging hierarchy, so that metadata from reviews by various individuals or institutions can be separately considered.
- tagging may be controlled or supported by a trusted third party so that the source of tags can be verified or otherwise examined for authenticity with reference to an external authority.
- a commercial trusted third party such as VeriSign, Entrust or the like may be used to manage certificates.
- Tags may also or instead be community-based, such as through social networking sites that permit ad hoc tagging of content. While posing potential reliability problems, this source of metadata may be uniquely suited to identifying popular visual attributes, or identifying new descriptive terms appearing in popular culture.
- Community-based or other consumer-level tagging may accommodate a wide array of annotations including rankings, photographs, descriptions, comments, evaluations, and so forth.
- tags from a manufacturer may be afforded a highest priority. This prioritization is based upon an assumption that a manufacturer directly specified the tags for a desired association, based upon a specific product name, stock keeping unit, product code, or the like.
- Another level, such as a second highest priority may be accorded to search results with tags that closely or exactly match the search string extracted from the model.
- Another level may be provided for tags created by a qualified organization such as a retailer.
- such tags may carry a presumption of reliability, although not tied specifically to a product manufacturer.
- Another level or weight may be accorded to search results with tags that loosely match the search string based upon, e.g., keywords, synonyms, and the like.
- Another level or weight may be accorded to tags created by individual consumers or social networking sites.
- Another level or weight may be accorded to other content, such as images matching the simulation model based on pattern recognition.
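- One simple way to act on this ordering is to weight results by tag provenance, as in the sketch below; the numeric weights are arbitrary illustrations rather than values prescribed by the system.

```python
SOURCE_WEIGHT = {
    "manufacturer": 1.0,         # highest priority
    "exact_match": 0.9,          # tags exactly matching the extracted search string
    "retailer": 0.8,             # qualified organization
    "loose_match": 0.6,          # keyword/synonym matches
    "consumer": 0.4,             # individual consumers or social networking sites
    "pattern_recognition": 0.2,  # image matches against the simulation model
}

def rank_results(results):
    """Sort results so that more trusted tag sources appear first."""
    return sorted(results, key=lambda r: SOURCE_WEIGHT.get(r["tag_source"], 0.0), reverse=True)

hits = [{"title": "Red polo", "tag_source": "consumer"},
        {"title": "Red polo shirt", "tag_source": "manufacturer"}]
print([h["title"] for h in rank_results(hits)])  # manufacturer-tagged hit first
```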
- a suggestion or recommendation engine 128 may be provided that generates recommendations based upon information in the knowledge base 120.
- the recommendation engine may suggest items that go well together, additional features suitable for a context (e.g., if you select an oven, you may need a hood, or you may need to remove overhead cabinets, and perhaps a microwave could suitably be added, or moved from another location).
- the recommendation engine 128 may also or instead identify related items based upon purchase history for other users. Numerous other suggestion and recommendation techniques are known in the art and may be suitably incorporated into the knowledge base 120 of the system 100 described herein.
- search systems and supporting components may be implemented in computer executable code that supports operation of a web server to provide a web-based client-server deployment of the system 100 described herein.
- Other deployments include, for example, a web application, a closed in-store system for use at a physical retail location, an application programming interface (or collection of API's) for use in third-party web application integration, one or more services for use in a services-oriented architecture, and so forth. All such permutations are intended to fall within the scope of this disclosure.
- systems described above may be local or distributed, or some combination of these.
- the domain-specific content 120 or portions thereof may be locally deployed on a client device, or may be stored at a remote, network-accessible location for use by a server or client.
- Other features such as the questionnaire, the simulation, the search attribute extraction, and so forth, may similarly be deployed locally at a client, or remotely accessed for use in the systems described herein.
- the search engine(s) 136 would typically be remote from the client, particularly in applications intended for use with third-party search engines; however, this is not strictly required, and in some embodiments the search engine, or portions thereof, may reside locally at a client device that provides the user interface 110.
- the entire domain-specific knowledge base, search engine, simulation, and user interface may be deployed on a single stand-alone device (or a device that is networked to receive updates and the like).
- the systems described herein may be improved by providing keyword suggestion to a user, such as through suggestions 128 to the questionnaire 112 or a separate window or pop-up within the user interface 110.
- This may, for example, suggest keywords that appear applicable to a user's search such as neighboring concepts or synonyms, based upon domain-specific knowledge 120 within the system 100, or based upon an analysis of tags obtained from a social networking site.
- a user may then optionally review keywords and explicitly select or exclude particular keywords based upon the user's desired results and understanding of the keywords presented.
- the system may dynamically provide an estimate or actual measure of the number of results in a search result set based upon the user's selections.
- the system may analyze the current search attributes, the current simulation, and any potential search strings derived therefrom to recommend additional parameters for a user. For example, applying domain-specific knowledge, a search for shirts may generate keywords and queries for neighboring concepts such as blouses and tees. These results may be incorporated into a search, or presented to a user for explicit selection of relevant items.
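- The snippet below sketches how neighboring-concept suggestions and a result-count estimate might be surfaced; the neighbor table is a toy example and the count function is a stub standing in for a real search engine call.

```python
NEIGHBORS = {"shirt": ["blouse", "tee"], "sofa": ["couch", "loveseat"]}

def suggest_keywords(selected_terms):
    """Propose related terms the user can explicitly include or exclude."""
    suggestions = []
    for term in selected_terms:
        suggestions.extend(NEIGHBORS.get(term, []))
    return suggestions

def estimate_results(terms, count_fn):
    """count_fn is any callable that returns a hit count for a query string."""
    return count_fn(" ".join(terms))

print(suggest_keywords(["shirt"]))                       # ['blouse', 'tee']
print(estimate_results(["shirt", "red"], lambda q: 42))  # stubbed count
```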
- FIG. 2 shows entities that may participate in a visually-oriented search system.
- the system 200 may include a network 201 interconnecting a client 202 and a number of servers 204-210.
- the network 201 may interconnect a plurality of clients 202 and servers 204-210. In general, any number of clients 202 and servers 204-210 may participate in such a system 200.
- the system may further include one or more local area networks ("LAN") interconnecting clients 202 through a hub (in, for example, a peer network such as a wired or wireless Ethernet network) or a local area network server (in, for example, a client-server network).
- the LAN may be connected to the network 201 through a gateway that provides security to the LAN and ensures operating compatibility between the LAN and the network 201. Any data network may be used as the network 201.
- the network 201 is the Internet, and the World Wide Web provides a system for interconnecting clients 202 and servers 204-210 in a communicating relationship.
- the network 201 may also, or instead, include a cable network (where at least one of the clients 202 would be a set-top box, cable-ready game console, or the like).
- the network 201 may include other networks, such as satellite networks, the Public Switched Telephone Network, WiFi networks, WiMax networks, cellular networks, and any other public, private, and/or dedicated networks that might be used to interconnect devices for transfer of data.
- An exemplary client 202 includes a processor, a memory (e.g. RAM), a bus which couples the processor and the memory, a mass storage device (e.g. a magnetic hard disk or an optical storage disk) coupled to the processor and the memory through an I/O controller, and a network interface coupled to the processor and the memory, such as a modem, digital subscriber line (“DSL") card, cable modem, network interface card, wireless network card, or other interface device capable of wired, fiber optic, or wireless data communications.
- One example of such a client 202 is a personal computer equipped with an operating system such as Microsoft Windows XP, UNIX, Linux, or Apple Computer's OS X, along with software support for Internet communication protocols.
- the computer may also include a browser program, such as Microsoft Internet Explorer, Netscape Navigator, or Firefox, to provide a user interface for access to the network 201.
- while a personal computer is one possible client 202, the client 202 may also or instead include a workstation, a mobile computer, a Web phone, a VOIP device, a television set-top box, an interactive kiosk, a personal digital assistant, a wireless electronic mail device, or any other device capable of communicating over the Internet.
- the term "client" is intended to refer to any of the above-described clients 202 or any other client devices suitable for use with the systems described herein.
- the term "browser" is intended to refer to any of the above browser programs or other software or firmware supporting a user interface for navigating a network such as the Internet.
- An exemplary server 204 includes a processor, a memory (e.g. RAM), a bus which couples the processor and the memory, a mass storage device (e.g. a magnetic or optical disk) coupled to the processor and the memory through an I/O controller, and a network interface coupled to the processor and the memory.
- Servers may be clustered together to handle more client traffic and may include separate servers for different functions such as a database server, an application server, and a Web presentation server.
- Such servers may further include one or more mass storage devices such as a disk farm or a redundant array of independent disks ("RAID") system for additional storage and data integrity.
- Read-only devices, such as compact disk drives and digital versatile disk drives, may also be connected to the servers.
- Suitable servers and mass storage devices are manufactured by, for example, Compaq, IBM, and Sun Microsystems.
- a server 204 may operate as a source of content or services and may provide any associated back-end processing while a client 202 is a consumer of content and services provided by the server 204.
- many of the devices described above may be configured to respond to remote requests, thus operating as a server, and the devices described as servers 204 may operate as clients of remote data sources and services.
- the distinction between clients and servers may blur.
- certain peer-sharing technologies employ "servelets" that act as both clients and servers within a peer-to-peer network.
- the term "server” as used herein is generally intended to refer to any of the above-described servers 204, or any other device that may be used to provide content or services in a networked environment.
- the servers 204 may perform a variety of functions.
- one or more of the servers 204 may provide the knowledge base and expert systems described above to support visually-oriented search. These servers 204 may be accessed by a client 202 during the search process in order to provide questionnaires, simulation, attribute extraction, search string formation, and so forth.
- one or more of the servers 204 may provide search engines including any of the wide area or dedicated search engines described herein.
- one or more of the servers 204 may provide content, such as product listings and information from manufacturers. In another aspect, one or more of the servers 204 may provide transaction engines for financial transactions such as a product purchase. In another aspect, one or more of the servers 204 may provide a bulletin board, on-line classified listings, on-line auctions, or any other services that might generate products potentially responsive to a search. In another aspect, one or more of the servers 204 may provide social networking services such as chat rooms, personalized web pages, discussion groups, web logs, and so forth that might generate relevant metadata for the systems described herein. In one aspect, all of these services may be combined within the user interface of a client 202 to provide an end-to-end search, configuration, and purchase experience.
- a number of examples of user interfaces that may be used to perform a search and review search results are now provided. It will be understood that while no specific interface technology is discussed in the following description, a number of suitable technologies exist for various platforms and devices, any of which may be used for presenting the following user interfaces on an appropriately capable client device including client-side applets, JavaScript (client or server), Java on a client-side Java Virtual Machine, browser plug-ins, AJAX, HTML, J2ME, J2SE, J2EE, Flash Media, Web Services, graphics, audio media, video media, streaming media, and so forth. In addition, various aspects of these interfaces may employ client-side technology, server-side technology, or some combination of these. All such variations suitable for use with the interfaces discussed herein are intended to fall within the scope of this disclosure.
- Figure 3 shows a user interface 300 for a visually-oriented search system.
- the interface 300 may include icons 302, text hyperlinks 304, buttons, or the like for receiving user input.
- a user may select a general subject matter area, topic, product area, or specific object type for further refinement.
- the relevant expert systems and domain-specific knowledge may be selected to guide further user input.
- possible product categories include home decor, appliances, clothing, lawn & garden, tools, jewelry, sport, electronics, shoes, toys, baby, travel, maternity, computers, outdoor, small appliances, camping, and health.
- other subject matter areas may usefully be displayed for selection/refinement including the topics generally described herein.
- the topic selection process may be hierarchical. That is, a top level selection menu may cover, for example, goods, services, and media, or some other set of high-level categories.
- a flat scheme may be preferred so that users are not required to traverse a hierarchy of descriptive categories in order to arrive at an appropriate search domain.
- Figure 4 shows a user interface 400 for a visually-oriented search system. As shown in Fig. 4, once a particular category is chosen, the interface may proceed to present sub-entities 402 within the category. Selections may be received using any suitable user interface tool, including icons, text hyperlinks, text input fields, dropdown lists, check boxes, and so forth. Again referring specifically to the non-limiting example of the figures, a selection of clothing may present clothing types such as car coat, jacket, shirt, vest, dress, skirt, short, jeans, and so forth.
- the interface 400 may also include a control 404 to activate a simulation model and/or a control 406 to perform a search using the specified visual attributes.
- Figure 5 shows a user interface 500 for a visually-oriented search system.
- a simulation 502 may be initiated and rendered within a window of the interface 500 for the selected type.
- the simulation may be personalized, such as by incorporating details of the individual for whom the clothing is being selected. This may include body measurements, hair color, gender, and any other aspects of appearance.
- the simulation may be interactive, so that a user can alter orientation, lighting, and so forth for the simulation model.
- the simulation may be animated so that, for example, the simulation displays the object (in this case, a person) in motion. This may include pre-defined or user-defined motions such as standing, sitting, walking, running, and so forth.
- the user interface may also present a number of visual attributes such as a sleeve type 504 and a collar type 506.
- Each attribute may have a number of possible values 508 represented by drawings, keywords, and the like. For convenience, a subset of possible values 508 may be initially depicted.
- Figure 5 illustrates the selection of various visual features of the selected clothing type.
- a number of sleeve types are presented, including a leg-of-mutton sleeve, a bishop sleeve, a short sleeve, and a kimono sleeve.
- a number of collar types are also presented, including a v-neck, a polo collar, a sweetheart top, and a 1976 neck.
- a "next" option or other user control to may be provided to control viewing of more or additional options.
- Figure 6 shows a user interface 600 for a visually-oriented search system.
- a number of visual attributes for clothing have been selected and refined to specify particular values.
- Search results for this selection using attributes extracted from the simulation 602 (or the selection process used to produce the simulation) may then be presented in a search result window 604.
- the display of search results may include, for example, price information, visually descriptive information, product images, and so forth.
- a user may select the product for incorporation into the simulation 602 using a "try it on" or other appropriate control.
- a user may apply one or more search results to the personalized model in the simulation 602 for a virtual fitting of the product.
- the interface may support purchase transactions using any suitable techniques known in the art. This may include a shopping cart or the like for gathering multiple items into a single purchase.
- a user may save a current product and any related visual attributes (or other attributes) to serve as a basis for additional searches.
- a user may identify an item of interest, and use this item as a basis to search for similar items either immediately or at some future time.
- Figure 7 shows a user interface 700 for a visually-oriented search system. This interface 700 shows another embodiment of the systems described herein as applied in a different context - home furnishing.
- a user may specify appliances and various aspects of appliances, and these selections may then be simulated within a personalized model of a user's kitchen. More generally, any room of a home may be simulated, and furnishings such as furniture, paint, carpeting, appliances, flooring, tiles, windows, and so on, may be incorporated into the simulation to aid in visual selection of desired products.
- a user interface for visually assisted searching that includes a first window that receives an incremental specification of a plurality of visual attributes for a product from a user to provide a specification for the product.
- a second window may display a simulation of the product according to the specification, and a control such as a button may initiate a search among a plurality of remote search engines for items having the plurality of visual attributes.
- a third window may display search results from the plurality of remote search engines.
- a questionnaire or the like may be used to gather user information and the user interface or related software may translate the user-specified simulation into queries for remote search engines.
- Figure 8 shows a high-level flow chart of a process 800 for simulation-assisted search. It will be understood that the systems described above include a user interface with numerous windows, each of which may be in various states for displaying and/or receiving information, any of which may depend on user inputs and the states of other ones of the windows. Thus Fig. 8 illustrates only one possible, representative series of steps as an example process incorporating many of the features described above, and should not be understood as limiting the systems and methods described herein.
- the process 800 may begin with receiving a description of a product as shown in step 802. This may include, for example, input received from a textual questionnaire, a visually-based questionnaire, or any of the other techniques noted above.
- the questions may be directed at locating responsive items, such as by identifying size, color, shape, ornamentation, and the like.
- various visual features may be presented and selected. For example, when searching for a shirt, a user may select from various lengths, materials, sizes, cuts, sleeve types, collar types, colors, buttons and/or laces, and so forth.
- the responses may be captured as a number of visual attributes for an object, which may be represented as attribute-value pairs as described above.
- a user may provide information such as body shape, measurements, height, weight, shoe size, and so forth.
- a user may provide information such as room dimensions, current appliances, window locations, floor type, and so on. This personalization information may be employed to provide context for search results, and to control the simulation as rendered for the user.
- the process 800 may generate a simulation, such as any of the simulations described above, of the object as specified by the user, which may be displayed in any suitable manner within a user interface or the like.
- This may include a three-dimensional simulation such as a human model wearing clothing specified by the inputs or a room of a house furnished according to user selections.
- the simulation may be based on personalization data and visual description data provided by the user, in combination with a library of pre-existing three-dimensional sub-entities and entities.
- An expert system may apply domain-specific knowledge, such as dressing rules for clothing to conform user inputs to existing clothing styles, fashions, and features.
- the simulation may account for surface textures, finishes, materials, lighting, and so forth.
- the simulation may be animated, such as by simulating a person walking while wearing user-selected clothing.
- the simulation may be rendered within a user interface for viewing by a user.
- as shown in step 806, additional description may be considered.
- a user may adjust model parameters in an iterative fashion (e.g., by repeating the steps above). If additional description is desired, the process 800 may return to step 802 where additional description is received. The user may incrementally describe an object in this fashion until a satisfactory model is derived for use in searching.
- the process 800 may proceed to step 808 where a search engine query is generated. While certain description herein refers to extraction of visual attributes from a simulation, it will be understood that this extraction may take a number of forms. For example, the extraction of visual attributes from a simulation may be based upon an analysis of attribute selections used to create the simulation, data associated with the simulation, or direct graphical inspection of the simulation, or some combination of these.
- the translation from a user-specified simulation to a textual query may employ any or all of the techniques outlined above including without limitation an application of domain-specific knowledge that might be derived from expert systems, dictionaries, thesauruses, semantic analysis, object definitions, and so forth.
- the process 800 may arrive at a search query suitable for presentation to one or more search engines.
- where multiple search engines are used, a number of corresponding queries may be devised according to each search engine's syntax and any constraints or enhanced features provided therein.
- the system may be deployed for use with one or more search engines available through the Internet, or for use with a proprietary search engine local to the search system, or some combination of these.
- a user may explicitly select one or more search engines for receipt of the query, such as by selecting search engines or categories of search engines in a check box user interface.
- the query may be submitted to one or more search engines, with results displayed within the interface as shown in step 810.
- the process 800 may provide a user with an opportunity to import a search result into the current simulation. Where this option is selected, the simulation with the new object may be rendered for the user as shown in step 814. As generally described above, this may include virtually trying on an article of clothing, adding an appliance to a simulation of a kitchen, or any other suitable import of an object into a simulation. If this option is not selected, the process 800 may return to step 802 where a description of a new object is received. This may include incremental changes to the current description or the initiation of an entirely new description.
- It will be appreciated that the various steps identified and described above may be varied, and that the order of steps may be changed to suit particular applications of the techniques disclosed herein.
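- As a schematic only, the loop of process 800 might be stubbed as below; every function is a placeholder standing in for the questionnaire, simulation, query generation, and search components described above, not an implementation from the patent.

```python
def collect_description(existing):
    """Stand-in for the questionnaire (step 802); a real system gathers user input here."""
    return {**existing, "object": "shirt", "collar": "polo", "color": "red"}

def render_simulation(description):
    """Stand-in for the three-dimensional simulation of the described object."""
    return f"simulation of {description}"

def generate_query(description):
    """Stand-in for attribute extraction plus search string generation (step 808)."""
    return " ".join(str(v) for v in description.values())

def search(query):
    """Stand-in for submission to one or more search engines (step 810)."""
    return [{"title": "Red polo shirt", "attributes": {"sleeve": "short sleeve"}}]

description, satisfied = {}, False
while not satisfied:                      # iterate description and simulation until acceptable
    description = collect_description(description)
    print(render_simulation(description))
    satisfied = True                      # a real system would ask the user (step 806)
results = search(generate_query(description))
print(results)
chosen = results[0]                       # user imports a result ("try it on")
print(render_simulation({**description, **chosen["attributes"]}))  # step 814
```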
- the user interface for creating a query may be enhanced with a graphical input which may receive an initial product description in the form of a digital photograph, a facsimile, a sketch created by a user with online drawing tools, a CAD model or other three-dimensional model, or other graphical or image-based input.
- This image may be analyzed using techniques known in the art to extract visual attributes that may be employed to prepare a search, or to pre-load any number of selection criteria for the iterative description techniques disclosed above.
- a user may take a photograph of an item, such as an article of clothing, with a device such as a cellular phone camera, and load this digital photograph into the system described above to assist the user in locating and purchasing the item or a similar item for personal use.
- a user may provide a graphical description, including any of the foregoing models or images, for use in initiating a simulation-assisted search.
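- One very rough way to seed a search from a photograph is to extract a dominant color and pre-load the "color" attribute, as sketched below; this assumes the Pillow imaging library and a local image file, and real systems would apply far richer image analysis.

```python
from collections import Counter
from PIL import Image  # Pillow, assumed to be installed

def dominant_color(path, size=(32, 32)):
    """Downscale the photo and return its most common (R, G, B) pixel value."""
    img = Image.open(path).convert("RGB").resize(size)
    return Counter(img.getdata()).most_common(1)[0][0]

# Example usage (assumes a photograph of an item of clothing exists locally):
# rgb = dominant_color("photo_of_shirt.jpg")
# print(rgb)  # the resulting RGB triple could then be mapped to color keywords
```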
- social networking techniques may be employed to develop and refine descriptive vocabulary.
- this approach permits evolution of descriptive terminology according to fashion trends, popular phraseology, and the like.
- objects such as products may be tagged with metadata derived from social networking sites. This process may be slightly constrained, perhaps productively so, through the use of a Wiki or the like specifically designed for user-created metadata.
- a visual description Wiki may, for example, provide an interface for adding new content. In this interface, a user may add a new visual feature through a photograph, a drawing, a CAD drawing or other three-dimensional model, a fabric pattern, or the like, along with one or more visual attributes and/or values that describe the new feature.
- the interface may itself also provide one or more drawing tools for direct input of the visual features.
- the interface may permit explicit specification of a full attribute description, or may support semi-automated attribute creation such as through user-provided identification of similar or related items.
- the Wiki may monitor usage of each new feature and/or description and provide quantitative or qualitative evaluations of adoption, popularity, and the like (either for use of a new feature in products or use of a new description for an existing feature).
- while the visual description interface may be available to all users, a secure interface may be provided through which authorized users can specify new products. This may include, for example, a board of editors or expert advisors in the relevant field, manufacturers, vendors, and the like. These users may directly specify terminology and visual specifications for immediate use by search engines and the like, and may provide any corresponding keywords, images, simulation models, and other related content. These users may also evaluate and edit content contributed by the general public or developed through the social networking techniques described above.
- Another interface, which may be public or non-public, may receive identification of new search engines. This interface may also permit the submission of information about search syntax, content, and so forth that may be used to incorporate the new search engine into the systems and methods described above.
- the user interface 110 described above may be enhanced with numerous features.
- the interface may provide a closet feature - a visual metaphor for storing clothing selections similar to an electronic shopping cart where a user can retrieve and simulate items in the closet.
- this virtual closet may subscribe to syndicated data feeds of new clothing products. The data feeds may be processed so that a user can receive computer-generated notifications when clothing having the features of a closet item is published on the data feeds. Notifications may also or instead be generated when a new product has a feature set that is similar to one or more of the closet items.
- using this closet metaphor, a user may specify clothes of interest according to visual attributes, and virtually shop for these items in a continuous manner by monitoring relevant data feeds.
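- A hedged sketch of feed matching for the closet follows; the feed item format, the attribute-overlap scoring, and the threshold are all assumptions used only to illustrate the notification idea.

```python
closet = [{"object": "dress", "collar": "sweetheart", "color": "red"}]

def matches(closet_item, feed_item, threshold=0.75):
    """Score a feed item by how many of the closet item's attributes it shares."""
    shared = sum(feed_item.get(k) == v for k, v in closet_item.items())
    return shared / len(closet_item) >= threshold

new_products = [
    {"object": "dress", "collar": "sweetheart", "color": "red", "price": 120},
    {"object": "shirt", "collar": "polo", "color": "blue"},
]

notifications = [p for p in new_products if any(matches(c, p) for c in closet)]
print([p["object"] for p in notifications])  # only the matching dress triggers a notification
```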
- the hardware may include a general purpose computer and/or dedicated computing device.
- the processes may be realized in one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors or other programmable device, along with internal and/or external memory.
- the processes may also, or instead, be embodied in an application specific integrated circuit, a programmable gate array, programmable array logic, or any other device that may be configured to process electronic signals.
- the process(es) may be realized as computer executable code created using a structured programming language such as C, an object oriented programming language such as C++, or any other high-level or low-level programming language (including assembly languages, hardware description languages, database programming languages, and so forth) that may be stored, compiled, or interpreted to run on one of the above devices, as well as heterogeneous combinations of processors, processor architectures, or combinations of different hardware and software.
- processing may be distributed across a number of computers and other devices, or all of the functionality may be integrated into a dedicated, standalone product selection or configuration device. All such permutations and combinations are intended to fall within the scope of the present disclosure.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Databases & Information Systems (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Library & Information Science (AREA)
- Software Systems (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA002652762A CA2652762A1 (en) | 2006-05-19 | 2007-05-21 | Simulation-assisted search |
BRPI0713114-3A BRPI0713114A2 (en) | 2006-05-19 | 2007-05-21 | simulation assisted search |
AU2007280092A AU2007280092A1 (en) | 2006-05-19 | 2007-05-21 | Simulation-assisted search |
EP07825352A EP2021962A2 (en) | 2006-05-19 | 2007-05-21 | Simulation-assisted search |
JP2009511609A JP2009545019A (en) | 2006-05-19 | 2007-05-21 | Simulation support type search |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US74775806P | 2006-05-19 | 2006-05-19 | |
US60/747,758 | 2006-05-19 | ||
US80495206P | 2006-06-16 | 2006-06-16 | |
US60/804,952 | 2006-06-16 |
Publications (3)
Publication Number | Publication Date |
---|---|
WO2008015571A2 true WO2008015571A2 (en) | 2008-02-07 |
WO2008015571A8 WO2008015571A8 (en) | 2008-10-02 |
WO2008015571A3 WO2008015571A3 (en) | 2011-02-24 |
Family
ID=38997532
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2007/003047 WO2008015571A2 (en) | 2006-05-19 | 2007-05-21 | Simulation-assisted search |
Country Status (8)
Country | Link |
---|---|
US (2) | US20080097975A1 (en) |
EP (1) | EP2021962A2 (en) |
JP (1) | JP2009545019A (en) |
KR (1) | KR20090028713A (en) |
AU (1) | AU2007280092A1 (en) |
BR (1) | BRPI0713114A2 (en) |
CA (1) | CA2652762A1 (en) |
WO (1) | WO2008015571A2 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
IT201800007812A1 (en) * | 2018-08-03 | 2020-02-03 | Else Corp Srl | A 3D visual search and AI-based recommendation system |
US10614602B2 (en) | 2011-12-29 | 2020-04-07 | Ebay Inc. | Personal augmented reality |
US10628877B2 (en) | 2011-10-27 | 2020-04-21 | Ebay Inc. | System and method for visualization of items in an environment using augmented reality |
EP3916678A4 (en) * | 2019-01-28 | 2022-04-27 | Samsung Electronics Co., Ltd. | Electronic device and graphic object control method of electronic device |
Families Citing this family (68)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8630984B1 (en) | 2003-01-17 | 2014-01-14 | Renew Data Corp. | System and method for data extraction from email files |
US8065277B1 (en) | 2003-01-17 | 2011-11-22 | Daniel John Gardner | System and method for a data extraction and backup database |
US8943024B1 (en) | 2003-01-17 | 2015-01-27 | Daniel John Gardner | System and method for data de-duplication |
US8375008B1 (en) | 2003-01-17 | 2013-02-12 | Robert Gomes | Method and system for enterprise-wide retention of digital or electronic data |
US8069151B1 (en) | 2004-12-08 | 2011-11-29 | Chris Crafford | System and method for detecting incongruous or incorrect media in a data recovery process |
US8527468B1 (en) | 2005-02-08 | 2013-09-03 | Renew Data Corp. | System and method for management of retention periods for content in a computing system |
NZ569107A (en) | 2005-11-16 | 2011-09-30 | Evri Inc | Extending keyword searching to syntactically and semantically annotated data |
US7885947B2 (en) * | 2006-05-31 | 2011-02-08 | International Business Machines Corporation | Method, system and computer program for discovering inventory information with dynamic selection of available providers |
US8150827B2 (en) * | 2006-06-07 | 2012-04-03 | Renew Data Corp. | Methods for enhancing efficiency and cost effectiveness of first pass review of documents |
CA2717462C (en) | 2007-03-14 | 2016-09-27 | Evri Inc. | Query templates and labeled search tip system, methods, and techniques |
US9081852B2 (en) * | 2007-10-05 | 2015-07-14 | Fujitsu Limited | Recommending terms to specify ontology space |
US8280892B2 (en) | 2007-10-05 | 2012-10-02 | Fujitsu Limited | Selecting tags for a document by analyzing paragraphs of the document |
US8594996B2 (en) | 2007-10-17 | 2013-11-26 | Evri Inc. | NLP-based entity recognition and disambiguation |
US8700604B2 (en) * | 2007-10-17 | 2014-04-15 | Evri, Inc. | NLP-based content recommender |
JP5042787B2 (en) * | 2007-11-20 | 2012-10-03 | 富士フイルム株式会社 | Product search system, product search method and product search program |
US8615490B1 (en) | 2008-01-31 | 2013-12-24 | Renew Data Corp. | Method and system for restoring information from backup storage media |
US10460085B2 (en) | 2008-03-13 | 2019-10-29 | Mattel, Inc. | Tablet computer |
US8200649B2 (en) * | 2008-05-13 | 2012-06-12 | Enpulz, Llc | Image search engine using context screening parameters |
US20090310187A1 (en) * | 2008-06-12 | 2009-12-17 | Harris Scott C | Face Simulation in Networking |
TW201013430A (en) * | 2008-09-17 | 2010-04-01 | Ibm | Method and system for providing suggested tags associated with a target page for manipulation by a user |
CA2681697A1 (en) * | 2008-10-09 | 2010-04-09 | Retail Royalty Company | Methods and systems for online shopping |
US8914397B2 (en) * | 2008-12-04 | 2014-12-16 | Microsoft Corporation | Rich-context tagging of resources |
US20100169376A1 (en) * | 2008-12-29 | 2010-07-01 | Yahoo! Inc. | Visual search engine for personal dating |
US20100185525A1 (en) * | 2009-01-16 | 2010-07-22 | John Hazen | Controlling presentation of purchasing information based on item availability |
US10191982B1 (en) * | 2009-01-23 | 2019-01-29 | Zakata, LLC | Topical search portal |
WO2010096763A1 (en) * | 2009-02-20 | 2010-08-26 | Fuhu, Inc. | System and method for defined searching and web crawling |
US20100217867A1 (en) * | 2009-02-25 | 2010-08-26 | International Business Machines Corporation | System and method for creating and using service dependency graphs to automate the development and deployment of service oriented applications |
US8370336B2 (en) | 2009-04-08 | 2013-02-05 | Ebay Inc. | Methods and systems for deriving demand metrics used in ordering item listings presented in a search results page |
US8341241B2 (en) * | 2009-04-14 | 2012-12-25 | At&T Intellectual Property I, L.P. | Method and apparatus for presenting media content |
WO2010120699A2 (en) * | 2009-04-16 | 2010-10-21 | Evri Inc. | Enhanced advertisement targeting |
WO2010135746A1 (en) * | 2009-05-22 | 2010-11-25 | Facebook, Inc. | Unified online conversation application and platform |
CA2779208C (en) * | 2009-10-30 | 2016-03-22 | Evri, Inc. | Improving keyword-based search engine results using enhanced query strategies |
US20110145269A1 (en) * | 2009-12-09 | 2011-06-16 | Renew Data Corp. | System and method for quickly determining a subset of irrelevant data from large data content |
WO2011075610A1 (en) | 2009-12-16 | 2011-06-23 | Renew Data Corp. | System and method for creating a de-duplicated data set |
US9639880B2 (en) * | 2009-12-17 | 2017-05-02 | Google Inc. | Photorealistic recommendation of clothing and apparel based on detected web browser input and content tag analysis |
US20110213679A1 (en) * | 2010-02-26 | 2011-09-01 | Ebay Inc. | Multi-quantity fixed price referral systems and methods |
US9710556B2 (en) | 2010-03-01 | 2017-07-18 | Vcvc Iii Llc | Content recommendation based on collections of entities |
US8645125B2 (en) | 2010-03-30 | 2014-02-04 | Evri, Inc. | NLP-based systems and methods for providing quotations |
US20110257941A1 (en) * | 2010-04-19 | 2011-10-20 | Sebastian Magro | System and automated method for creating drawings online for product manufacturing |
US8838633B2 (en) | 2010-08-11 | 2014-09-16 | Vcvc Iii Llc | NLP-based sentiment analysis |
US20120047025A1 (en) | 2010-08-19 | 2012-02-23 | Google Inc. | Query stem advertising |
US9405848B2 (en) | 2010-09-15 | 2016-08-02 | Vcvc Iii Llc | Recommending mobile device activities |
US8725739B2 (en) | 2010-11-01 | 2014-05-13 | Evri, Inc. | Category-based content recommendation |
US9589053B1 (en) * | 2010-12-17 | 2017-03-07 | The Boeing Company | Method and apparatus for constructing a query based upon concepts associated with one or more search terms |
US8566325B1 (en) * | 2010-12-23 | 2013-10-22 | Google Inc. | Building search by contents |
US9116995B2 (en) | 2011-03-30 | 2015-08-25 | Vcvc Iii Llc | Cluster-based identification of news stories |
US8949212B1 (en) * | 2011-07-08 | 2015-02-03 | Hariharan Dhandapani | Location-based informaton display |
EP2549389A1 (en) * | 2011-07-20 | 2013-01-23 | Axel Springer Digital TV Guide GmbH | Easy 2D navigation in a video database |
TWI627084B (en) * | 2011-09-19 | 2018-06-21 | 塔塔顧問服務有限公司 | A computing platform for development and deployment of sensor-driven vehicle telemetry applications and services |
US9183280B2 (en) * | 2011-09-30 | 2015-11-10 | Paypal, Inc. | Methods and systems using demand metrics for presenting aspects for item listings presented in a search results page |
US9292603B2 (en) * | 2011-09-30 | 2016-03-22 | Nuance Communications, Inc. | Receipt and processing of user-specified queries |
US8908962B2 (en) | 2011-09-30 | 2014-12-09 | Ebay Inc. | Item recommendations using image feature data |
US8589410B2 (en) * | 2011-10-18 | 2013-11-19 | Microsoft Corporation | Visual search using multiple visual input modalities |
US10002164B2 (en) * | 2012-06-01 | 2018-06-19 | Ansys, Inc. | Systems and methods for context based search of simulation objects |
US10031978B1 (en) | 2012-06-29 | 2018-07-24 | Open Text Corporation | Methods and systems for providing a search service application |
US10157229B1 (en) | 2012-06-29 | 2018-12-18 | Open Text Corporation | Methods and systems for building a search service application |
CN103678335B (en) * | 2012-09-05 | 2017-12-08 | 阿里巴巴集团控股有限公司 | The method of method, apparatus and the commodity navigation of commodity sign label |
US9460208B2 (en) | 2012-10-04 | 2016-10-04 | Hubub, Inc. | Publication and interactive discussion engine driven by user-specified topic |
US9443016B2 (en) | 2013-02-08 | 2016-09-13 | Verbify Inc. | System and method for generating and interacting with a contextual search stream |
EP2973040A1 (en) | 2013-03-15 | 2016-01-20 | NIKE Innovate C.V. | Product presentation assisted by visual search |
KR101338895B1 (en) | 2013-06-24 | 2013-12-09 | 한국과학기술정보연구원 | System and method for binding simulation program |
US10896187B2 (en) * | 2015-07-14 | 2021-01-19 | Conduent Business Services, Llc | Methods and systems for searching for users |
US10832305B1 (en) * | 2015-08-26 | 2020-11-10 | Pcsso Inc. | System and method for image processing and searching for classification in a product database |
US10776861B1 (en) | 2017-04-27 | 2020-09-15 | Amazon Technologies, Inc. | Displaying garments on 3D models of customers |
US10897539B1 (en) | 2019-10-31 | 2021-01-19 | Talkdesk Inc. | Method for visual-based programming of self-service workflow |
US20210390430A1 (en) * | 2020-06-12 | 2021-12-16 | Rokkcb10, Inc. | Machine Learning System and Method for Garment Recommendation |
WO2023039193A1 (en) * | 2021-09-09 | 2023-03-16 | Motional Ad Llc | Search algorithms and safety verification for compliant domain volumes |
US11893847B1 (en) | 2022-09-23 | 2024-02-06 | Amazon Technologies, Inc. | Delivering items to evaluation rooms while maintaining customer privacy |
Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5911139A (en) * | 1996-03-29 | 1999-06-08 | Virage, Inc. | Visual image database search engine which allows for different schema |
US6175829B1 (en) * | 1998-04-22 | 2001-01-16 | Nec Usa, Inc. | Method and apparatus for facilitating query reformulation |
US20020087426A1 (en) * | 2000-12-28 | 2002-07-04 | Fujitsu Limited | Online shopping method and system |
US20020184111A1 (en) * | 2001-02-07 | 2002-12-05 | Exalt Solutions, Inc. | Intelligent multimedia e-catalog |
US20030018607A1 (en) * | 2000-08-04 | 2003-01-23 | Lennon Alison Joan | Method of enabling browse and search access to electronically-accessible multimedia databases |
US6535888B1 (en) * | 2000-07-19 | 2003-03-18 | Oxelis, Inc. | Method and system for providing a visual search directory |
US20030076318A1 (en) * | 2001-10-19 | 2003-04-24 | Ar Card | Method of virtual garment fitting, selection, and processing |
US20040019536A1 (en) * | 2002-07-23 | 2004-01-29 | Amir Ashkenazi | Systems and methods for facilitating internet shopping |
US20040039663A1 (en) * | 1999-02-26 | 2004-02-26 | Kernz James J. | Integrated market exchange system, apparatus and method facilitating trade in graded encapsulated objects |
US6847980B1 (en) * | 1999-07-03 | 2005-01-25 | Ana B. Benitez | Fundamental entity-relationship models for the generic audio visual data signal description |
US6912293B1 (en) * | 1998-06-26 | 2005-06-28 | Carl P. Korobkin | Photogrammetry engine for model construction |
US20050162419A1 (en) * | 2002-03-26 | 2005-07-28 | Kim So W. | System and method for 3-dimension simulation of glasses |
US20060002607A1 (en) * | 2000-11-06 | 2006-01-05 | Evryx Technologies, Inc. | Use of image-derived information as search criteria for internet and other search engines |
US20060050993A1 (en) * | 2002-12-19 | 2006-03-09 | Stentiford Frederick W | Searching images |
US20060095345A1 (en) * | 2004-10-28 | 2006-05-04 | Microsoft Corporation | System and method for an online catalog system having integrated search and browse capability |
US20060116994A1 (en) * | 2004-11-30 | 2006-06-01 | Oculus Info Inc. | System and method for interactive multi-dimensional visual representation of information content and properties |
US20070130020A1 (en) * | 2005-12-01 | 2007-06-07 | Paolini Michael A | Consumer representation rendering with selected merchandise |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7287214B1 (en) * | 1999-12-10 | 2007-10-23 | Books24X7.Com, Inc. | System and method for providing a searchable library of electronic documents to a user |
US7043474B2 (en) * | 2002-04-15 | 2006-05-09 | International Business Machines Corporation | System and method for measuring image similarity based on semantic meaning |
US7437439B2 (en) * | 2002-12-03 | 2008-10-14 | Hewlett-Packard Development Company, L.P. | System and method for the hybrid harvesting of information from peripheral devices |
US7555691B2 (en) * | 2003-05-22 | 2009-06-30 | At&T Intellectual Property, Ii, L.P. | Apparatus and method for providing near-optimal representations over redundant dictionaries |
US7437358B2 (en) * | 2004-06-25 | 2008-10-14 | Apple Inc. | Methods and systems for managing data |
AU2005277506C1 (en) * | 2004-08-23 | 2011-03-31 | Lexisnexis, A Division Of Reed Elsevier Inc. | Point of law search system and method |
- 2007
- 2007-05-21 EP EP07825352A patent/EP2021962A2/en not_active Withdrawn
- 2007-05-21 JP JP2009511609A patent/JP2009545019A/en active Pending
- 2007-05-21 US US11/751,485 patent/US20080097975A1/en not_active Abandoned
- 2007-05-21 CA CA002652762A patent/CA2652762A1/en not_active Abandoned
- 2007-05-21 WO PCT/IB2007/003047 patent/WO2008015571A2/en active Application Filing
- 2007-05-21 BR BRPI0713114-3A patent/BRPI0713114A2/en not_active IP Right Cessation
- 2007-05-21 KR KR1020087030675A patent/KR20090028713A/en not_active Application Discontinuation
- 2007-05-21 AU AU2007280092A patent/AU2007280092A1/en not_active Abandoned
- 2011
- 2011-03-14 US US13/047,321 patent/US20120072405A1/en not_active Abandoned
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10628877B2 (en) | 2011-10-27 | 2020-04-21 | Ebay Inc. | System and method for visualization of items in an environment using augmented reality |
US11113755B2 (en) | 2011-10-27 | 2021-09-07 | Ebay Inc. | System and method for visualization of items in an environment using augmented reality |
US11475509B2 (en) | 2011-10-27 | 2022-10-18 | Ebay Inc. | System and method for visualization of items in an environment using augmented reality |
US10614602B2 (en) | 2011-12-29 | 2020-04-07 | Ebay Inc. | Personal augmented reality |
IT201800007812A1 (en) * | 2018-08-03 | 2020-02-03 | Else Corp Srl | A 3D visual search and AI-based recommendation system |
EP3916678A4 (en) * | 2019-01-28 | 2022-04-27 | Samsung Electronics Co., Ltd. | Electronic device and graphic object control method of electronic device |
US12112444B2 (en) | 2019-01-28 | 2024-10-08 | Samsung Electronics Co., Ltd | Electronic device and graphic object control method of electronic device |
Also Published As
Publication number | Publication date |
---|---|
AU2007280092A1 (en) | 2008-02-07 |
JP2009545019A (en) | 2009-12-17 |
WO2008015571A3 (en) | 2011-02-24 |
EP2021962A2 (en) | 2009-02-11 |
CA2652762A1 (en) | 2008-02-07 |
KR20090028713A (en) | 2009-03-19 |
US20080097975A1 (en) | 2008-04-24 |
WO2008015571A8 (en) | 2008-10-02 |
BRPI0713114A2 (en) | 2012-04-17 |
US20120072405A1 (en) | 2012-03-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20120072405A1 (en) | Simulation-assisted search | |
US20190073710A1 (en) | Color based social networking recommendations | |
US10580057B2 (en) | Photorealistic recommendation of clothing and apparel based on detected web browser input and content tag analysis | |
US20220058715A1 (en) | Method and system for search refinement | |
US11037222B1 (en) | Dynamic recommendations personalized by historical data | |
US20200342320A1 (en) | Non-binary gender filter | |
US20180158128A1 (en) | Automatic color palette based recommendations for affiliated colors | |
US20180181569A1 (en) | Visual category representation with diverse ranking | |
US9697573B1 (en) | Color-related social networking recommendations using affiliated colors | |
US10776417B1 (en) | Parts-based visual similarity search | |
US20180276727A1 (en) | System and Method for Automated Retrieval of Apparel Items and the Like | |
US20100313141A1 (en) | System and Method for Learning User Genres and Styles and for Matching Products to User Preferences | |
CA2764056A1 (en) | System and method for learning user genres and styles and matching products to user preferences | |
Al-Lohibi et al. | Awjedni: a reverse-image-search application | |
US10762548B1 (en) | Digital data processing methods and apparatus for personalized user interface generation through personalized sorting | |
KR102043440B1 (en) | Method and system for coordination searching based on coordination of a plurality of objects in image | |
US11238515B1 (en) | Systems and method for visual search with attribute manipulation | |
US20150058172A1 (en) | Systems and Methods for Searching for Items of Fashion and other Items of Creation | |
US11972466B2 (en) | Computer storage media, method, and system for exploring and recommending matching products across categories | |
US11488223B1 (en) | Modification of user interface based on dynamically-ranked product attributes | |
KR20200115044A (en) | Identifying physical objects using visual search query | |
JP2019057255A (en) | Customer ornament matching system | |
WO2020046795A1 (en) | System, method, and computer program product for determining compatibility between items in images | |
Shamoi et al. | Apparel online shop reflecting customer perception | |
CN114220433B (en) | Shoe fitting method and shoe cabinet |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 2652762; Country of ref document: CA. Ref document number: 2007280092; Country of ref document: AU. Ref document number: 2009511609; Country of ref document: JP |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 2007825352; Country of ref document: EP |
| ENP | Entry into the national phase | Ref document number: 2007280092; Country of ref document: AU; Date of ref document: 20070521; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: 1020087030675; Country of ref document: KR |
| ENP | Entry into the national phase | Ref document number: PI0713114; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20081119 |