US20200273069A1 - Generating Keyword Lists Related to Topics Represented by an Array of Topic Records, for Use in Targeting Online Advertisements and Other Uses - Google Patents

Generating Keyword Lists Related to Topics Represented by an Array of Topic Records, for Use in Targeting Online Advertisements and Other Uses

Info

Publication number
US20200273069A1
US20200273069A1 (application US16/803,214)
Authority
US
United States
Prior art keywords
topic
array
vectors
keywords
computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US16/803,214
Inventor
Leon Palaic
Markus Hans Gross
Sasha Anna Schriber
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanocorp AG
Original Assignee
Nanocorp AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanocorp AG filed Critical Nanocorp AG
Priority to US16/803,214
Publication of US20200273069A1
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0251Targeted advertisements
    • G06Q30/0254Targeted advertisements based on statistics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/245Query processing
    • G06F16/2457Query processing with adaptation to user needs
    • G06F16/24578Query processing with adaptation to user needs using ranking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/20Natural language analysis
    • G06F40/279Recognition of textual entities
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/30Semantic analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201Market modelling; Market analysis; Collecting market data
    • G06Q30/0203Market surveys; Market polls
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0242Determining effectiveness of advertisements
    • G06Q30/0244Optimization
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0277Online advertisement

Abstract

A computer-implemented method of generating, from an array of topic records, an output array of keywords for use in targeting online advertisements related to topics represented by the array of topic records, includes obtaining the array of topic records in computer-readable form, determining a relevancy value for words and topics, classifying the topic vectors in the plurality of topic vectors into a high-volume class or a low-volume class, generating and storing an array of seed keywords derived by sampling from an embedded space, ranking the seed keywords to form an array of ranked keywords, updating the array of ranked keywords based on keyword cost-per-click data, iterating the ranking and updating at least once, and evaluating an optimization improvement value for an iteration.

Description

    FIELD OF THE INVENTION
  • The present disclosure generally relates to computer systems that process semantic data. The disclosure relates more particularly to apparatus and techniques for processing data related to topic records wherein topic records represent meaning to users and outputting data relating to weighted, filtered, and/or sorted lists of keywords usable as inputs to an online advertising system.
  • BACKGROUND
  • Online advertising can be a useful method of advertising, if the right advertisements reach the right consumers or potential consumers. With rapidly changing interests, many different products/services to offer, and competing advertisers, achieving favorable results from an advertising campaign can be difficult. In a simple implementation, a marketing campaign manager inputs desired keywords, considers bids and offers to place advertisements based on those keywords and then places advertisements.
  • Selection of right and relevant keywords for specific business domain is an important task when it comes to boosting the performance of an advertising campaign, especially when the costs are a function of the number of viewers who react to an advertisement, such as clicking on a displayed advertisement (the pay-per-click model). Many marketing specialists base their keyword selection on the exploration of large keyword databases, which can be a time-consuming process. One drawback to that approach is that it requires considerable human intervention and thus is often only available to larger organizations.
  • SUMMARY
  • A semantic processor is programmed to evaluate a topic data structure using a topic module, keyword module, and the optimization module in a process of keyword selection for keywords to be used as an input to a keyword-based online advertising purchasing computer system, as well as other purposes. In some embodiments, the semantic processor can derive keyword sets in a fully automated way without human intervention and without need for time-consuming processes of exploring large keyword databases.
  • A computer-implemented method of generating, from an array of topic records, an output array of keywords for use in targeting online advertisements related to topics represented by the array of topic records, includes obtaining the array of topic records in computer-readable form, determining a relevancy value for words and topics, classifying the topic vectors in the plurality of topic vectors into a high-volume class or a low-volume class, generating and storing an array of seed keywords derived by sampling from an embedded space, ranking the seed keywords to form an array of ranked keywords, updating the array of ranked keywords based on keyword cost-per-click data, iterating the ranking and updating at least once, evaluating an optimization improvement value for an iteration, and, when either the optimization improvement value for the iteration is below a pre-determined threshold or a maximum time limit for the optimization is reached, generating the output array of keywords from the array of ranked keywords.
  • Classifying the topic vectors in the plurality of topic vectors into a high-volume class might comprise a paragraph vector model step, and classifying the topic vectors in the plurality of topic vectors into a low-volume class might comprise a PMI-SVD model step. Obtaining the array of topic records in computer-readable form might comprise sending user interface data representing a user interface to a user device, obtaining a user reply from the user device, and generating the array of topic records from the user reply. Determining the relevancy value for the word and topic might comprise computing a cosine distance between the word vector and the topic vector.
  • These operations might be performed by executing executable instructions stored in a non-transitory computer-readable storage medium that, when executed by one or more processors of a computer system, cause the computer system to perform those operations.
  • The following detailed description together with the accompanying drawings will provide a better understanding of the nature and advantages of the present invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Various embodiments in accordance with the present disclosure will be described with reference to the drawings, in which:
  • FIG. 1 illustrates an overview of how a semantic processor might be used to generate a keyword set from a topics list, including a topic module, a ranker module and a keyword suggester module.
  • FIG. 2 illustrates internal logic of the topic module of FIG. 1.
  • FIG. 3 illustrates a paragraph vector model.
  • FIG. 4 illustrates an example of a process that a semantic processor might use to implement a selection module.
  • FIG. 5 illustrates an example of a keyword list data structure.
  • FIG. 6 is a block diagram of various components, including inputs from human users and computer processes.
  • FIG. 7 is a block diagram that illustrates a computer system upon which an embodiment of the invention may be implemented.
  • FIG. 8 is a block diagram of an example of memory structures as might be used to implement functions described herein.
  • DETAILED DESCRIPTION
  • In the following description, various embodiments will be described. For purposes of explanation, specific configurations and details are set forth in order to provide a thorough understanding of the embodiments. However, it will also be apparent to one skilled in the art that the embodiments may be practiced without the specific details. Furthermore, well-known features may be omitted or simplified in order not to obscure the embodiment being described.
  • According to one embodiment, the techniques described herein are implemented by one or more generalized computing systems programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Special-purpose computing devices may be used, such as desktop computer systems, portable computer systems, handheld devices, networking devices or any other device that incorporates hard-wired and/or program logic to implement the techniques.
  • In this description, an embodiment might have data structures for input, output, or processing, such as data structures for topics, keywords, campaigns, and other data being handled. Some of these data structures might be storable as simple strings, or as arrays of strings or other data elements.
  • A topic data element can represent concepts that can be expressed by user input. For example, topics might relate to interests, descriptions of business purposes, general themes, products, locations, times of day, types of business, keywords, and other expressions. Topics might be used to describe a target customer group or a product group of an enterprise.
  • An array of topic records can be a storable and readable representation, digital or otherwise, representing a plurality of topic data elements. Arrays of topic records might be mutable and can be inputs or outputs of different processes that are performed as described herein.
  • An embodiment might also have keyword data structures, wherein a keyword data structure represents some standardized representation of a targeting scope of a campaign, expressed in a natural or in another language. These representations include, but are not limited to, actual keywords (e.g., keywords that form inputs to the Google advertising platform or another keyword-based advertising platform where advertising is purchased or presented based on selected keywords), a selection of user interests on the Facebook advertising platform (or other social media platforms that filter or present content based on data structures representing user interests), an encoding of a distribution of targets in terms of single users or user groups, and embeddings in a learned or predefined targeting space.
  • Arrays of keyword records might be stored as readable data representing keyword entities, might be mutable, and might be inputs or outputs of different processes that are performed as described herein.
  • Users of the systems described herein might be humans that are interacting with the systems, such as business owners who make use of the system to generate campaigns, and potential customers who are targeted by those campaigns in an online setting, as well as those maintaining and managing such systems. A marketer might be a human who interacts with a backend portion of the system. These interactions might include manual optimization of different campaign parameters, making decisions about future steps in the campaign's lifecycle, and adding knowledge into the system to inform future decisions. A campaign data structure is a representation of data that is in part generated and in part based on user input that describes a targeted online advertising process. As an example, campaign data might describe aspects of a targeted online advertising process over a lifetime of a campaign. A campaign might encompass different parameters such as topics, keywords, budget allocations, and visual elements. A campaign might be mutable over its lifetime and might be optimized before, during, and/or after its run by adjusting the parameters of the campaign. Campaign data structures can be compared to detect similarities between campaigns and inform subsequent decisions. A minimal sketch of these data structures follows.
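  • As an illustration only, the topic, keyword, and campaign data structures described above might be represented as simple record types. The following Python sketch is a non-authoritative assumption; the class and field names (TopicRecord, KeywordRecord, Campaign, cpc, normalized_search_volume) are hypothetical and not taken from the disclosure.

```python
# Hypothetical record types approximating the data structures described above.
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class TopicRecord:
    text: str                                 # e.g., "Greek cuisine" or "I am a baker"
    vector: Optional[List[float]] = None      # topic embedding, filled in by the topic module

@dataclass
class KeywordRecord:
    text: str
    cpc: Optional[float] = None               # cost-per-click, from a third-party source
    normalized_search_volume: Optional[float] = None
    score: Optional[float] = None             # keyword score (see Eqn. 9)

@dataclass
class Campaign:
    topics: List[TopicRecord] = field(default_factory=list)
    keywords: List[KeywordRecord] = field(default_factory=list)
    budget: Optional[float] = None
    filters: Dict[str, str] = field(default_factory=dict)  # e.g., age range, location, language
```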
  • FIG. 1 illustrates how a semantic processor might derive keyword sets from a given topic list. The topic list is obtained from a user or other source and formed into a topic list data structure. The topic list data structure is passed to the topic module. A task of the topic module is to retrieve the most relevant and semantically closest words for the inputted list. The retrieved list will likely represent the most relevant words that revolve around the passed topics. In this process, these words are called "seed keywords." Topics may refer to categories more generally, such as an area of business of a user. For example, for a baker, the category of interest might be "I am a baker" or the like.
  • These seed keywords are passed to a selection module, which can be an iterative process for deriving new keyword ideas and selecting the most relevant ones for the inputted topic list. For that purpose, the selection module uses two parts: a ranker and a keyword suggester. The keyword suggester is responsible for deriving new keyword ideas, and the ranker can be seen as a procedure that ranks and filters ideas that are coming from the keyword module. The selection module then iterates between ranking and deriving new keyword ideas until the keyword set reaches a desired performance threshold. Each module will now be described.
  • A keyword selection process might start with seed keywords, which represent core products or services an entity might want to advertise in their PPC campaigns. A main task of the topic module is to derive relevant seed keywords from the inputted topic list so that the selection module can narrow them down to niche-specific keywords. For these purposes, the topics and seed keywords can be modeled as vector representations for determining the relevancy of every seed keyword inside each topic and retrieving the most relevant ones for the selection module. The relevancy of each word inside a given topic might be determined by a cosine distance, computed as in Equation 1.
  • $\cos(t, w) = \dfrac{t \cdot w}{\lVert t \rVert \, \lVert w \rVert}$  (Eqn. 1)
  • In Equation 1, w represents a word vector and t represents a topic vector. Words with the lowest cosine distance to a given set of topics, or to a single topic, are used as seed keywords for the selection module, as sketched below. In order to achieve quality embeddings, the semantic processor might use two embedding methods, selected by classifying topics based on the available amount of training data, with one method for low-volume topics and another method for high-volume topics.
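  • A minimal sketch of this seed-keyword step, assuming the word and topic vectors are available as NumPy arrays, is shown below; the helper names and the `top_k` parameter are illustrative assumptions.

```python
# Rank candidate words by cosine distance to a topic vector (Eqn. 1) and keep
# the closest ones as seed keywords.
import numpy as np

def cosine_distance(t: np.ndarray, w: np.ndarray) -> float:
    # 1 - cos(t, w); smaller means the word is more relevant to the topic
    return 1.0 - float(np.dot(t, w) / (np.linalg.norm(t) * np.linalg.norm(w)))

def select_seed_keywords(topic_vec: np.ndarray,
                         word_vectors: dict,        # word -> embedding vector
                         top_k: int = 50) -> list:
    scored = [(word, cosine_distance(topic_vec, vec))
              for word, vec in word_vectors.items()]
    scored.sort(key=lambda pair: pair[1])           # lowest distance first
    return [word for word, _ in scored[:top_k]]
```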
  • For low volume topics, the semantic processor uses a PMI-SVD method and for high volume topics, the semantic processor uses a Paragraph Vector model method.
  • The topic module can be constructed from simple handcrafted heuristic processes based on the amount of data available for each topic, and the semantic processor selects appropriate embeddings for deriving seed keywords.
  • The word vectors might be obtained by training the semantic processor with corpuses of text from real world language and/or keyword sets from previous campaigns. The training might be distinct for distinct topic categories. The mapping could be from an initial category to all possible mappings that could be embedded using natural language processing, such as interests, keywords, behaviors, etc.
  • In turn, seed keywords might be derived by sampling from an embedded space based on selected topics, sampling the best keyword candidates from the space. The embedded space might be obtained by training vector models with large corpuses of natural language data and from previous keyword sets of campaigns.
  • Data for the update criteria might be keyword cost per click data and the search volume of keywords. These metrics can be obtained from third-party sources that track this data over time. This could be modeled as a mapping of an input array of generic topic records to both keywords and interests.
  • Filters might be applied, such as target customer age ranges, customer location, and/or customer language.
  • FIG. 2 illustrates internal logic of the topic module of FIG. 1, including a pipeline of the topic module. The topic module first classifies topics as low-volume and high-volume topics. Based on their classification, different types of embeddings are used, as sketched below. The semantic processor can be programmed to implement the various computations represented by the equations herein.
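  • The dispatch logic of this pipeline might look like the following sketch; the volume threshold and function signatures are assumptions, and the two embedding callables stand in for the PMI-SVD and Paragraph Vector methods described below.

```python
# Classify topics by available training data and route each class to a
# different embedding method (hypothetical threshold and signatures).
def embed_topics(topic_documents: dict,      # topic name -> list of documents
                 embed_high_volume,          # callable for high-volume topics
                 embed_low_volume,           # callable for low-volume topics
                 threshold: int = 10_000):
    high = {t: d for t, d in topic_documents.items() if len(d) >= threshold}
    low = {t: d for t, d in topic_documents.items() if len(d) < threshold}

    embeddings = {}
    embeddings.update(embed_high_volume(high))   # Paragraph Vector path
    embeddings.update(embed_low_volume(low))     # PMI-SVD path
    return embeddings
```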
  • The PMI-SVD method relies on a co-occurrence-based word association measure, point-wise mutual information (PMI), and a factorization technique, singular value decomposition (SVD). First, the semantic processor creates a shared vocabulary, Vshared, as shown in Equation 2.

  • $V_{\text{shared}} = \bigcup_{i=1}^{N} V_{\text{topic}_i}$  (Eqn. 2)
  • In Equation 2, N denotes the number of topics and Vtopic-i denotes the vocabulary of the i-th topic. Each individual topic vocabulary is created by taking the top T words ordered by their frequency.
  • Topic data sets are tokenized with vocabulary Vshared. Tokenized datasets are used in the semantic processor's PMI-SVD process to compose PMI matrices for each and every low-volume topic, resulting in P={Ptopic1, . . . , Ptopicn} PMI matrices. Each entry in a PMI matrix represents a PMI value of word x and word y calculated from context window c of topic i.
  • The PMI value can be calculated as in Equation 3. The semantic processor might reduce the dimensionality of each PMI matrix with SVD in order to yield seed keyword vectors of size 100. Topic vectors are an average of all seed keyword vectors of a topic, calculated as in Equation 4.
  • $\text{PMI}^{\,\text{topic}_i}_{x,y} = \log \dfrac{p(x, y)}{p(x)\,p(y)}$  (Eqn. 3)
  • $\text{topic}_i = \dfrac{1}{\lvert P_{\text{topic}_i} \rvert} \sum_{j=1}^{\lvert P_{\text{topic}_i} \rvert} \text{seed keyword}_j$  (Eqn. 4)
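  • A condensed sketch of the PMI-SVD path might look as follows, under the assumptions that unseen word pairs are left at zero and that the context window size is a free parameter; names other than the 100-dimensional target size are illustrative.

```python
# Build a co-occurrence PMI matrix for one topic (Eqn. 3), reduce it with SVD,
# and average the resulting seed-keyword vectors into a topic vector (Eqn. 4).
from collections import Counter
import numpy as np

def pmi_svd_topic_vectors(tokenized_docs, vocab, window=5, dim=100):
    index = {w: i for i, w in enumerate(vocab)}
    counts = np.zeros((len(vocab), len(vocab)))
    word_counts = Counter()
    total_pairs = 0
    for doc in tokenized_docs:
        doc = [w for w in doc if w in index]
        word_counts.update(doc)
        for i, w in enumerate(doc):
            for c in doc[max(0, i - window): i + window + 1]:
                if c != w:
                    counts[index[w], index[c]] += 1
                    total_pairs += 1
    total_words = sum(word_counts.values())
    pmi = np.zeros_like(counts)
    for w, i in index.items():
        for c, j in index.items():
            if counts[i, j] > 0:
                p_xy = counts[i, j] / total_pairs
                pmi[i, j] = np.log(p_xy / ((word_counts[w] / total_words) *
                                           (word_counts[c] / total_words)))
    u, s, _ = np.linalg.svd(pmi)                 # dimensionality reduction
    seed_vectors = u[:, :dim] * s[:dim]          # seed-keyword vectors of size dim
    topic_vector = seed_vectors.mean(axis=0)     # Eqn. 4: average of seed vectors
    return seed_vectors, topic_vector
```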
  • In the Distributed Paragraph Vector model, the semantic processor extends the word vector model, referred to herein as "CBOW," in a way that adds another input to the model. That additional input to the model is a randomly initialized vector that represents the document/topic/interest the target word is part of. In that sense, the additional vector can be seen as additional memory that captures all essential information needed to represent properties of the topic/interest/document.
  • Formally, the semantic processor could predict a next word in a sequence with a softmax multiclass classifier as in Equation 5, where ywt is an un-normalized log-probability of output word wt, and each of yi is an un-normalized log-probability for each output word i.
  • $P(w_t \mid w_{t-k}, \ldots, w_{t+k}) = \dfrac{e^{y_{w_t}}}{\sum_i e^{y_i}}$  (Eqn. 5)
  • The semantic processor can compute un-normalized log-probabilities as in Equation 6, where b stands for a bias term, U is a weight matrix, and h is as shown in Equation 7, where w1, . . . , wn are word vectors from an embedding matrix W, ti is a topic embedding from embedding matrix T, and h denotes concatenation or averaging of word vectors and a topic vector.

  • $y = b + Uh$  (Eqn. 6)

  • $h = [w_1; \ldots; w_n; t_i]$  (Eqn. 7)
  • The Distributed Paragraph Vector model can share the same objective function with that of a CBOW model. Formally, given a sequence of words word1, word2, . . . , wordn, the objective of the model is to maximize an average log probability as in Equation 8, wherein c stands for word context window size.
  • $\dfrac{1}{N} \sum_{i=1}^{N} \log p(\text{word}_i \mid \text{word}_{i-c}, \ldots, \text{word}_{i-1}, \text{word}_{i+1}, \ldots, \text{word}_{i+c})$  (Eqn. 8)
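  • A compact sketch of this forward computation, assuming concatenation of the context word vectors with the topic vector, is shown below; the matrix names follow Equations 5 through 7 and the shapes are illustrative assumptions.

```python
# Forward pass of the Distributed Paragraph Vector model: h = [w_1; ...; w_n; t_i]
# (Eqn. 7), y = b + Uh (Eqn. 6), softmax over the vocabulary (Eqn. 5).
import numpy as np

def predict_target_word(context_ids, topic_id, W, T, U, b):
    """W: |V| x d word embeddings, T: #topics x d topic embeddings,
    U: |V| x (d * (len(context_ids) + 1)) weights, b: |V| biases."""
    h = np.concatenate([W[i] for i in context_ids] + [T[topic_id]])   # Eqn. 7
    y = b + U @ h                                                     # Eqn. 6
    exp_y = np.exp(y - y.max())                                       # stable softmax
    return exp_y / exp_y.sum()                                        # Eqn. 5
```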
  • An overview of a Distributed Paragraph Vector model can be seen in FIG. 3. The vocabulary Vtopic can be constructed for all high volume topics and reused as vocabulary Vshared. Embedding matrices W and T are constructed for their corresponding vocabularies, where W corresponds to a seed keywords vocabulary and T corresponds to a topic vocabulary.
  • The rest of the architecture can be constructed according to the above-mentioned equations. For data gathering purposes, a custom heuristic can be used that operates on a Common Crawl Corpus. Documents gathered from a Common Crawl process might be automatically annotated with appropriate topic tags so that the semantic processor can learn topic vectors.
  • In one specific implementation, documents are clustered inside 39 topics, ranging from Greek cuisine to gardening. For the training, 100 dimensions for topic and word vectors were used, with a window size of four words and a learning rate of 0.025. Optimization of the model can be done by stochastic gradient descent (SGD). A training sketch under these settings follows.
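  • One possible way to train such a model with these hyperparameters is sketched below; the use of the gensim library is an assumption of this sketch and is not specified by the disclosure.

```python
# Train a distributed-memory paragraph vector model with 100-dimensional
# vectors, a window of four words, and a 0.025 learning rate, optimized by SGD.
from gensim.models.doc2vec import Doc2Vec, TaggedDocument

def train_topic_embeddings(docs_by_topic):
    """docs_by_topic maps a topic tag to a list of tokenized documents."""
    tagged = [TaggedDocument(words=doc, tags=[topic])
              for topic, docs in docs_by_topic.items() for doc in docs]
    model = Doc2Vec(tagged, dm=1, vector_size=100, window=4,
                    alpha=0.025, min_count=2, epochs=20, workers=4)
    # model.dv[topic] holds a learned topic vector; model.wv holds word vectors
    # (attribute names as in gensim >= 4.0).
    return model
```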
  • The selection module is an iterative procedure of ranking and generating new keyword ideas. An overview of a selection module process is shown in FIG. 4. Two main parts of the selection module are the ranker and the keyword suggester. The ranker might score keyword sets by a predetermined performance metric and might also filter non-relevant keywords. In the semantic processor, the ranker can be implemented in a way that the keyword set score is computed by summing individual keyword scores from the set.
  • A keyword set score is illustrated by Equation 10. The final keyword set should contain keywords that have a search volume that can lead to user conversion and a low cost-per-click, so that the set boosts the effectiveness of a campaign. A keyword performance metric might be computed from a reciprocal of a keyword cost-per-click value multiplied by a normalized keyword search-volume value. The result can be used as a keyword score for a computer process that evaluates keywords. With the keyword score, the semantic processor can rule out keywords that have a high cost-per-click value by taking a reciprocal of the keyword cost per click and favoring low-cost/moderate-cost keywords with sufficient search volume. The keyword score might be generated as in Equation 9, where d is a distance function, wkeyword is a keyword vector and tinitial is a topic vector of interest. In other variations, different distance functions might be used. The distance function might be cosine distance, Euclidean distance, or some natural language processing distance function.
  • $\text{Keyword}_{\text{score}} = \dfrac{1}{\text{Keyword}_{\text{cpc}}} \cdot \text{Keyword}_{\text{NSV}} \cdot \dfrac{1}{d(w_{\text{keyword}}, t_{\text{initial}})}$  (Eqn. 9)
  • $\text{KeywordSet}_{\text{score}} = \sum_{i=1}^{N} \text{Keyword}_{\text{score}_i}$  (Eqn. 10)
  • The array of keywords might be updated by calculating a keyword score for the array. The keyword score might be calculated as a normalized search volume divided by a keyword cost per click. The distance term indicates how related a selected keyword is to the original topic. A high distance penalizes the keyword score, since the keyword is not related to the initial topics, while a small distance means that the keyword is highly related to the original topic and is therefore beneficial to include in the set. A minimal scoring sketch follows.
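  • A minimal sketch of this scoring, assuming cosine distance and reusing the hypothetical field names from the KeywordRecord sketch above, might look as follows.

```python
# Keyword score (Eqn. 9) and keyword set score (Eqn. 10); eps guards against
# division by zero for free or perfectly on-topic keywords.
import numpy as np

def keyword_score(cpc, normalized_search_volume, keyword_vec, topic_vec, eps=1e-9):
    cos_sim = float(np.dot(keyword_vec, topic_vec) /
                    (np.linalg.norm(keyword_vec) * np.linalg.norm(topic_vec)))
    distance = 1.0 - cos_sim                      # d(w_keyword, t_initial)
    return (1.0 / max(cpc, eps)) * normalized_search_volume / max(distance, eps)

def keyword_set_score(scores):
    return sum(scores)                            # Eqn. 10: sum of keyword scores
```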
  • The keyword suggester might be programmed to process a heuristic that operates on large keyword databases and to retrieve keyword suggestions based on the defined input. The semantic processor might use a third-party service, such as the Google Targeting Idea service, as the keyword suggester module. That service allows the semantic processor to retrieve targeting keyword ideas from various parameters, such as keyword list, location, language, product category, and others.
  • The selection module iterates between the ranker and the keyword suggester until there is no further improvement in the keyword set or a desired number of optimization rounds is reached, as sketched below. The final set from the iteration is output in a form that can be used in PPC campaign targeting settings and other uses.
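  • The iteration might be organized roughly as in the following sketch, where `suggest_keywords` and `rank_keywords` stand in for the keyword suggester and ranker described above; the improvement threshold and round limit are assumptions.

```python
# Alternate between suggesting new keyword ideas and ranking/filtering them
# until the set score stops improving or a round limit is reached.
def select_keywords(seed_keywords, suggest_keywords, rank_keywords,
                    improvement_threshold=0.01, max_rounds=10):
    keywords = list(seed_keywords)
    best_score = float("-inf")
    for _ in range(max_rounds):
        candidates = keywords + suggest_keywords(keywords)   # new keyword ideas
        keywords, score = rank_keywords(candidates)          # rank and filter
        if score - best_score < improvement_threshold:
            break                                            # no further improvement
        best_score = score
    return keywords
```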
  • FIG. 5 illustrates an example of a keyword list data structure.
  • FIG. 6 is a block diagram of various components, including inputs from human users and computer processes. As shown there, an advertisement designer 606 might provide user input to a visual ad generator 604. The visual ad generator 604 could generate campaign data 610 that can be passed to a campaign storage and management system 612. The visual ad generator 604 can also pull in campaign data previously stored.
  • A marketing reviewer 616 can provide feedback to a keyword learning system 614 that can get feedback also from an A/B testing system 618. A budget reviewer 620 can submit a budget plan to a budget allocator system 622 that interacts with an adjustment triggering system 624, which can in turn provide feedback to the visual ad generator 604.
  • FIG. 7 is a block diagram that illustrates a computer system 700 upon which an embodiment of the invention may be implemented. Computer system 700 includes a bus 702 or other communication mechanism for communicating information, and a processor 704 coupled with bus 702 for processing information. Processor 704 may be, for example, a general purpose microprocessor.
  • Computer system 700 also includes a main memory 706, such as a random access memory (RAM) or other dynamic storage device, coupled to bus 702 for storing information and instructions to be executed by processor 704. Main memory 706 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 704. Such instructions, when stored in non-transitory storage media accessible to processor 704, render computer system 700 into a special-purpose machine that is customized to perform the operations specified in the instructions.
  • Computer system 700 further includes a read only memory (ROM) 708 or other static storage device coupled to bus 702 for storing static information and instructions for processor 704. A storage device 710, such as a magnetic disk or optical disk, is provided and coupled to bus 702 for storing information and instructions.
  • Computer system 700 may be coupled via bus 702 to a display 712, such as a computer monitor, for displaying information to a computer user. An input device 714, including alphanumeric and other keys, is coupled to bus 702 for communicating information and command selections to processor 704. Another type of user input device is cursor control 716, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 704 and for controlling cursor movement on display 712. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.
  • Computer system 700 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 700 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 700 in response to processor 704 executing one or more sequences of one or more instructions contained in main memory 706. Such instructions may be read into main memory 706 from another storage medium, such as storage device 710. Execution of the sequences of instructions contained in main memory 706 causes processor 704 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
  • The term “storage media” as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 710. Volatile media includes dynamic memory, such as main memory 706. Common forms of storage media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, and any other memory chip or cartridge.
  • Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 702. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
  • Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 704 for execution. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a network connection. A modem or network interface local to computer system 700 can receive the data. Bus 702 carries the data to main memory 706, from which processor 704 retrieves and executes the instructions. The instructions received by main memory 706 may optionally be stored on storage device 710 either before or after execution by processor 704.
  • Computer system 700 also includes a communication interface 718 coupled to bus 702. Communication interface 718 provides a two-way data communication coupling to a network link 720 that is connected to a local network 722. For example, communication interface 718 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. Wireless links may also be implemented. In any such implementation, communication interface 718 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
  • Network link 720 typically provides data communication through one or more networks to other data devices. For example, network link 720 may provide a connection through local network 722 to a host computer 724 or to data equipment operated by an Internet Service Provider (ISP) 726. ISP 726 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet” 728. Local network 722 and Internet 728 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 720 and through communication interface 718, which carry the digital data to and from computer system 700, are example forms of transmission media.
  • Computer system 700 can send messages and receive data, including program code, through the network(s), network link 720 and communication interface 718. In the Internet example, a server 730 might transmit a requested code for an application program through Internet 728, ISP 726, local network 722 and communication interface 718. The received code may be executed by processor 704 as it is received, and/or stored in storage device 710, or other non-volatile storage for later execution.
  • FIG. 8 illustrates an example of memory elements that might be used by a processor to implement elements of the embodiments described herein. For example, where a functional block is referenced, it might be implemented as program code stored in memory. FIG. 8 is a simplified functional block diagram of a storage device 848 having an application that can be accessed and executed by a processor in a computer system. The application can be one or more of the applications described herein, running on servers, clients or other platforms or devices, and might represent memory of one of the clients and/or servers illustrated elsewhere. Storage device 848 can be one or more memory devices that can be accessed by a processor, and storage device 848 can have stored thereon application code 850 that can be configured to store one or more processor readable instructions. The application code 850 can include application logic 852, library functions 854, and file I/O functions 856 associated with the application.
  • Storage device 848 can also include application variables 862 that can include one or more storage locations configured to receive input variables 864. The application variables 862 can include variables that are generated by the application or otherwise local to the application. The application variables 862 can be generated, for example, from data retrieved from an external source, such as a user or an external device or application. The processor can execute the application code 850 to generate the application variables 862 provided to storage device 848.
  • One or more memory locations can be configured to store device data 866. Device data 866 can include data that is sourced by an external source, such as a user or an external device. Device data 866 can include, for example, records being passed between servers prior to being transmitted or after being received. Other data 868 might also be supplied.
  • Storage device 848 can also include a log file 880 having one or more storage locations 884 configured to store results of the application or inputs provided to the application. For example, the log file 880 can be configured to store a history of actions.
  • The memory elements of FIG. 8 might be used for a server or computer that interfaces with a user, generates keyword lists, and/or manages other aspects of a process described herein.
  • Operations of processes described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. Processes described herein (or variations and/or combinations thereof) may be performed under the control of one or more computer systems configured with executable instructions and may be implemented as code (e.g., executable instructions, one or more computer programs or one or more applications) executing collectively on one or more processors, by hardware or combinations thereof. The code may be stored on a computer-readable storage medium, for example, in the form of a computer program comprising a plurality of instructions executable by one or more processors. The computer-readable storage medium may be non-transitory.
  • Conjunctive language, such as phrases of the form “at least one of A, B, and C,” or “at least one of A, B and C,” unless specifically stated otherwise or otherwise clearly contradicted by context, is otherwise understood with the context as used in general to present that an item, term, etc., may be either A or B or C, or any nonempty subset of the set of A and B and C. For instance, in the illustrative example of a set having three members, the conjunctive phrases “at least one of A, B, and C” and “at least one of A, B and C” refer to any of the following sets: {A}, {B}, {C}, {A, B}, {A, C}, {B, C}, {A, B, C}. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of A, at least one of B and at least one of C each to be present.
  • The use of any and all examples, or exemplary language (e.g., “such as”) provided herein, is intended merely to better illuminate embodiments of the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the invention.
  • In the foregoing specification, embodiments of the invention have been described with reference to numerous specific details that may vary from implementation to implementation. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. The sole and exclusive indicator of the scope of the invention, and what is intended by the applicants to be the scope of the invention, is the literal and equivalent scope of the set of claims that issue from this application, in the specific form in which such claims issue, including any subsequent correction.
  • Further embodiments can be envisioned to one of ordinary skill in the art after reading this disclosure. In other embodiments, combinations or sub-combinations of the above-disclosed invention can be advantageously made. The example arrangements of components are shown for purposes of illustration and it should be understood that combinations, additions, re-arrangements, and the like are contemplated in alternative embodiments of the present invention. Thus, while the invention has been described with respect to exemplary embodiments, one skilled in the art will recognize that numerous modifications are possible.
  • For example, the processes described herein may be implemented using hardware components, software components, and/or any combination thereof. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. It will, however, be evident that various modifications and changes may be made thereunto without departing from the broader spirit and scope of the invention as set forth in the claims and that the invention is intended to cover all modifications and equivalents within the scope of the following claims.
  • All references, including publications, patent applications, and patents, cited herein are hereby incorporated by reference to the same extent as if each reference were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.

Claims (5)

What is claimed is:
1. A computer-implemented method of generating, from an array of topic records, an output array of keywords for use in targeting online advertisements related to topics represented by the array of topic records, the method comprising:
obtaining the array of topic records in computer-readable form;
storing the array of topic records as an array of topic vectors, wherein topic vectors encode for topics;
obtaining data that is used to represent word vectors and storing the data as an array of word vectors;
determining, for each word vector of a plurality of word vectors in the array of word vectors and topic vector of a plurality of topic vectors in the array of topic vectors, a relevancy value for the word and topic;
classifying the topic vectors in the plurality of topic vectors into a high-volume class or a low-volume class;
generating and storing an array of seed keywords derived by sampling from an embedded space;
ranking the seed keywords to form an array of ranked keywords;
updating the array of ranked keywords based on a keyword score that can be computed from measurable metrics;
iterating the ranking and updating at least once;
evaluating an optimization improvement value for an iteration; and
when either the optimization improvement value for the iteration is below a pre-determined threshold or the maximum time limit for the optimization is reached, generating the output array of keywords from the array of ranked keywords.
2. The computer-implemented method of claim 1, wherein classifying the topic vectors in the plurality of topic vectors into a high-volume class comprises a paragraph vector model step and classifying the topic vectors in the plurality of topic vectors into a low-volume class comprises a PMI-SVD model step.
3. The computer-implemented method of claim 1, wherein obtaining the array of topic records in computer-readable form comprises:
sending user interface data representing a user interface to a user device;
obtaining a user reply from the user device; and
generating the array of topic records from the user reply.
4. The computer-implemented method of claim 1, wherein determining the relevancy value for the word and topic comprises computing a cosine distance between the word vector and the topic vector.
5. A non-transitory computer-readable storage medium having stored thereon executable instructions that, when executed by one or more processors of a computer system, cause the computer system to at least:
perform the operations described in claim 1.
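For readers who want a concrete picture of the claimed flow, the following is a minimal, non-authoritative sketch of how the relevancy computation of claim 4 (cosine distance between word vectors and topic vectors) and the rank/update/stop loop of claim 1 might be implemented. All names, the seed-sampling strategy, the nearest-neighbour update, and the stand-in keyword score are assumptions introduced here for illustration only; the paragraph-vector and PMI-SVD classification steps recited in claim 2 are omitted, and nothing below defines or limits the claimed method.

```python
# Illustrative sketch only. The vocabulary, embeddings, scoring metric, sampling
# strategy, and all names below are assumptions made for this example.
import time

import numpy as np


def cosine_distance(word_vector: np.ndarray, topic_vector: np.ndarray) -> float:
    """Cosine distance between a word vector and a topic vector (cf. claim 4)."""
    denom = np.linalg.norm(word_vector) * np.linalg.norm(topic_vector) + 1e-12
    return 1.0 - float(np.dot(word_vector, topic_vector) / denom)


def relevancy_matrix(word_vectors: np.ndarray, topic_vectors: np.ndarray) -> np.ndarray:
    """Relevancy value for every (word, topic) pair, here 1 - cosine distance."""
    w = word_vectors / (np.linalg.norm(word_vectors, axis=1, keepdims=True) + 1e-12)
    t = topic_vectors / (np.linalg.norm(topic_vectors, axis=1, keepdims=True) + 1e-12)
    return w @ t.T  # shape (num_words, num_topics), values in [-1, 1]


def generate_output_keywords(words, word_vectors, topic_vectors, n_seeds=50,
                             top_k=10, improvement_threshold=1e-3,
                             max_seconds=10.0, seed=0):
    """Rank seed keywords, update, and iterate until the improvement falls
    below a threshold or a time limit is reached (cf. the loop of claim 1)."""
    rng = np.random.default_rng(seed)
    relevancy = relevancy_matrix(word_vectors, topic_vectors)
    # Stand-in keyword score: best relevancy over all topics. A real system
    # would also fold in measurable metrics such as volume or cost (assumption).
    word_scores = relevancy.max(axis=1)
    w_norm = word_vectors / (np.linalg.norm(word_vectors, axis=1, keepdims=True) + 1e-12)
    # "Sampling from an embedded space" is approximated by sampling word indices.
    keywords = set(int(i) for i in
                   rng.choice(len(words), size=min(n_seeds, len(words)), replace=False))
    best_total, start = -np.inf, time.monotonic()
    while True:
        ranked = sorted(keywords, key=lambda i: -word_scores[i])   # ranking step
        top = ranked[:top_k]
        # Update step: expand the set with nearest embedding-space neighbours
        # of the current top keywords, then re-evaluate the aggregate score.
        sims = w_norm @ w_norm[top].T
        for col, q in enumerate(top):
            sims[q, col] = -np.inf            # exclude the keyword itself
            keywords.add(int(np.argmax(sims[:, col])))
        total = float(sum(word_scores[i] for i in top))
        improvement = total - best_total
        best_total = max(best_total, total)
        # Stop when the improvement is below the threshold or time runs out.
        if improvement < improvement_threshold or time.monotonic() - start > max_seconds:
            break
    return [words[i] for i in sorted(keywords, key=lambda i: -word_scores[i])]


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    vocab = [f"kw_{i}" for i in range(500)]        # hypothetical vocabulary
    word_vecs = rng.normal(size=(500, 64))         # stand-in word embeddings
    topic_vecs = rng.normal(size=(3, 64))          # stand-in topic vectors
    print(generate_output_keywords(vocab, word_vecs, topic_vecs)[:10])
```

Running the example with random stand-in embeddings prints ten highest-scoring hypothetical keywords; with real word and topic embeddings, the same loop structure would produce an output array of ranked keywords of the kind described in claim 1.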
US16/803,214 2019-02-27 2020-02-27 Generating Keyword Lists Related to Topics Represented by an Array of Topic Records, for Use in Targeting Online Advertisements and Other Uses Pending US20200273069A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/803,214 US20200273069A1 (en) 2019-02-27 2020-02-27 Generating Keyword Lists Related to Topics Represented by an Array of Topic Records, for Use in Targeting Online Advertisements and Other Uses

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962811444P 2019-02-27 2019-02-27
US16/803,214 US20200273069A1 (en) 2019-02-27 2020-02-27 Generating Keyword Lists Related to Topics Represented by an Array of Topic Records, for Use in Targeting Online Advertisements and Other Uses

Publications (1)

Publication Number Publication Date
US20200273069A1 true US20200273069A1 (en) 2020-08-27

Family

ID=69784498

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/803,214 Pending US20200273069A1 (en) 2019-02-27 2020-02-27 Generating Keyword Lists Related to Topics Represented by an Array of Topic Records, for Use in Targeting Online Advertisements and Other Uses
US16/803,279 Active US11727438B2 (en) 2019-02-27 2020-02-27 Method and system for comparing human-generated online campaigns and machine-generated online campaigns based on online platform feedback

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/803,279 Active US11727438B2 (en) 2019-02-27 2020-02-27 Method and system for comparing human-generated online campaigns and machine-generated online campaigns based on online platform feedback

Country Status (2)

Country Link
US (2) US20200273069A1 (en)
WO (2) WO2020174439A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230010334A1 (en) * 2021-07-08 2023-01-12 The Literal Company System and method for implementing a search engine access point enhanced for retailer brand suggested listing navigation

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11128724B1 (en) * 2020-03-09 2021-09-21 Adobe Inc. Real-time interactive event analytics
CN112163071A (en) * 2020-09-28 2021-01-01 广州数鹏通科技有限公司 Unsupervised learning analysis method and system for information correlation degree of emergency
WO2022130524A1 (en) * 2020-12-16 2022-06-23 株式会社日立製作所 Target selection system, target selection method, and target selection program

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080140476A1 (en) * 2006-12-12 2008-06-12 Shubhasheesh Anand Smart advertisement generating system
US8768960B2 (en) * 2009-01-20 2014-07-01 Microsoft Corporation Enhancing keyword advertising using online encyclopedia semantics
US20100257022A1 (en) * 2009-04-07 2010-10-07 Yahoo! Inc. Finding Similar Campaigns for Internet Advertisement Targeting
US9348935B2 (en) 2010-06-29 2016-05-24 Vibrant Media, Inc. Systems and methods for augmenting a keyword of a web page with video content
US20120004983A1 (en) * 2010-06-30 2012-01-05 Cbs Interactive Inc. Systems and methods for determining the efficacy of advertising
US20130080244A1 (en) * 2011-09-22 2013-03-28 Sitecore A/S Method and a system for managing advertising campaigns
WO2013151546A1 (en) * 2012-04-05 2013-10-10 Thomson Licensing Contextually propagating semantic knowledge over large datasets
US20140258001A1 (en) * 2013-03-08 2014-09-11 DataPop, Inc. Systems and Methods for Determining Net-New Keywords in Expanding Live Advertising Campaigns in Targeted Advertising Systems
US20160210655A1 (en) * 2015-01-20 2016-07-21 Facebook, Inc. Managing Content Item Presentation Based On Cost of Presenting the Content Items and Completion of Objectives Associated with the Content Items
US10937054B2 (en) * 2018-06-15 2021-03-02 The Nielsen Company (Us), Llc Methods, systems, apparatus and articles of manufacture to determine causal effects

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050137939A1 (en) * 2003-12-19 2005-06-23 Palo Alto Research Center Incorporated Server-based keyword advertisement management
US20070067281A1 (en) * 2005-09-16 2007-03-22 Irina Matveeva Generalized latent semantic analysis
US20080201324A1 (en) * 2007-02-20 2008-08-21 Kenshoo Ltd. Computer implemented system and method for enhancing keyword expansion
US20140236715A1 (en) * 2013-02-20 2014-08-21 Kenshoo Ltd. Targeted advertising in social media networks
US20180060437A1 (en) * 2016-08-29 2018-03-01 EverString Innovation Technology Keyword and business tag extraction
US20190138615A1 (en) * 2017-11-07 2019-05-09 Thomson Reuters Global Resources Unlimited Company System and methods for context aware searching
US10847140B1 (en) * 2018-11-02 2020-11-24 Noble Systems Corporation Using semantically related search terms for speech and text analytics

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Concatenate definition, from Merriam-Webster [online], downloaded from https://www.merriam-webster.com/dictionary/concatenate on 18 October 2023 (Year: 2023) *
CONCATENATE function, from Microsoft Support [online], downloaded from https://support.microsoft.com/en-us/office/concatenate-function-8f8ae884-2ca8-4f7a-b093-75d702bea31d on 18 October 2023 (Year: 2023) *
Jurafsky, Daniel and Martin, James H., Speech and Language Processing, Chap. 19, dated 9 July 2015, downloaded via the Archive.org WayBack Machine at https://web.archive.org/web/20150714045336/http://web.stanford.edu/~jurafsky/slp3/19.pdf on 17 October 2023 (Year: 2015) *
Jurafsky, Daniel, LSA 311: Computational Lexical Semantics, dated Summer 2015, downloaded 17 October 2023 via the Archive.org WayBack Machine at https://web.archive.org/web/20150711211959/https://web.stanford.edu/~jurafsky/li15/ (Year: 2015) *
NSS, An Intuitive Understanding of Word Embeddings: From Count Vectors to Word2Vec, Analytics Vidhya [online], dated 4 June 2017, retrieved via the Web Archive WayBack Machine on 16 February 2023 (Year: 2017) *
Springer Link, Embedding Space, in Li, S.Z., Jain, A. (eds.), Encyclopedia of Biometrics, Springer, Boston, MA (2009), retrieved from https://link.springer.com/referenceworkentry/10.1007/978-0-387-73003-5_573 on 16 February 2023 (Year: 2009) *

Also Published As

Publication number Publication date
US20200273064A1 (en) 2020-08-27
US11727438B2 (en) 2023-08-15
WO2020174439A1 (en) 2020-09-03
WO2020174441A1 (en) 2020-09-03

Similar Documents

Publication Publication Date Title
US20200273069A1 (en) Generating Keyword Lists Related to Topics Represented by an Array of Topic Records, for Use in Targeting Online Advertisements and Other Uses
US11204972B2 (en) Comprehensive search engine scoring and modeling of user relevance
US8572099B2 (en) Advertiser and user association
US8055664B2 (en) Inferring user interests
US8572011B1 (en) Outcome estimation models trained using regression and ranking techniques
US8799260B2 (en) Method and system for generating web pages for topics unassociated with a dominant URL
US9589277B2 (en) Search service advertisement selection
US20120066073A1 (en) User interest analysis systems and methods
US20110289025A1 (en) Learning user intent from rule-based training data
US20140289239A1 (en) Recommendation tuning using interest correlation
US20120054040A1 (en) Adaptive Targeting for Finding Look-Alike Users
US11295375B1 (en) Machine learning based computer platform, computer-implemented method, and computer program product for finding right-fit technology solutions for business needs
EP2188712A2 (en) Recommendation systems and methods
US20150026105A1 (en) Systems and method for determining influence of entities with respect to contexts
US11164153B1 (en) Generating skill data through machine learning
US20110131093A1 (en) System and method for optimizing selection of online advertisements
US20160350669A1 (en) Blending content pools into content feeds
US10990643B2 (en) Automatically linking pages in a website
US20190130360A1 (en) Model-based recommendation of career services
US11366817B2 (en) Intent based second pass ranker for ranking aggregates
Yuan Supply side optimisation in online display advertising
US20160350310A1 (en) Segment-based content pools for inclusion in content feeds
Bi Neural Approaches to Feedback in Information Retrieval
Myrberg A Recommender System for an Online Auction
Battelle Understanding Customer Intent for Keyphrase Selection

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general; Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP Information on status: patent application and granting procedure in general; Free format text: NON FINAL ACTION MAILED
STPP Information on status: patent application and granting procedure in general; Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP Information on status: patent application and granting procedure in general; Free format text: FINAL REJECTION MAILED
STPP Information on status: patent application and granting procedure in general; Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER
STPP Information on status: patent application and granting procedure in general; Free format text: ADVISORY ACTION MAILED
STPP Information on status: patent application and granting procedure in general; Free format text: NON FINAL ACTION MAILED
STPP Information on status: patent application and granting procedure in general; Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP Information on status: patent application and granting procedure in general; Free format text: FINAL REJECTION MAILED