US20230262016A1 - Methods and systems for generating a virtual assistant in a messaging user interface - Google Patents
- Publication number
- US20230262016A1 (U.S. application Ser. No. 18/138,522)
- Authority
- US
- United States
- Prior art keywords
- user
- message
- computing device
- agenda
- response
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H04L51/18—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail: commands or executable codes
- G06F16/316—Information retrieval of unstructured textual data: indexing structures
- G06N20/20—Machine learning: ensemble learning
- G06N3/006—Artificial life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
- H04L51/02—User-to-user messaging using automatic reactions or user delegation, e.g. automatic replies or chatbot-generated messages
- H04L51/046—Real-time or near real-time messaging, e.g. instant messaging [IM]: interoperability with other network applications or services
- G06N5/01—Dynamic search techniques; heuristics; dynamic trees; branch-and-bound
- G06N7/01—Probabilistic graphical models, e.g. probabilistic networks
Abstract
In an aspect, a system for generating a virtual assistant in a messaging user interface, the system comprising a computing device designed and configured to initiate a virtual message user interface between a user client device and the computing device; receive a user message entered by a user into the virtual message user interface; select a conversation profile for the virtual assistant as a function of the user message, wherein the conversation profile comprises behavior that the user uses to communicate; prioritize an agenda action of a plurality of agenda actions contained within a user agenda list as a function of the user message; and generate a response to the user message comprising the prioritized agenda action, wherein the response is further generated as a function of the selected conversation profile.
Description
- This application is a continuation of Non-provisional application Ser. No. 16/912,040, filed on Jun. 25, 2020, and entitled “METHODS AND SYSTEMS FOR GENERATING A VIRTUAL ASSISTANT IN A MESSAGING USER INTERFACE,” the entirety of which is incorporated herein by reference.
- The present invention generally relates to the field of computing. In particular, the present invention is directed to methods and systems for generating a virtual assistant in a messaging user interface.
- Keeping track of, prioritizing, and completing one's agenda actions can be difficult. Frequently, agenda actions can be forgotten or lost within a multitude of agenda actions that need to be completed. Further, integrating agenda actions relating to all aspects of one's responsibilities can be challenging.
- In an aspect, a system for generating a virtual assistant in a messaging user interface, the system comprising a computing device designed and configured to initiate a virtual message user interface between a user client device and the computing device; receive a user message entered by a user into the virtual message user interface; select a conversation profile for the virtual assistant as a function of the user message, wherein the conversation profile comprises behavior that the user uses to communicate; prioritize an agenda action of a plurality of agenda actions contained within a user agenda list as a function of the user message; and generate a response to the user message comprising the prioritized agenda action, wherein the response is further generated as a function of the selected conversation profile.
- In an aspect, a method for generating a virtual assistant in a messaging user interface, the method comprising initiating, by a computing device, a virtual message user interface between a user client device and the computing device; receiving a user message entered by a user into the virtual message user interface; selecting a conversation profile for the virtual assistant as a function of the user message, wherein the conversation profile comprises behavior that the user uses to communicate; prioritizing an agenda action of a plurality of agenda actions contained within a user agenda list as a function of the user message; and generating a response to the user message comprising the prioritized agenda action, wherein the response is further generated as a function of the selected conversation profile.
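The claimed steps can be sketched as a minimal pipeline. This is an illustrative sketch only: every function name, keyword list, and scoring rule below is a hypothetical stand-in, not the disclosed implementation.

```python
# Hypothetical sketch of the claimed method: select a conversation profile,
# prioritize an agenda action, and generate a response from both.

def select_conversation_profile(user_message: str) -> str:
    # stand-in for profile selection as a function of the user message
    text = user_message.lower()
    if any(w in text for w in ("please", "maybe", "perhaps")):
        return "indirect"
    return "direct"

def prioritize_agenda_action(agenda_list, user_message):
    # stand-in prioritization: prefer actions sharing words with the message
    words = set(user_message.lower().split())
    return max(agenda_list, key=lambda a: len(words & set(a.lower().split())))

def generate_response(user_message, agenda_list):
    profile = select_conversation_profile(user_message)
    action = prioritize_agenda_action(agenda_list, user_message)
    if profile == "direct":
        return f"Next up: {action}."
    return f"When you have a moment, you might want to: {action}."

agenda = ["order coffee for the meeting", "weed the garden"]
print(generate_response("Please help me plan the meeting", agenda))
```

In a real embodiment the profile selection and prioritization would be driven by the linguistic analysis and machine-learning components described later in the disclosure, not by keyword overlap.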
- These and other aspects and features of non-limiting embodiments of the present invention will become apparent to those skilled in the art upon review of the following description of specific non-limiting embodiments of the invention in conjunction with the accompanying drawings.
- For the purpose of illustrating the invention, the drawings show aspects of one or more embodiments of the invention. However, it should be understood that the present invention is not limited to the precise arrangements and instrumentalities shown in the drawings, wherein:
- FIG. 1 is a block diagram illustrating an exemplary embodiment of a system for generating a virtual assistant in a messaging user interface;
- FIG. 2 is a block diagram illustrating an exemplary embodiment of a user database;
- FIG. 3 is a diagrammatic representation of an exemplary embodiment of a conversation profile;
- FIG. 4 is a block diagram illustrating an exemplary embodiment of a response database;
- FIG. 5 is a diagrammatic representation of an exemplary embodiment of a context classifier;
- FIG. 6 is a diagrammatic representation of an exemplary embodiment of a conversation;
- FIG. 7 is a block diagram illustrating an exemplary embodiment of a method of generating a virtual assistant in a messaging user interface; and
- FIG. 8 is a block diagram of a computing system that can be used to implement any one or more of the methodologies disclosed herein and any one or more portions thereof.
- The drawings are not necessarily to scale and may be illustrated by phantom lines, diagrammatic representations, and fragmentary views. In certain instances, details that are not necessary for an understanding of the embodiments or that render other details difficult to perceive may have been omitted.
- At a high level, aspects of the present disclosure are directed to systems and methods for generating a virtual assistant in a messaging user interface. In an embodiment, a virtual message user interface is initiated between a user client device and a computing device. A user message is entered by a user into the messaging user interface. The computing device analyzes the user message and generates a user-action learner to identify and generate a response to the user message.
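The high-level exchange above can be pictured as a loop that receives user messages and returns generated responses. The sketch below is a hypothetical illustration: the `respond` function and its coffee heuristic are invented stand-ins for the analysis and response generation described in the rest of the disclosure, and a real embodiment would present the interface through a web, native, or mobile application rather than a plain function.

```python
# Minimal sketch of a virtual message user interface as a request/response
# loop. The reply logic is an invented placeholder.

def respond(user_message: str) -> str:
    # stand-in for user-message analysis and response generation
    if "coffee" in user_message.lower():
        return "What size, and where would you like to order from?"
    return "Got it. I've noted that."

transcript = []
for incoming in ["Order coffee", "Cancel my Thursday meetings"]:
    reply = respond(incoming)
    transcript.append((incoming, reply))

print(transcript[0][1])
```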
- Referring now to
FIG. 1, an exemplary embodiment of a system 100 for generating a virtual assistant in a messaging user interface is illustrated. System 100 includes a computing device 104. Computing device 104 may include any computing device 104 as described in this disclosure, including without limitation a microcontroller, microprocessor, digital signal processor (DSP), and/or system on a chip (SoC) as described in this disclosure. Computing device 104 may include, be included in, and/or connect with a mobile device such as a mobile telephone or smartphone. Computing device 104 may include a single computing device 104 operating independently or may include two or more computing devices 104 operating in concert, in parallel, sequentially, or the like; two or more computing devices 104 may be included together in a single computing device 104 or in two or more computing devices 104. Computing device 104 may interface or connect with one or more additional devices as described below in further detail via a network interface device. A network interface device may be utilized for connecting computing device 104 to one or more of a variety of networks, and one or more devices. Examples of a network interface device include, but are not limited to, a network interface card (e.g., a mobile network interface card, a LAN card), a modem, and any combination thereof. Examples of a network include, but are not limited to, a wide area network (e.g., the Internet, an enterprise network), a local area network (e.g., a network associated with an association, a building, a campus, or other relatively small geographic space), a telephone network, a data network associated with a telephone/voice provider (e.g., a mobile communications provider data and/or voice network), a direct connection between two computing devices 104, and any combinations thereof. A network may employ a wired and/or a wireless mode of communication. In general, any network topology may be used. Information (e.g., data, software, etc.)
may be transmitted to and/or from a computer and/or a computing device 104. Computing device 104 may include, but is not limited to, for example, a computing device 104 or cluster of computing devices 104 in a first position and a second computing device 104 or cluster of computing devices 104 in a second position. Computing device 104 may include one or more computing devices 104 dedicated to data storage, security, dispersal of traffic for load balancing, and the like. Computing device 104 may distribute one or more computing tasks as described below across a plurality of computing devices 104 of computing device 104, which may operate in parallel, in series, redundantly, or in any other manner used for dispersal of tasks or memory between computing devices 104. Computing device 104 may be implemented using a "shared nothing" architecture in which data is cached at the operative; in an embodiment, this may enable scalability of system 100 and/or computing device 104. - Continuing to refer to
FIG. 1, computing device 104 may be designed and/or configured to perform any method, method step, or sequence of method steps in any embodiment described in this disclosure, in any order and with any degree of repetition. For instance, computing device 104 may be configured to perform a single step or sequence recurrently until a desired or commanded outcome is achieved; repetition of a step or a sequence of steps may be performed iteratively and/or recursively, using outputs of previous repetitions as inputs to subsequent repetitions, assembling inputs and/or outputs of repetitions to produce an aggregate result, reduction or decrement of one or more variables such as global variables, and/or division of a larger processing task into a set of iteratively addressed smaller processing tasks. Computing device 104 may perform any step or sequence of steps as described in this disclosure in parallel, such as simultaneously and/or substantially simultaneously performing a step two or more times using two or more parallel threads, processor cores, or the like; division of tasks between parallel threads and/or processes may be performed according to any protocol suitable for division of tasks between iterations. Persons skilled in the art, upon reviewing the entirety of this disclosure, will be aware of various ways in which steps, sequences of steps, processing tasks, and/or data may be subdivided, shared, or otherwise dealt with using iteration, recursion, and/or parallel processing. - With continued reference to
FIG. 1, computing device 104 is configured to initiate a virtual message user interface 108 between a user device and computing device 104. A "virtual message user interface," as used in this disclosure, is a messaging service for communication within a user interface. A messaging service may include an application, script, and/or program capable of generating messages using language, including any numeric, symbolic, and/or character-based language. A user interface allows users to interface with electronic devices through graphical icons and audio indicators, including primary notation, text-based user interfaces, typed command labels, and/or text navigation. Virtual message user interface 108 may include a form or other graphical element having display fields, where one or more elements of information may be displayed. Virtual message user interface 108 may include sliders or other user commands that may allow a user to select one or more characters. Virtual message user interface 108 may include free-form textual entries, where a user may type in responses and/or messages. Virtual message user interface 108 may display data output fields including text, images, or the like containing one or more messages. Virtual message user interface 108 may include data input fields such as text entry windows, drop-down lists, buttons, checkboxes, radio buttons, sliders, links, or any other data input interface that may capture user interaction, as may occur to persons skilled in the art upon reviewing the entirety of this disclosure. Virtual message user interface 108 may be provided, without limitation, using a web browser, a native application, a mobile application, or the like. - With continued reference to
FIG. 1, computing device 104 initiates virtual message user interface 108 with user client device 112. A "user client device," as used in this disclosure, is any computing device, including a mobile device such as a smartphone and/or tablet, laptop, desktop, and/or any other type of computing device. User client device 112 is operated by a user 116. A "user," as used in this disclosure, is any human being. Computing device 104 initiates virtual message user interface 108 utilizing any network methodology as described herein. - With continued reference to
FIG. 1, computing device 104 receives a user message 120 entered by a user into virtual message user interface 108. A "message," as used in this disclosure, is any communication, including any verbal, written, and/or recorded communication. A communication may include textual and/or non-textual communications, including one or more images, graphics, symbols, characters, and/or numerical representations of the communication. A user message 120 may contain a calendar entry. A "calendar entry," as used in this disclosure, is an entry in a user's schedule. A calendar entry may include one or more appointments that may be scheduled on a particular day and/or time of the user's schedule. A calendar entry may include a user's work-related schedule, including for example any scheduled meetings, work-related holidays, working hours, and the like. For instance and without limitation, a calendar entry may include a conference call that a user has every Wednesday morning with a supplier of parts for the user's business. A calendar entry may include a user's personal schedule, including personal appointments such as meetings with friends, medical appointments, recreational sporting activities, concerts, and the like. - With continued reference to
FIG. 1, a user message 120 may include an elemental entry. An "elemental entry," as used in this disclosure, is any communication relating to a user's body. An elemental entry may include information describing the health state of a user, including the user's physical, mental, and social well-being. An elemental entry may include entries describing any meals a user consumed over the course of the past week. For example, an elemental entry may contain a log of all meals that a user has consumed on a meals and nutritional platform over the course of the past six weeks. An elemental entry may include entries describing a user's fitness habits, including logs and/or descriptions of any workouts that a user participated in. An elemental entry may include entries describing any sleeping patterns of a user, including the number of hours that a user sleeps on average in a night, how many times the user woke throughout the night, how long it took the user to fall asleep, what time the user woke up, and the like. An elemental entry may include entries describing any social interactions a user had, such as any friends that a user met for coffee, or a support group meeting that a user attended. Information relating to an elemental entry may be received from a sensor that may be attached to user client device 112 and/or in communication with user client device 112 and/or computing device 104. A "sensor," as used in this disclosure, is any device, module, machine, and/or subsystem configured to detect events or changes in its environment and transmit the information to other devices. A sensor may include an electromagnetic sensor, including without limitation electroencephalographic sensors, magnetoencephalographic sensors, electrocardiographic sensors, electromyographic sensors, or the like. A sensor may include a weight scale. A sensor may include a temperature sensor.
A sensor may include any sensor that may be included in user client device 112 and/or a wearable device, including without limitation a motion sensor such as an inertial measurement unit (IMU), one or more accelerometers, one or more gyroscopes, one or more magnetometers, or the like. A wearable and/or user client device sensor may capture step, gait, and/or other mobility data, as well as data describing activity levels, sleep levels, and/or physical fitness. A wearable and/or user client device sensor may detect heart rate or the like. A sensor may detect any hematological parameter including blood oxygen level, pulse rate, heart rate, pulse rhythm, blood glucose, and/or blood pressure. - With continued reference to
FIG. 1, computing device 104 is configured to retrieve data relating to a user agenda list 124. An "agenda list," as used in this disclosure, is a compilation of one or more agenda actions 128 that need to be completed. An "agenda action," as used in this disclosure, is any task that needs to be completed, related to a user's personal and/or professional life. For instance and without limitation, an agenda list may contain an agenda action 128 such as to purchase a blow-up air mattress. In yet another non-limiting example, an agenda list may contain a task such as ordering coffee for a work meeting in the morning. An agenda list may be stored within user database 132. User database 132 may be implemented, without limitation, as a relational database, a key-value retrieval datastore such as a NOSQL database, or any other format or structure for use as a datastore that a person skilled in the art would recognize as suitable upon review of the entirety of this disclosure. - With continued reference to
FIG. 1, computing device 104 is configured to retrieve a previous message entered by a user. Information relating to previous messages entered by a user 116 may be stored within user database. Computing device 104 may generate a query to locate a message that was received during a certain period of time, and/or a message that relates to a specific topic or subject. A "query," as used in this disclosure, is any request generated to retrieve and/or locate information from a database. A query may include choosing parameters to generate a query from a menu of options. In such an instance, a list of parameters may be assembled, from which one or more parameters may be chosen to generate a query. A query may include a query by example, where computing device 104 presents a blank record and allows a user to specify the fields and values that define the query. A query may include query language, where a request to locate information may be a stylized query written in query language. Computing device 104 may prioritize a plurality of agenda actions contained within the user agenda list as a function of retrieving a previous message. Prioritizing agenda actions may include determining and/or arranging agenda actions that need to be completed in order of priority and/or importance. Information contained within a previous message may specify the order and/or importance of agenda actions. For instance and without limitation, computing device 104 may retrieve a previous message entered by a user that specifies that the user likes to batch cook home-cooked meals every weekend and prefers to have groceries delivered in time for the weekend, either on Thursday or Friday. In such an instance, computing device 104 may prioritize an agenda action contained within a user agenda list to go grocery shopping and prioritize grocery shopping above weeding the user's garden on a Friday afternoon. - With continued reference to
FIG. 1, computing device 104 is configured to analyze a user message 120 to identify an agenda action relating to the user message 120. Computing device 104 analyzes a user message 120 utilizing linguistic analyzer 136. A "linguistic analyzer," as used in this disclosure, is any language processing analysis module that enables computing device 104 to acquire meaning from messages. Linguistic analyzer 136 may be utilized to identify a structure, phrases, words, and/or other content or characteristics of a message entered by a user. Linguistic analyzer 136 may be utilized to determine the content and/or subject of a user message. Linguistic analyzer 136 may use identified content to generate a response to a user message, including a response intended to direct a flow of conversation. - With continued reference to
FIG. 1, linguistic analyzer 136 analyzes a user message utilizing statistical inferences to learn rules through the analysis of large corpora of documents. Linguistic analyzer 136 may generate one or more machine-learning algorithms that utilize features generated from input data. Machine-learning algorithms may include generating a decision tree. A decision tree includes a structure in which each internal node represents a "test" on an attribute, each branch represents the outcome of the test, and each leaf node represents a class label. A path from a root to a leaf may represent classification rules. Linguistic analyzer 136 may generate one or more statistical models, which may be utilized to make probabilistic decisions based on attaching real-valued weights to each input feature. - With continued reference to
FIG. 1, linguistic analyzer 136 may analyze a user message 120 utilizing syntax analysis, such as through morphological segmentation. For example, linguistic analyzer 136 may separate words into individual morphemes and identify the class of the morphemes. Linguistic analyzer 136 may analyze a user message 120 by parsing the user message. Parsing may include determining a parsing tree and/or grammatical analysis of a given sentence. Parsing may include dependency parsing, where relationships between words in a sentence are analyzed. Parsing may include constituency parsing that focuses on building out a parsing tree using a probabilistic context-free grammar, including for example a stochastic grammar. Linguistic analyzer 136 may analyze a user message utilizing other forms of syntax analysis, including sentence breaking, to find sentence boundaries; word segmentation, to separate a chunk of continuous text into separate words; lemmatization; and/or grammar induction. - With continued reference to
FIG. 1, linguistic analyzer 136 may analyze a user message 120 utilizing semantic analysis, such as by analyzing lexical semantics to determine the computational meaning of individual words in context. Semantic analysis may include distributional semantics, to analyze how semantic representations can be learned from data. Semantic analysis may include named entity recognition, to determine which items in a string of text map to names of people, places, organizations, and the like. Semantic analysis may include natural language understanding, including converting chunks of text into more formal representations such as first-order logic structures. Natural language understanding may include the identification of intended semantics from multiple possible semantics that can be derived from a natural language expression. Semantic analysis may include optical character analysis, including determining the text corresponding to an image of printed text. Semantic analysis may include other forms of analysis including optical character recognition, question answering, recognizing textual entailment, relationship extraction, sentiment analysis, topic segmentation and recognition, and/or word sense disambiguation. - With continued reference to
FIG. 1, linguistic analyzer 136 may perform discourse analysis to analyze a user message. Discourse analysis may include automatic summarization, which may include producing a readable summary of a chunk of text. Discourse analysis may include coreference resolution, which may determine, from a given sentence or chunk of text, which words refer to the same objects. Linguistic analyzer 136 may analyze a user message by performing various other forms of analysis, including for example speech recognition, speech segmentation, text-to-speech analysis, dialogue analysis, and/or cognition analysis. - With continued reference to
FIG. 1, computing device 104 identifies, using linguistic analyzer 136, a description of an agenda action contained within a user message. For example, a user message may contain an entry such as "cancel all meetings on Thursday, June 26th." Linguistic analyzer 136 identifies a description of an agenda action that includes "meetings on June 26th." Computing device 104 generates a query containing the description of the agenda action. A query includes any of the queries as described above. Computing device 104 utilizes the query to locate an agenda action 128 contained within user agenda list 124, which may be stored within user database 132. - With continued reference to
FIG. 1, computing device 104 is configured to analyze a user message 120 and identify a need for a user to provide more information. This may occur, for example, when computing device 104 is unable to identify a description of an agenda action 128 contained within a user message 120, and/or when there may be various agenda actions 128 related to a user message 120. Computing device 104 may also identify a need for a user to provide more information when information contained within a user message 120 is incomplete. For instance and without limitation, a user message 120 that contains an entry such as "order coffee" may prompt computing device 104 to identify a need for the user to provide more information, including what size coffee the user wishes to order, when the user wants the coffee ordered, what flavor and/or style of coffee the user is looking for, and where the user wants to order coffee from. Computing device 104 selects a conversation profile as a function of analyzing a user message 120. A "conversation profile," as used in this disclosure, is a compilation of one or more messages organized by style of communication. A style of communication may include one or more behaviors, or ways in which a user communicates. For example, a style of communication may include a direct style of communication, which includes explicit messages that clearly express a user's intentions. In yet another non-limiting example, a style of communication may include an affectionate style of communication, which includes emotional and sensitive messages that may contain implicit meanings. In yet another non-limiting example, a style of communication may include an indirect style of communication, which may contain messages that mask a user's intentions and needs. Computing device 104 analyzes a user message 120 to determine a user's style of communication and select a conversation profile. 
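The disclosure does not specify how a style of communication would be detected; a minimal illustrative sketch might score surface features of the message, where the feature rules, emoji set, and verb list are all hypothetical assumptions rather than part of the patent:

```python
# Hypothetical sketch: map a user message to a conversation-profile style
# using simple surface features. Rules and thresholds are illustrative only.
def select_profile(message: str) -> str:
    """Classify a message as direct, affectionate, or indirect in style."""
    # Affectionate style: emotional, emoji-heavy messages.
    emoji_count = sum(message.count(e) for e in ("😊", "❤️", "🎉"))
    if emoji_count >= 2:
        return "affectionate"
    # Direct style: explicit imperative requests.
    words = message.strip().split()
    first_word = words[0].lower() if words else ""
    if first_word in {"cancel", "schedule", "order", "book"}:
        return "direct"
    # Otherwise treat intentions as not stated outright.
    return "indirect"
```

A production system would presumably use a trained classifier over many features; this sketch only illustrates the mapping from message features to a profile.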
For instance and without limitation, a user message 120 that contains elaborate figurative language, with many metaphors and proverbs, may be determined to reflect an elaborate style of communication, while a user message 120 that contains many emojis and photographs, including heart-shaped emojis, may be determined to reflect an affectionate style of communication. - With continued reference to
FIG. 1, a conversation profile contains a plurality of conversations. A "conversation," as used in this disclosure, is a communication session comprised of a series of one or more messages exchanged between two or more parties. A plurality of conversations contained within a conversation profile may be organized by subject and/or topic. For example, a conversation profile such as a formal conversation profile may contain a plurality of conversations organized by topic, including for example friends, family, cooking, food delivery, shopping, appointments, schoolwork, and the like. Computing device 104 initiates a conversation within virtual message user interface as a function of selecting a conversation profile. In an embodiment, a conversation profile may seek to obtain more information from an analyzed message. For instance and without limitation, a user message 120 may contain a remark describing a user's sleep patterns, including descriptions of how long the user slept each night for the past seven nights. In such an instance, computing device 104 may analyze the user message 120 containing the user's sleep patterns and determine that more information is needed to analyze the user's sleep patterns over the past month, and not just over the prior week. Computing device 104 selects a conversation profile and initiates a conversation within virtual message user interface 108 to obtain more sleep pattern data. - With continued reference to
FIG. 1, computing device 104 is configured to generate a response 140 to a user message 120 as a function of analyzing the user message 120. A "response," as used in this disclosure, is a reply to a user message. A response 140 may contain a remark and/or reply to a user message 120. For instance and without limitation, a user message 120 that contains a remark to schedule a user a haircut may cause computing device 104 to generate a response 140 that contains a question asking the user whether the user needs to get the user's hair colored or is only looking for a haircut. In yet another non-limiting example, a response 140 may contain a non-textual response, such as a photograph of a scheduled and confirmed appointment for the user's haircut. In yet another non-limiting example, a response 140 may contain a graphical and/or multimedia response, including for example a response 140 containing an emoji, a photograph, a screenshot, a video, an infographic, music, an illustration, a piece of art, a graphics interchange format, and the like. - With continued reference to
FIG. 1, computing device 104 is configured to generate a response 140 to a user message 120 by generating a user-action learner 144. User-action learner 144 may generate one or more modules that learn input-output pairs pertaining to previous messages and responses from previous communications with a particular user. User-action learner 144 may include any elements suitable for use in a machine-learning process. User-action learner 144 may generate a user-action model relating a user message 120 and a previous user message to a response. User-action learner 144 may be trained by computing device 104 utilizing user training data. "Training data," as used herein, is data containing correlations that a machine-learning process may use to model relationships between two or more categories of data elements. For instance, and without limitation, training data may include a plurality of data entries, each entry representing a set of data elements that were recorded, received, and/or generated together; data elements may be correlated by shared existence in a given data entry, by proximity in a given data entry, or the like. Multiple data entries in training data may evince one or more trends in correlations between categories of data elements; for instance, and without limitation, a higher value of a first data element belonging to a first category of data element may tend to correlate to a higher value of a second data element belonging to a second category of data element, indicating a possible proportional or other mathematical relationship linking values belonging to the two categories. Multiple categories of data elements may be related in training data according to various correlations; correlations may indicate causative and/or predictive links between categories of data elements, which may be modeled as relationships such as mathematical relationships by machine-learning processes as described in further detail below. 
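The patent leaves the learner's internals open; one minimal sketch treats user training data as message-response pairs and retrieves the response whose previous message best overlaps the new one. The training pairs below are hypothetical examples, not from the disclosure:

```python
# Hypothetical sketch of a user-action learner as retrieval over
# input-output pairs: each entry correlates a previous user message with
# the response that followed it in a prior conversation.
user_training_data = [
    ("cancel my meeting", "Which meeting would you like to cancel?"),
    ("order coffee", "What size coffee would you like?"),
]

def predict_response(message: str) -> str:
    """Return the response paired with the most similar previous message."""
    words = set(message.lower().split())

    def overlap(pair):
        previous, _ = pair
        # Similarity = number of words shared with the previous message.
        return len(words & set(previous.lower().split()))

    _, response = max(user_training_data, key=overlap)
    return response
```

A full machine-learning process would fit a model rather than do nearest-match lookup; the sketch only illustrates how correlated input-output entries can drive a response.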
Training data may be formatted and/or organized by categories of data elements, for instance by associating data elements with one or more descriptors corresponding to categories of data elements. As a non-limiting example, training data may include data entered in standardized forms by persons or processes, such that entry of a given data element in a given field in a form may be mapped to one or more descriptors of categories. Elements in training data may be linked to descriptors of categories by tags, tokens, or other data elements; for instance, and without limitation, training data may be provided in fixed-length formats, formats linking positions of data to categories such as comma-separated value (CSV) formats and/or self-describing formats such as extensible markup language (XML), enabling processes or devices to detect categories of data. - Alternatively or additionally, and continuing to refer to
FIG. 1, training data may include one or more elements that are not categorized; that is, training data may not be formatted or contain descriptors for some elements of data. Machine-learning algorithms and/or other processes may sort training data according to one or more categorizations using, for instance, natural language processing algorithms, tokenization, detection of correlated values in raw data, and the like; categories may be generated using correlation and/or other processing algorithms. As a non-limiting example, in a corpus of text, phrases making up a number "n" of compound words, such as nouns modified by other nouns, may be identified according to a statistically significant prevalence of n-grams containing such words in a particular order; such an n-gram may be categorized as an element of language such as a "word" to be tracked similarly to single words, generating a new category as a result of statistical analysis. Similarly, in a data entry including some textual data, a person's name may be identified by reference to a list, dictionary, or other compendium of terms, permitting ad-hoc categorization by machine-learning algorithms, and/or automated association of data in the data entry with descriptors or into a given format. The ability to categorize data entries automatedly may enable the same training data to be made applicable for two or more distinct machine-learning algorithms as described in further detail below. Training data used by computing device 104 and/or another device may correlate any input data as described in this disclosure to any output data as described in this disclosure. "User training data," as used in this disclosure, includes a plurality of data entries containing previous messages correlated to previous responses. Information pertaining to user training data may be stored within user database 132. - With continued reference to
FIG. 1, user-action learner may be implemented as any machine-learning process. A "machine-learning process," as used in this disclosure, is a process that automatically uses a body of data known as "training data" and/or a "training set" to generate an algorithm that will be performed by computing device 104 and/or any module to produce outputs given data provided as inputs; this is in contrast to a non-machine-learning software program, where the commands to be executed are determined in advance by a user and written in a programming language. A machine-learning process may be implemented, without limitation, as described in U.S. Nonprovisional application Ser. No. 16/502,835, filed on Jul. 3, 2019, and entitled "METHODS AND SYSTEMS FOR ACHIEVING VIBRANT CONSTITUTION BASED ON USER INPUTS," the entirety of which is incorporated herein by reference. - With continued reference to
FIG. 1, user-action learner may be generated using a generative machine-learning process. A generative machine-learning process is a process of determining the conditional probability of an observable X given a target y, P(X|Y=y). A generative machine-learning process may be used to generate random instances or outcomes, either of an observation and target (x, y), or of an observation x given a target value y. A generative machine-learning process may include generating one or more generative machine-learning models. A generative machine-learning model may include a deep generative model that combines a generative machine-learning model with a deep neural network. A generative machine-learning model may include one or more machine-learning models, including but not limited to a Gaussian mixture model, a hidden Markov model, a mixture model, a probabilistic context-free grammar model, a Bayesian network, a Naïve Bayes model, an autoregressive model, an averaged one-dependence estimator model, a latent Dirichlet allocation model, a Boltzmann machine, a restricted Boltzmann machine, a deep belief network, a variational autoencoder, a generative adversarial network, a flow-based generative model, an energy-based model, and the like. - With continued reference to
FIG. 1, computing device 104 identifies a response 140 as a function of user-action learner 144 and displays the response 140 within virtual message user interface 108. Computing device 104 displays response 140 utilizing any methodology as described herein. - With continued reference to
FIG. 1, computing device 104 generates a response 140 by determining the context of a user message 120. Computing device 104 generates a context classifier. A "classifier," as used in this disclosure, is a model, derived by computing device 104 from training data, for sorting inputs into categories or bins of data. A "context classifier," as used in this disclosure, is a classifier that uses a user message as an input and outputs a message context label. A "message context label," as used in this disclosure, is the identification of any purpose and/or circumstance that forms the reason for a user entering a user message 120 within virtual message user interface 108. For example, a message context label may identify the setting for a user message 120, such as an upcoming event that a user needs to purchase an outfit for. In yet another non-limiting example, a message context label may identify an idea that a user has, such as to book a travel vacation. Context classifier may be generated using one or more classification algorithms. A classification algorithm may include a binary classifier, which results in only two distinct classes or two possible outcomes. A classification algorithm may include a multi-class classifier, which results in more than two distinct classes. A classification algorithm may include, for example, a naïve Bayes classifier, a support vector machine, a k-nearest neighbor algorithm, a decision tree, a random forest, logistic regression, stochastic gradient descent, and the like. Computing device 104 identifies, as a function of generating context classifier, a message context label for a user message. Computing device 104 selects a response as a function of the message context label. Computing device 104 selects a response by locating a response that relates to the message context label. 
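Since naïve Bayes is among the classification algorithms named, a context classifier could be sketched along the following lines; the labels, toy training messages, and add-one smoothing are illustrative assumptions, not details from the disclosure:

```python
# Hypothetical naive-Bayes-style context classifier: estimate word
# frequencies per message context label from a small labeled set, then
# label a new user message by highest smoothed log-likelihood.
from collections import Counter, defaultdict
import math

training = [
    ("buy a dress for the party", "upcoming event"),
    ("need an outfit for the wedding", "upcoming event"),
    ("book flights to Rome", "travel idea"),
    ("find hotels for a vacation", "travel idea"),
]

counts = defaultdict(Counter)
for text, label in training:
    counts[label].update(text.lower().split())
vocab = {w for c in counts.values() for w in c}

def classify(message: str) -> str:
    """Return the context label with the highest log-likelihood."""
    def log_prob(label: str) -> float:
        c, total = counts[label], sum(counts[label].values())
        # Add-one (Laplace) smoothing over the shared vocabulary.
        return sum(math.log((c[w] + 1) / (total + len(vocab)))
                   for w in message.lower().split())
    return max(counts, key=log_prob)
```

This omits class priors and real feature engineering; it only shows how a message maps to one of several context labels.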
For instance and without limitation, a message context label that indicates a user message 120 as relating to a birthday party may be used by computing device 104 to select a response that relates to a birthday and/or coordination of an event. In yet another non-limiting example, a message context label that indicates a user message 120 as relating to a work meeting may be used by computing device 104 to select a response that relates to an office setting. In an embodiment, responses may be organized and/or categorized by context, as described below in more detail. - With continued reference to
FIG. 1, computing device 104 is configured to identify a third-party device 148 as a function of generating a response 140 to a user message 120. A "third-party device," as used in this disclosure, includes any device suitable for use as user client device 112 as described above in more detail. Third-party device 148 is operated by a third party 152, where a third party 152 includes any human being other than user 116. A third party 152 may include an entity such as a corporation, including a merchant, a retailer, a manufacturer, a business, and the like. For instance and without limitation, third-party device 148 may include a computer operated by a local coffee shop or a mobile device operated by a user's hairdresser. Computing device 104 identifies a third party 152 utilizing linguistic analyzer 136 to locate and/or identify a third party 152 mentioned within a user message 120. Information relating to third-party device identifiers and/or third parties may be contained within user database 132. For instance and without limitation, a user message 120 may specify "schedule a meeting with Brian at his earliest convenience." In such an instance, computing device 104 utilizes linguistic analyzer 136 to identify Brian as a third party 152 identified within the user message 120. Computing device 104 then initiates virtual message user interface 108 between computing device 104 and third-party device 148 operated by Brian. In such an instance, computing device 104 may generate and transmit, using virtual message user interface 108, a message to Brian asking him when he is available to schedule a meeting with the user. Computing device 104 initiates virtual message user interface 108 utilizing any network methodology as described herein. - With continued reference to
FIG. 1, computing device 104 is configured to complete an agenda action related to a user message 120. Completing an agenda action includes performing any step, method, and/or task to complete the agenda action. For example, an agenda action that contains a task such as "reschedule my haircut" may be completed by computing device 104 when computing device 104 has rescheduled the user's haircut and confirmed a new appointment time. Computing device 104 updates a user agenda list 124 as a function of completing an agenda action 128, to reflect the agenda action as being complete. Computing device 104 displays a response 140 as a function of completing an agenda action. A response 140 may be displayed within virtual message user interface 108 to inform a user that an agenda action 128 has been completed. This may be performed utilizing any methodology as described herein. - Referring now to
FIG. 2, an exemplary embodiment 200 of a user database 132 is illustrated. User database may be implemented as any data structure as described above in more detail in reference to FIG. 1. One or more tables contained within user database 132 may include user agenda list table 204; user agenda list table 204 may contain information relating to a user's agenda list 124. For instance and without limitation, user agenda list table 204 may contain a user's agenda list 124 containing a list of all tasks that a user needs to perform within the next six weeks. One or more tables contained within user database 132 may include agenda action table 208; agenda action table 208 may include one or more agenda actions 128. For instance and without limitation, agenda action table 208 may include an agenda action that includes a task to visit the dentist next Thursday. One or more tables contained within user database 132 may include previous message table 212; previous message table 212 includes any previous messages entered by a user within virtual message user interface. For instance and without limitation, previous message table 212 may contain a log of all previous messages entered by a user into virtual message user interface. One or more tables contained within user database 132 may include conversation profile table 216; conversation profile table 216 may include information relating to one or more conversation profiles. For instance and without limitation, conversation profile table 216 may contain a direct conversation profile that includes various conversations organized by subject matter, as described above in more detail in reference to FIG. 1. One or more tables contained within user database 132 may include message context table 220; message context table 220 may include information describing one or more message contexts. 
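The tables just described could be realized in many data structures; as one hypothetical sketch, an in-memory SQLite store with illustrative column names (the patent names the tables but prescribes no schema):

```python
# Hypothetical sketch of the user database tables of FIG. 2 in SQLite.
# Column names and types are assumptions for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE user_agenda_list (user_id INTEGER, agenda_action_id INTEGER);
    CREATE TABLE agenda_action (id INTEGER PRIMARY KEY, task TEXT, due TEXT);
    CREATE TABLE previous_message (id INTEGER PRIMARY KEY, body TEXT);
    CREATE TABLE conversation_profile (id INTEGER PRIMARY KEY, style TEXT);
    CREATE TABLE message_context (message_id INTEGER, context_label TEXT);
""")
# Example entry matching the dentist illustration above.
conn.execute("INSERT INTO agenda_action (task, due) VALUES (?, ?)",
             ("visit the dentist", "next Thursday"))
row = conn.execute("SELECT task, due FROM agenda_action").fetchone()
```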
For instance and without limitation, message context table 220 may include a message context label that labels a user message 120 as relating to an orthopedic medical appointment. - Referring now to
FIG. 3, an exemplary embodiment 300 of a conversation profile is illustrated. Computing device 104 receives a user message 120, entered by a user into virtual message user interface 108. Computing device 104 analyzes the user message 120 utilizing linguistic analyzer 136, as described above in more detail in reference to FIG. 1. Linguistic analyzer 136 identifies words, persons, and/or places, for example, contained within user message 120. Computing device 104 utilizes contents analyzed by linguistic analyzer 136 to select a conversation profile. A conversation profile includes any of the conversation profiles as described above in more detail in reference to FIG. 1. In an embodiment, computing device 104 may evaluate Conversation Profile A 304, Conversation Profile B 308, and/or Conversation Profile C 312. A conversation profile contains a compilation of one or more responses organized by style of communication. For instance and without limitation, Conversation Profile A 304 may relate to a direct style of communication. In such an instance, Conversation Profile A 304 may contain a first response 316, a second response 320, and a third response 324. First response 316 may contain a direct-style response relating to the weather, second response 320 may contain a direct-style response relating to ordering food to be delivered, and third response 324 may contain a direct-style response relating to a television purchase. Conversation Profile B 308 may relate to a graphic style of communication, where Conversation Profile B 308 may contain a first response 328, a second response 332, and a third response 336. Conversation Profile C 312 may contain a first response 340, a second response 344, and a third response 348. One or more responses 140 contained within a conversation profile may be stored within response database 352. Response database 352 may be implemented as any data structure suitable for use as user database 132, as described above in more detail in reference to FIG. 1. 
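As a minimal sketch of this organization, a conversation profile can be modeled as a mapping from communication style to its compiled responses; the response texts below are hypothetical placeholders mirroring Profiles A and B of FIG. 3:

```python
# Hypothetical sketch of conversation profiles as compilations of
# responses organized by style of communication (per FIG. 3).
conversation_profiles = {
    "direct": [                            # Conversation Profile A
        "It will rain at 3 pm.",           # weather response
        "Your food order has been placed.",
        "The television ships on Friday.",
    ],
    "graphic": ["🌧️", "🍕", "📺"],          # Conversation Profile B
}

def responses_for(style: str) -> list:
    """Look up the compiled responses for a communication style."""
    return conversation_profiles.get(style, [])
```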
One or more responses may be utilized to generate a response stored within a conversation profile. Responses contained within response database may be organized by communication style. - Referring now to
FIG. 4, an exemplary embodiment 400 of response database 352 is illustrated. Response database 352 may be implemented as any data structure suitable for use as user database 132, as described above in more detail in reference to FIG. 1. One or more tables contained within response database 352 may include question response table 404; question response table 404 may include one or more responses 140 that contain a question. One or more tables contained within response database 352 may include multimedia response table 408; multimedia response table 408 may include one or more responses 140 that contain multimedia, including for example text, audio, images, animations, video, and/or interactive content. One or more tables contained within response database 352 may include greeting response table 412; greeting response table 412 may include one or more responses 140 that contain a greeting. One or more tables contained within response database 352 may include emoji response table 416; emoji response table 416 may include one or more responses 140 that contain an emoji. One or more tables contained within response database 352 may include help response table 420; help response table 420 may include one or more responses 140 that contain a response to help a user utilize system 100. One or more tables contained within response database 352 may include non-textual response table 424; non-textual response table 424 may include one or more responses 140 that contain a non-textual response. - Referring now to
FIG. 5, an exemplary embodiment 500 of context classifier is illustrated. Computing device 104 receives a user message 120, entered by a user. A user message 120 includes any of the user messages as described above in more detail in reference to FIG. 1. Computing device 104 generates a context classifier 504, which evaluates the context 508 of a user message 120. A context 508 of a user message 120 includes the identification of any purpose and/or circumstance that forms the reason for a user entering a user message 120 within virtual message user interface 108. For instance and without limitation, a context 508 of a user message 120 may indicate that a user message 120 was generated because a user uploaded information pertaining to the user's exercise routine from the previous three months. Context classifier 504 utilizes a user message 120 as an input and outputs a message context label 512. A message context label 512 includes any message context label as described above in more detail in reference to FIG. 1. Computing device 104 utilizes a message context label 512 in combination with response selector 516 to select a response as a function of the message context label 512. Response selector 516 may aid computing device 104 in using a message context label 512 to select a response. Response selector 516 may evaluate a plurality of responses, such as response A 520, response B 524, and response X 528, to determine which response relates to message context label 512, and/or which response has a similar context 508 and most appropriately responds to user message 120. For instance and without limitation, message context label 512 may relate to an upcoming birthday party, because the user needs to purchase a present for the birthday girl. 
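One hypothetical way to sketch the response selector is to score each candidate by word overlap with the message context label and pick the best match; the candidate responses below are illustrative stand-ins for responses A, B, and X of FIG. 5:

```python
# Hypothetical sketch of response selector 516: rank candidate responses
# by word overlap with the message context label. Scoring rule is
# illustrative, not specified by the disclosure.
def select_response(context_label: str, candidates: list) -> str:
    """Return the candidate sharing the most words with the label."""
    label_words = set(context_label.lower().split())

    def score(resp: str) -> int:
        resp_words = set(resp.lower().replace(".", " ").split())
        return len(label_words & resp_words)

    return max(candidates, key=score)

candidates = [
    "😀",                                                 # like response A
    "Here are stores that sell birthday presents.",       # like response B
    "To restart the interface, hold the power button.",   # like response X
]
choice = select_response("birthday party presents", candidates)
```

Here the second candidate wins because it shares the most label words, matching the outcome described for response B.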
In such an instance, response selector 516 may evaluate response A 520, which contains a smiley face emoji; response B 524, which contains a list of stores that sell birthday presents; and response X 528, which contains instructions for how to restart virtual message user interface when it is frozen. In such an instance, response selector 516 may evaluate the responses and select response B 524, because it most closely relates to the context 508 of user's message 120. Computing device 104 stores one or more responses within response database 352. - Referring now to
FIG. 6, an exemplary embodiment 600 of a conversation displayed on user client device 112 is illustrated. In an embodiment, virtual message user interface 108 receives a first user message 604, containing a request for computing device 104 to "Cancel my meeting." Computing device 104 utilizes linguistic analyzer 136 to analyze a user message 120 and identifies a need for a user to provide more information. For example, using first user message 604, computing device 104 may be unable to identify an agenda action related to first user message 604, because there is no information specifying which meeting needs to be canceled and when the meeting is scheduled. Computing device 104 selects a conversation profile and initiates a conversation within virtual message user interface 108. In such an instance, computing device 104 follows up by asking the user for more information, contained within response 608, by specifying "Which one?" A conversation then ensues, where the user replies with message 612, "11:00 am," and computing device then completes the request to cancel the user's 11:00 am meeting by generating response 616, which states "11:00 am canceled!" The user then replies by stating in message 620 "Thank you!" In such an instance, after canceling the user's 11:00 am meeting and rescheduling the meeting, computing device 104 may update user agenda list to contain the newly generated meeting time as well as to reflect the proper cancelation of the user's previously scheduled 11:00 am meeting. - Referring now to
FIG. 7, an exemplary embodiment 700 of a method of generating a virtual assistant in a messaging user interface is illustrated. At step 705, computing device 104 initiates a virtual message user interface 108 between a user client device 112 and computing device 104. Initiating virtual message user interface 108 may be performed utilizing any methodology as described above in more detail in reference to FIG. 1. Virtual message user interface 108 may be implemented between user client device 112 and computing device 104 utilizing any of the network methodologies as described above. In an embodiment, virtual message user interface 108 may allow for communications to be transmitted and received between user client device 112 and computing device 104. Virtual message user interface 108 may contain a graphical user interface to allow for the transmission and/or display of both textual and/or non-textual outputs. A user client device 112 includes any of the user client devices 112 as described above in more detail in reference to FIG. 1. User client device 112 is operated by a user 116. - With continued reference to
FIG. 7, at step 710, computing device 104 receives a user message 120, entered by a user 116 into virtual message user interface 108. A user message 120 includes any of the user messages 120 as described above in more detail in reference to FIG. 1. A user message 120 contains any communication, including any verbal, written, and/or recorded communication. A user message 120 includes a calendar entry, including any of the calendar entries as described above in more detail in reference to FIG. 1. For instance and without limitation, a user message 120 may include a calendar entry containing a user's calendar listing all upcoming appointments that the user has scheduled over the course of the next twelve months. A user message 120 includes an elemental entry, including any of the elemental entries as described above in more detail in reference to FIG. 1. For instance and without limitation, a user message 120 may contain a sensor reading containing a user's average heart rate while engaging in an exercise routine over the course of the past seven days. In yet another non-limiting example, a user message 120 may contain a complaint that a user has pain and discomfort in the user's lower back. Information pertaining to a user message 120 may be stored within user database 132, as described above in more detail in reference to FIG. 1. - With continued reference to
FIG. 7, at step 715, computing device 104 retrieves data relating to a user agenda list, including a plurality of agenda actions. A user agenda list 124 includes any of the user agenda lists as described above in more detail in reference to FIG. 1. A user agenda list 124 includes a plurality of agenda actions 128. An agenda action 128 includes any of the agenda actions as described above in more detail in reference to FIG. 1. An agenda action 128 includes a task that a user needs to complete. For instance and without limitation, an agenda action 128 may include a task such as picking up dry cleaning or mowing the lawn. An agenda action 128 may include any work- and/or personal-related tasks, including for example a work-related task to purchase new anti-virus software or a work-related task to prepare a presentation to attract new clients. An agenda action 128 may include a personal task, such as purchasing more coffee at the grocery store or scheduling a dental appointment within the next few weeks. Information pertaining to one or more agenda actions may be stored within user database 132. Computing device 104 prioritizes a plurality of agenda actions 128 contained within a user agenda list 124 to update user agenda list 124 by priority of which agenda actions 128 are most important and which agenda actions need to be completed first. Computing device 104 retrieves a previous message generated by a user. Previous messages may be stored within user database 132 as described above in more detail in reference to FIG. 1. Computing device 104 prioritizes a plurality of agenda actions 128 contained within user agenda list 124 as a function of a retrieved message. For instance and without limitation, a previous message may indicate that a user routinely obtains a haircut every six or seven weeks, depending on the user's availability, while the user also routinely has a personal training appointment every Wednesday, and the user has never missed an appointment. 
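One hypothetical sketch of such prioritization scores each agenda action by how often its words appear in previous user messages; the scoring heuristic, actions, and history below are illustrative assumptions:

```python
# Hypothetical sketch of prioritizing agenda actions as a function of
# previous user messages: actions mentioned more in prior messages are
# treated as higher priority. Heuristic is illustrative only.
def prioritize(actions: list, previous_messages: list) -> list:
    """Sort agenda actions, most frequently referenced first."""
    history = " ".join(previous_messages).lower()

    def score(action: str) -> int:
        # Count how many of the action's words occur in the history.
        return sum(word in history for word in action.lower().split())

    return sorted(actions, key=score, reverse=True)

actions = ["haircut", "personal training appointment"]
history = ["I never miss my personal training every Wednesday",
           "haircut whenever I'm free"]
ordered = prioritize(actions, history)
```

With this toy history, the personal training appointment outranks the haircut, matching the prioritization described in the example.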
In such an instance, computing device 104 utilizes the information contained within a previous user message to prioritize agenda actions 128 contained within a user agenda list 124. In such an instance, computing device 104 prioritizes an agenda action 128 of the user's personal training appointment as having a higher priority and being of more importance than the user's haircut, about which the user is much more flexible. - With continued reference to
FIG. 7, at step 720, computing device 104 analyzes a user message 120 to identify an agenda action related to the user message 120. Computing device 104 analyzes a user message 120 utilizing linguistic analyzer 136, as described above in more detail in reference to FIG. 1. Linguistic analyzer 136 may be implemented as any data structure as described above in more detail in reference to FIG. 1. Computing device 104 identifies, using linguistic analyzer 136, a description of an agenda action contained within user message 120. For instance and without limitation, linguistic analyzer 136 may utilize one or more language processing techniques to identify one or more words, phrases, persons, and/or places contained within a user message. Language processing techniques employed by linguistic analyzer 136 include any of the language processing techniques as described above in more detail in reference to FIG. 1. Computing device 104 generates a query containing a description of an agenda action 128 to locate the agenda action 128 contained within a user agenda list 124. A query includes any of the queries as described above in more detail in reference to FIG. 1. - With continued reference to
FIG. 7, at step 725, computing device 104 identifies a need for a user to provide more information. Computing device 104 identifies a need for a user to provide more information by utilizing linguistic analyzer 136 to analyze a user message 120 and determine a need for a user 116 to provide more information. This may occur, for example, when a user message 120 is missing details or is not complete, when an agenda action cannot be located that relates to a user message 120, and the like. Computing device 104 selects a conversation profile as a function of analyzing a user message 120 and initiates a conversation within virtual message user interface 108. This may be performed utilizing any of the methodologies as described above in more detail in reference to FIGS. 1-6. A conversation profile includes any of the conversation profiles as described above in more detail. A conversation profile may contain one or more responses 140 that may be utilized to initiate a conversation within virtual message user interface 108. - With continued reference to
FIG. 7, at step 730, computing device 104 generates a response 140 to a user message as a function of analyzing the user message. A response 140 includes any of the responses as described above in more detail in reference to FIG. 1. A response may include a textual and/or non-textual output. At step 735, computing device 104 generates a response by generating a user-action learner. A user-action learner includes any of the user-action learners as described above in more detail in reference to FIG. 1. A user-action learner uses a previous message and a user message as an input and outputs a response. A user-action learner may be implemented as any machine-learning process as described above in more detail in reference to FIG. 1. A user-action learner may be generated as a generative machine-learning process as described above in more detail in reference to FIG. 1. At step 740, computing device 104 identifies a response as a function of generating the user-action learner. At step 745, computing device 104 displays the response within virtual message user interface 108. - With continued reference to
FIG. 7, computing device 104 selects a response by generating a context classifier. A context classifier may be generated utilizing any of the methodologies as described above in more detail in reference to FIG. 1. A context classifier uses a user message 120 as an input and outputs a message context label. A message context label includes any of the message context labels as described above in more detail in reference to FIG. 1. Computing device 104 identifies, as a function of generating the context classifier, a message context label for a user message 120, and selects a response as a function of the message context label. This may be performed utilizing any of the methods as described above in more detail in reference to FIG. 1. For instance and without limitation, a message context label indicating that a user message 120 is about golf equipment may be utilized to select a response that relates to golf equipment. In an embodiment, responses stored within response database 352 may be organized and stored based on the context to which each response relates. Computing device 104 evaluates responses organized by context within response database 352 to select a response that matches and/or relates to a message context label. - With continued reference to
FIG. 7, computing device 104 identifies a third-party device 148 as a function of generating a response 140 to a user message 120. In an embodiment, a third-party device may be identified by linguistic analyzer 136 when analyzing a user message 120. For example, a user message 120 may include a message to schedule a massage for a user on a specified day with the user's masseuse, Arnold. In such an instance, linguistic analyzer 136 analyzes user message 120 and identifies Arnold as a third party 148, and initiates virtual message user interface 108 with third-party device 148, operated by the spa where Arnold works, to send a message to the third-party device within virtual message user interface 108 and schedule the massage. Initiating virtual message user interface 108 between computing device 104 and third-party device 148 allows computing device 104 to send messages to third-party device 148 and complete agenda actions contained within a user message. Computing device 104 completes an agenda action 128 related to a user message 120 and updates a user agenda list 124 as a function of completing the agenda action 128. Computing device 104 displays a response 140 as a function of completing an agenda action 128. A response 140 may be displayed within virtual message user interface 108 and inform a user about the completion of an agenda action 128. - It is to be noted that any one or more of the aspects and embodiments described herein may be conveniently implemented using one or more machines (e.g., one or more computing devices that are utilized as a user computing device for an electronic document, one or more server devices, such as a document server, etc.) programmed according to the teachings of the present specification, as will be apparent to those of ordinary skill in the computer art. Appropriate software coding can readily be prepared by skilled programmers based on the teachings of the present disclosure, as will be apparent to those of ordinary skill in the software art.
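As a concrete, non-limiting illustration of the third-party flow described above (spot a third party named in the message, contact that party's device, then mark the agenda action complete), the following sketch uses a hypothetical directory and return shape; none of these names come from the disclosure:

```python
# Hypothetical directory mapping known third parties to their devices.
THIRD_PARTY_DIRECTORY = {"arnold": "spa-device-148"}

def handle_message(user_message: str, agenda_list: list) -> dict:
    """Identify a third-party device from the message, complete the related
    agenda action, and update the agenda list (illustrative sketch only)."""
    words = user_message.lower().split()
    device = next((THIRD_PARTY_DIRECTORY[w] for w in words if w in THIRD_PARTY_DIRECTORY), None)
    completed = "schedule massage" if device else None
    if completed and completed in agenda_list:
        agenda_list.remove(completed)  # update the user agenda list
    return {"third_party_device": device, "completed_action": completed}

agenda = ["schedule massage", "mow the lawn"]
result = handle_message("Schedule a massage with arnold on friday", agenda)
```

A real implementation would derive both the third party and the matching agenda action from the linguistic analyzer rather than a fixed lookup table; the sketch only shows how identifying the device and completing the action fit together.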
Aspects and implementations discussed above employing software and/or software modules may also include appropriate hardware for assisting in the implementation of the machine executable instructions of the software and/or software module.
- Such software may be a computer program product that employs a machine-readable storage medium. A machine-readable storage medium may be any medium that is capable of storing and/or encoding a sequence of instructions for execution by a machine (e.g., a computing device) and that causes the machine to perform any one of the methodologies and/or embodiments described herein. Examples of a machine-readable storage medium include, but are not limited to, a magnetic disk, an optical disc (e.g., CD, CD-R, DVD, DVD-R, etc.), a magneto-optical disk, a read-only memory “ROM” device, a random access memory “RAM” device, a magnetic card, an optical card, a solid-state memory device, an EPROM, an EEPROM, and any combinations thereof. A machine-readable medium, as used herein, is intended to include a single medium as well as a collection of physically separate media, such as, for example, a collection of compact discs or one or more hard disk drives in combination with a computer memory. As used herein, a machine-readable storage medium does not include transitory forms of signal transmission.
- Such software may also include information (e.g., data) carried as a data signal on a data carrier, such as a carrier wave. For example, machine-executable information may be included as a data-carrying signal embodied in a data carrier in which the signal encodes a sequence of instructions, or a portion thereof, for execution by a machine (e.g., a computing device) and any related information (e.g., data structures and data) that causes the machine to perform any one of the methodologies and/or embodiments described herein.
- Examples of a computing device include, but are not limited to, an electronic book reading device, a computer workstation, a terminal computer, a server computer, a handheld device (e.g., a tablet computer, a smartphone, etc.), a web appliance, a network router, a network switch, a network bridge, any machine capable of executing a sequence of instructions that specify an action to be taken by that machine, and any combinations thereof. In one example, a computing device may include and/or be included in a kiosk.
-
FIG. 8 shows a diagrammatic representation of one embodiment of a computing device in the exemplary form of a computer system 800 within which a set of instructions for causing a control system to perform any one or more of the aspects and/or methodologies of the present disclosure may be executed. It is also contemplated that multiple computing devices may be utilized to implement a specially configured set of instructions for causing one or more of the devices to perform any one or more of the aspects and/or methodologies of the present disclosure. Computer system 800 includes a processor 804 and a memory 808 that communicate with each other, and with other components, via a bus 812. Bus 812 may include any of several types of bus structures including, but not limited to, a memory bus, a memory controller, a peripheral bus, a local bus, and any combinations thereof, using any of a variety of bus architectures. -
Memory 808 may include various components (e.g., machine-readable media) including, but not limited to, a random access memory component, a read only component, and any combinations thereof. In one example, a basic input/output system 816 (BIOS), including basic routines that help to transfer information between elements within computer system 800, such as during start-up, may be stored in memory 808. Memory 808 may also include (e.g., stored on one or more machine-readable media) instructions (e.g., software) 820 embodying any one or more of the aspects and/or methodologies of the present disclosure. In another example, memory 808 may further include any number of program modules including, but not limited to, an operating system, one or more application programs, other program modules, program data, and any combinations thereof. -
Computer system 800 may also include a storage device 824. Examples of a storage device (e.g., storage device 824) include, but are not limited to, a hard disk drive, a magnetic disk drive, an optical disc drive in combination with an optical medium, a solid-state memory device, and any combinations thereof. Storage device 824 may be connected to bus 812 by an appropriate interface (not shown). Example interfaces include, but are not limited to, SCSI, advanced technology attachment (ATA), serial ATA, universal serial bus (USB), IEEE 1394 (FIREWIRE), and any combinations thereof. In one example, storage device 824 (or one or more components thereof) may be removably interfaced with computer system 800 (e.g., via an external port connector (not shown)). Particularly, storage device 824 and an associated machine-readable medium 828 may provide nonvolatile and/or volatile storage of machine-readable instructions, data structures, program modules, and/or other data for computer system 800. In one example, software 820 may reside, completely or partially, within machine-readable medium 828. In another example, software 820 may reside, completely or partially, within processor 804. -
Computer system 800 may also include an input device 832. In one example, a user of computer system 800 may enter commands and/or other information into computer system 800 via input device 832. Examples of an input device 832 include, but are not limited to, an alpha-numeric input device (e.g., a keyboard), a pointing device, a joystick, a gamepad, an audio input device (e.g., a microphone, a voice response system, etc.), a cursor control device (e.g., a mouse), a touchpad, an optical scanner, a video capture device (e.g., a still camera, a video camera), a touchscreen, and any combinations thereof. Input device 832 may be interfaced to bus 812 via any of a variety of interfaces (not shown) including, but not limited to, a serial interface, a parallel interface, a game port, a USB interface, a FIREWIRE interface, a direct interface to bus 812, and any combinations thereof. Input device 832 may include a touch screen interface that may be a part of or separate from display 836, discussed further below. Input device 832 may be utilized as a user selection device for selecting one or more graphical representations in a graphical interface as described above. - A user may also input commands and/or other information to
computer system 800 via storage device 824 (e.g., a removable disk drive, a flash drive, etc.) and/or network interface device 840. A network interface device, such as network interface device 840, may be utilized for connecting computer system 800 to one or more of a variety of networks, such as network 844, and one or more remote devices 848 connected thereto. Examples of a network interface device include, but are not limited to, a network interface card (e.g., a mobile network interface card, a LAN card), a modem, and any combination thereof. Examples of a network include, but are not limited to, a wide area network (e.g., the Internet, an enterprise network), a local area network (e.g., a network associated with an office, a building, a campus or other relatively small geographic space), a telephone network, a data network associated with a telephone/voice provider (e.g., a mobile communications provider data and/or voice network), a direct connection between two computing devices, and any combinations thereof. A network, such as network 844, may employ a wired and/or a wireless mode of communication. In general, any network topology may be used. Information (e.g., data, software 820, etc.) may be communicated to and/or from computer system 800 via network interface device 840. -
Computer system 800 may further include a video display adapter 852 for communicating a displayable image to a display device, such as display device 836. Examples of a display device include, but are not limited to, a liquid crystal display (LCD), a cathode ray tube (CRT), a plasma display, a light emitting diode (LED) display, and any combinations thereof. Display adapter 852 and display device 836 may be utilized in combination with processor 804 to provide graphical representations of aspects of the present disclosure. In addition to a display device, computer system 800 may include one or more other peripheral output devices including, but not limited to, an audio speaker, a printer, and any combinations thereof. Such peripheral output devices may be connected to bus 812 via a peripheral interface 856. Examples of a peripheral interface include, but are not limited to, a serial port, a USB connection, a FIREWIRE connection, a parallel connection, and any combinations thereof. - The foregoing has been a detailed description of illustrative embodiments of the invention. Various modifications and additions can be made without departing from the spirit and scope of this invention. Features of each of the various embodiments described above may be combined with features of other described embodiments as appropriate in order to provide a multiplicity of feature combinations in associated new embodiments. Furthermore, while the foregoing describes a number of separate embodiments, what has been described herein is merely illustrative of the application of the principles of the present invention. Additionally, although particular methods herein may be illustrated and/or described as being performed in a specific order, the ordering is highly variable within ordinary skill to achieve methods, systems, and software according to the present disclosure.
Accordingly, this description is meant to be taken only by way of example, and not to otherwise limit the scope of this invention.
- Exemplary embodiments have been disclosed above and illustrated in the accompanying drawings. It will be understood by those skilled in the art that various changes, omissions, and additions may be made to that which is specifically disclosed herein without departing from the spirit and scope of the present invention.
Claims (21)
1-20. (canceled)
21. A system for generating a virtual assistant in a messaging user interface, the system comprising a computing device designed and configured to:
receive, from a sensor, a change in the biological state of a user;
prioritize an agenda action of a plurality of agenda actions contained within a user agenda list as a function of the change in the biological state of the user;
initiate a virtual message user interface between a user client device and the computing device;
select a conversation profile for the virtual assistant, wherein the conversation profile comprises a behavior that the user uses to communicate;
generate a communication to the user comprising the prioritized agenda action, wherein the communication is further generated as a function of the selected conversation profile;
identify a third-party device as a function of the communication; and
automatedly share information with the third-party device as a function of the communication.
22. The system of claim 21, wherein the conversation profile comprises an affectionate communication style.
23. The system of claim 21, wherein the conversation profile comprises a direct communication style.
24. The system of claim 21, wherein the conversation profile relates to a graphic style of communication.
25. The system of claim 21, wherein the computing device identifies a third-party by utilizing a linguistic analyzer.
26. The system of claim 21, wherein the third-party device comprises a computer operated by a health-related service.
27. The system of claim 21, wherein the computing device is configured to identify the plurality of agenda actions using a language processing module.
28. The system of claim 21, wherein the sensor is configured to detect a hematological parameter.
29. The system of claim 21, wherein the agenda action is further prioritized as a function of a previous user message.
30. The system of claim 21, wherein the communication to the user is generated by generating a user-action learner.
31. A method of generating a virtual assistant in a messaging user interface, the method comprising:
receiving, from a sensor, a change in the biological state of a user;
prioritizing, by the computing device, an agenda action of a plurality of agenda actions contained within a user agenda list as a function of the change in the biological state of the user;
initiating, by the computing device, a virtual message user interface between a user client device and the computing device;
selecting, by the computing device, a conversation profile for the virtual assistant, wherein the conversation profile comprises a behavior that the user uses to communicate;
generating, by the computing device, a communication to the user comprising the prioritized agenda action, wherein the communication is further generated as a function of the selected conversation profile;
identifying a third-party device as a function of the communication; and
automatedly sharing information with the third-party device as a function of the communication.
32. The method of claim 31, wherein the conversation profile comprises an affectionate communication style.
33. The method of claim 31, wherein the conversation profile comprises a direct communication style.
34. The method of claim 31, wherein the conversation profile relates to a graphic style of communication.
35. The method of claim 31, wherein the computing device identifies a third-party by using a linguistic analyzer.
36. The method of claim 31, wherein the third-party device comprises a computer operated by a health-related service.
37. The method of claim 31, wherein identifying the plurality of agenda actions further comprises identifying the plurality of agenda actions using a language processing module.
38. The method of claim 31, wherein the sensor is configured to detect a hematological parameter.
39. The method of claim 31, wherein the agenda action is further prioritized as a function of a previous user message.
40. The method of claim 31, wherein the communication to the user is generated by generating a user-action learner.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/138,522 US20230262016A1 (en) | 2020-06-25 | 2023-04-24 | Methods and systems for generating a virtual assistant in a messaging user interface |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/912,040 US11665118B2 (en) | 2020-06-25 | 2020-06-25 | Methods and systems for generating a virtual assistant in a messaging user interface |
US18/138,522 US20230262016A1 (en) | 2020-06-25 | 2023-04-24 | Methods and systems for generating a virtual assistant in a messaging user interface |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/912,040 Continuation US11665118B2 (en) | 2020-06-25 | 2020-06-25 | Methods and systems for generating a virtual assistant in a messaging user interface |
Publications (1)
Publication Number | Publication Date |
---|---|
US20230262016A1 true US20230262016A1 (en) | 2023-08-17 |
Family
ID=79030659
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/912,040 Active 2040-11-04 US11665118B2 (en) | 2020-06-25 | 2020-06-25 | Methods and systems for generating a virtual assistant in a messaging user interface |
US18/138,522 Pending US20230262016A1 (en) | 2020-06-25 | 2023-04-24 | Methods and systems for generating a virtual assistant in a messaging user interface |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/912,040 Active 2040-11-04 US11665118B2 (en) | 2020-06-25 | 2020-06-25 | Methods and systems for generating a virtual assistant in a messaging user interface |
Country Status (1)
Country | Link |
---|---|
US (2) | US11665118B2 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210027222A1 (en) * | 2019-07-23 | 2021-01-28 | WorkStarr, Inc. | Methods and systems for processing electronic communications |
KR20220046964A (en) * | 2020-10-08 | 2022-04-15 | 삼성전자주식회사 | Electronic apparatus for responding to question using multi chat-bot and control method thereof |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2218019A4 (en) | 2007-11-02 | 2012-04-18 | Hunch Inc | Interactive machine learning advice facility |
US9201865B2 (en) | 2013-03-15 | 2015-12-01 | Bao Tran | Automated assistance for user request that determines semantics by domain, task, and parameter |
JP2018503208A (en) | 2014-12-23 | 2018-02-01 | エジェンタ, インコーポレイテッド | Intelligent personal agent platform and system and method for using the same |
US10318096B2 (en) | 2016-09-16 | 2019-06-11 | Microsoft Technology Licensing, Llc | Intelligent productivity monitoring with a digital assistant |
US20180121432A1 (en) | 2016-11-02 | 2018-05-03 | Microsoft Technology Licensing, Llc | Digital assistant integration with music services |
US10524092B2 (en) | 2017-01-12 | 2019-12-31 | Microsoft Technology Licensing, Llc | Task automation using location-awareness of multiple devices |
US10127227B1 (en) * | 2017-05-15 | 2018-11-13 | Google Llc | Providing access to user-controlled resources by automated assistants |
WO2019104411A1 (en) | 2017-11-28 | 2019-06-06 | Macadamian Technologies Inc. | System and method for voice-enabled disease management |
US10810322B2 (en) | 2017-12-05 | 2020-10-20 | Microsoft Technology Licensing, Llc | Sharing user information with and between bots |
US11076039B2 (en) | 2018-06-03 | 2021-07-27 | Apple Inc. | Accelerated task performance |
US11631118B2 (en) * | 2018-12-21 | 2023-04-18 | Soham Inc | Distributed demand generation platform |
US11335335B2 (en) * | 2020-02-03 | 2022-05-17 | International Business Machines Corporation | Disambiguation of generic commands for controlling objects |
- 2020-06-25: US application 16/912,040 filed; granted as US 11665118 B2 (status: Active)
- 2023-04-24: US application 18/138,522 filed; published as US 20230262016 A1 (status: Pending)
Also Published As
Publication number | Publication date |
---|---|
US11665118B2 (en) | 2023-05-30 |
US20210409363A1 (en) | 2021-12-30 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |