US20210072952A1 - Systems and methods for operating a mobile application using a conversation interface - Google Patents
- Publication number
- US20210072952A1 (application No. US16/567,870)
- Authority
- US
- United States
- Prior art keywords
- user
- mobile application
- chat
- application
- interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
- G06F9/453—Help systems
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/08—Speech classification or search
- G10L15/18—Speech classification or search using natural language modelling
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/04—Real-time or near real-time messaging, e.g. instant messaging [IM]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1636—Sensing arrangement for detection of a tap gesture on the housing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
- G10L2015/223—Execution procedure of a spoken command
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/02—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail using automatic reactions or user delegation, e.g. automatic replies or chatbot-generated messages
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/12—Messaging; Mailboxes; Announcements
Definitions
- the present disclosure is generally related to user interfaces. More particularly, the present disclosure is directed to systems and methods for controlling the operation of a mobile application using a conversation interface.
- a user interface is a system by which users interact with a computing device, such as a computer or a mobile electronic device.
- a user interface allows users to input information to manipulate the computing device.
- a user interface also allows the computing device to output information as to the effects of the manipulation.
- a graphical user interface is a type of user interface that allows users to interact with computing devices with images rather than text commands. That is, a GUI represents the information and actions available to a user through graphical icons and visual indicators such as secondary notation, as opposed to text-based interfaces, typed command labels, or text navigation. The actions are usually performed through direct manipulation of the graphical elements.
- An electronic form is a type of GUI view that is specifically designed to allow a user to enter data in a structured manner for processing by a computing device.
- An electronic form is an electronic version of a physical form, such as a paper document with blank spaces for insertion of required or requested information.
- An electronic form provides an input template comprising various combinations of checkboxes, radio buttons, form fields, and other GUI elements designed to query and display data.
- while GUI interfaces are intuitive and provide a convenient interface, they may present some challenges for some users, especially unsophisticated or elderly operators. An inexperienced user may often have difficulty locating the correct icon or form within the GUI of an application when attempting to invoke desired functionality. Accordingly, a user may be forced to underutilize the capabilities of an application or, worse, end up with an unsatisfactory result. There is an ongoing need for improved systems and methods that allow users to interact with an application that operates with a GUI interface.
- various features and functionality can be provided to enable or otherwise facilitate control and operation of a selected mobile application comprising a native GUI via an exchange of natural language commands in a conversation interface.
- the system for controlling the operation of a selected mobile application may include obtaining a command for initiating a conversation interface.
- the command may be originating from a computing device operated by a user.
- the conversation interface may be displayed adjacent to a mobile application, the mobile application comprising a graphical user interface (GUI).
- the conversation interface may be configured to receive user input comprising one or more user commands. In some embodiments, the conversation interface may display assistant user input comprising one or more responses generated by an assistant user based on the user input.
- a first user input comprising a first text command may be obtained.
- the first user input may be entered by the user via the client computing device and displayed in the conversation interface.
- a first response generated by the assistant user in response to the first user input may be obtained.
- the first response may be displayed in the conversation interface.
- the mobile application may be updated based on the first user input received from the user.
- updating the mobile application may include outputting an output command associated with one or more actions that may occur in the mobile application.
- the actions that occur within the mobile application may include at least one of a travel reservation, a dining reservation, and a purchase transaction.
- updating of the mobile application may include updating the GUI of the mobile application.
- the output command may comprise output data associated with the updating of the mobile application based on the user input.
- the output command may comprise output data associated with the first user input.
- a modified user input comprising a modified text command may be obtained.
- the first user input entered via the client computing device and displayed in the conversation interface may be modified by the user.
- the mobile application may be updated based on the modified user input.
- the output command may comprise output data associated with the updating of the mobile application based on the modified user input.
- a graphical representation of the output command data associated with the first and modified user input may be generated and displayed in the conversation interface.
- the command for initiating the conversation interface may be obtained during an active session of the mobile application operated by the user.
- account information associated with the mobile application operated by the user may be obtained.
- the account information comprising historic user data indicating commands previously generated by the user and received by the mobile application may be obtained.
- a geographic location associated with the computing device operated by the user and indicating a real world location of the user may be obtained.
- a second response generated by the assistant user may be obtained.
- the second response may be based on the historic user data and the real world location of the user.
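The command-and-response flow summarized above (initiating the conversation interface, obtaining user input, displaying an assistant response, and updating the mobile application) might be sketched as follows. This is a minimal illustrative sketch, not the disclosed implementation; the class, method, and field names (`ConversationSession`, `handle_user_input`, `app_state`) are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class ConversationSession:
    """Illustrative sketch of a conversation interface driving a mobile app."""
    transcript: list = field(default_factory=list)  # messages shown in the chat interface
    app_state: dict = field(default_factory=dict)   # state of the adjacent mobile application

    def handle_user_input(self, text_command: str) -> str:
        # Display the user's text command in the conversation interface.
        self.transcript.append(("user", text_command))
        # The assistant (human expert or bot) generates a response to the command.
        response = f"Working on: {text_command}"
        self.transcript.append(("assistant", response))
        # Update the mobile application based on the user input
        # (e.g., a travel reservation, dining reservation, or purchase).
        self.app_state["last_action"] = text_command
        return response

session = ConversationSession()
reply = session.handle_user_input("book a flight to Boston")
print(reply)            # assistant response shown in the conversation interface
print(session.app_state)  # mobile application updated from the chat command
```

A modified user input would follow the same path, with the application re-updated and a new graphical representation of the output displayed in the conversation interface.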
- FIG. 1 illustrates example systems and a network environment, according to an implementation of the disclosure.
- FIG. 2 illustrates an example chat interface server of the example system of FIG. 1 , according to an implementation of the disclosure.
- FIG. 3 illustrates an example process for initiating a client chat application, according to an implementation of the disclosure.
- FIGS. 4A-4D illustrate an example chat interface used to operate a mobile application, according to an implementation of the disclosure.
- FIGS. 5A-5E illustrate an example chat interface used to operate a mobile application, according to an implementation of the disclosure.
- FIG. 6 illustrates an example computing system that may be used in implementing various features of embodiments of the disclosed technology.
- Described herein are systems and methods for controlling the operation of a selected mobile application with a native GUI via an exchange of textual data between users and experts in a conversation interface.
- the details of some example embodiments of the systems and methods of the present disclosure are set forth in the description below.
- Other features, objects, and advantages of the disclosure will be apparent to one of skill in the art upon examination of the following description, drawings, examples, and claims. It is intended that all such additional systems, methods, features, and advantages be included within this description, be within the scope of the present disclosure, and be protected by the accompanying claims.
- users may experience challenges when interacting with a mobile application with a GUI.
- users that use mobile applications do so with a particular purpose and often to solve a particular problem (e.g., reserve airline tickets or purchase an item).
- users are forced to do so by interacting with an application via an interface, more specifically a GUI.
- GUI interfaces are essentially artificial creations invented to enable interactions between a user and a device. Accordingly, users have to adapt to interfaces, i.e., learn the rules on how to operate them. Because of these added cognitive demands associated with using a GUI, users focus less on solving the problem and more on learning the GUI.
- a user can interact with a mobile application by invoking a conversation or chat interface of a chat application rather than the native GUI of the mobile application.
- a user can interact with a particular mobile application using text commands in a natural language without having to learn new skills to interact with the GUI.
- because the chat interface is invoked contemporaneously with and is displayed next to the application, the user can control existing mobile applications, including their underlying functionalities, and utilize data associated with those applications without the need to reenter or otherwise provide information necessary to complete their task.
- displaying the chat interface next to the mobile application GUI allows the user to visually observe the results of their interactions as if they were interacting with the application via the GUI rather than the chat interface. In other words, seeing the outcome of their interactions allows users to preserve the positive aspects of a GUI (e.g., visual representation of results) while improving their ability to effectively interact with the mobile application.
- FIG. 1 illustrates one such example environment 100 .
- FIG. 1 illustrates an example environment 100 for providing a chat interface for interacting with mobile applications, as described herein.
- environment 100 may include a chat interface server 120 , one or more expert servers 130 , a mobile application server 140 , one or more client computing devices 104 , and a network 103 .
- a user 150 may be associated with client computing device 104 as described in detail below.
- client chat application 127 (i.e., the chat application running on client computing device 104 and provided by distributed chat application 126 ) may comprise a chat interface (e.g., as illustrated in FIG. 2 ) and may be configured to control and/or operate a client mobile application 148 running on client computing device 104 ; client mobile application 148 may be provided by a distributed mobile application 146 running on mobile application server 140 .
- client chat application 127 may be initiated after client mobile application 148 has been initiated.
- the various components of FIG. 1 may be configured to initiate client chat application 127 upon receiving user input associated with initiating client chat application 127 as will be described in detail below.
- user 150 may first initiate client mobile application 148 via one or more user inputs associated with initiating client mobile application 148 .
- user 150 may provide additional user input configured to initiate client chat application 127 to run alongside or next to client mobile application 148 on client computing device 104 .
- the various components of FIG. 1 may be configured to automatically initiate client chat application 127 upon initiation of client mobile application 148 , i.e., without receiving additional user input.
- client chat application 127 may be initiated upon receiving user input associated with initiating the chat application as will be described in detail below.
- user 150 may provide user input (e.g., knocking or tapping) within a GUI associated with client computing device 104 .
- the various components of FIG. 1 may be configured to initiate client mobile application 148 by user input within the chat interface of client chat application 127 .
- user 150 may provide user input within the chat interface to initiate client mobile application 148 .
- chat interface server 120 may include a processor, a memory, and network communication capabilities.
- chat interface server 120 may be a hardware server.
- chat interface server 120 may be provided in a virtualized environment, e.g., chat interface server 120 may be a virtual machine that is executed on a hardware server that may include one or more other virtual machines.
- Chat interface server 120 may be communicatively coupled to a network 103 .
- chat interface server 120 may transmit and receive information to and from one or more of client computing devices 104 , mobile application server 140 , one or more expert servers 130 , and/or other servers via network 103 .
- chat interface server 120 may include chat application 126 , as alluded to above.
- Chat application 126 may be a distributed application implemented on one or more client computing devices 104 as client chat application 127 , as described herein.
- chat application 126 included in chat interface server 120 may provide client functionality to enable user 150 to operate client mobile application 148 running on client computing device 104 .
- the client chat application 127 may include a chat interface (not illustrated) which allows user 150 to use natural language commands to operate client mobile application 148 .
- the chat interface may allow user 150 to operate client mobile application 148 by exchanging messages with one or more chat assistants (e.g., human users or automated software agents or bots).
- these chat assistants may help user 150 to operate the client mobile application 148 by eliciting commands from user 150 intended for client mobile application 148 , generating responses, and effectuating client mobile application 148 to generate results associated with the commands received from user 150 .
- user 150 can operate client mobile application 148 without having to learn an interface associated with mobile application 148 , resulting in a more efficient and streamlined user experience.
- distributed chat application 126 may be implemented using a combination of hardware and software.
- chat application 126 may be a server application, a server module of a client-server application, or a distributed application (e.g., with a corresponding client chat application 127 running on one or more client computing devices 104 ).
- chat interface server 120 may also include a database 122 .
- database 122 may store communications or messages exchanged via the chat interface, user data associated with user 150 , and/or other information.
- chat interface server 120 may comprise computer program components operable by the processor to enable exchange of messages between user 150 and one or more human users or bots.
- chat interface server 120 may include one or more human expert users or agents assisting user 150 in operating client mobile application 148 provided on client computing device 104 .
- human chat assistants may be selected from a group of specially trained assistants or experts.
- expert assistants may be skilled in providing assistance to users operating a particular mobile application (e.g., Uber, Expedia, and so on).
- the experts may be implemented on expert server 130 .
- the experts may be implemented on client device 104 and not on chat interface server 120 .
- automated software assistants or bots may be provided by distributed chat application 126 .
- the automated assistant or bot may interact with users through text, e.g., via chat interface of client chat application 127 .
- an automated assistant may be implemented by an automated assistant provider that is not the same as the provider of distributed chat application 126 .
- one or more expert servers 130 may implement chat assistant services, including human expert services, bot services, and/or other similar services as described in further detail below.
- one or more expert servers 130 may include one or more processors, memory and network communication capabilities (not shown).
- expert server 130 may be a hardware server connected to network 103 , using wired connections, such as Ethernet, coaxial cable, fiber-optic cable, etc., or wireless connections, such as Wi-Fi, Bluetooth, or other wireless technology.
- expert server 130 may transmit data between one or more of the chat interface server 120 and client computing device 104 via network 103 .
- expert server 130 may be managed by the same party that manages chat interface server 120 .
- expert server 130 may be a third-party server, e.g., controlled by a party different from the party that provides chat interface services (i.e., chat interface server 120 ).
- user 150 may exchange messages with one or more assistants within a chat interface of client chat application 127 provided on client user device 104 .
- user 150 may enter natural language commands and receive responses from the expert agents.
- client computing device 104 may include a variety of electronic computing devices, such as, for example, a smartphone, tablet, laptop, computer, wearable device, television, virtual reality device, augmented reality device, display, connected home device, Internet of Things (IoT) device, an enhanced general packet radio service (EGPRS) mobile phone, a media player, a navigation device, a game console, a remote control, or a combination of any two or more of these data processing devices, and/or other devices.
- client computing device 104 may present content to a user and receive user input.
- client computing device 104 may parse, classify, and otherwise process user input.
- client computing device 104 may store user input including commands for initiating client chat application 127 , as will be described in detail below.
- client computing device 104 may be equipped with GPS location tracking and may transmit geolocation information via a wireless link and network 103 .
- chat interface server 120 and/or distributed chat application 126 may use the geolocation information to determine a geographic location associated with user 150 .
- chat interface server 120 may use signals transmitted by client computing device 104 to determine the geolocation of user 150 based on one or more of signal strength, GPS, cell tower triangulation, Wi-Fi location, or other input.
- the geolocation associated with user 150 may be used by one or more computer program components associated with the chat application 126 during user 150 interaction with chat interface of the client chat application 127 .
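The geolocation determination described above could, for example, be resolved by preferring the most accurate source that reported a fix. The following sketch is an assumption for illustration only; the priority ordering and the `resolve_geolocation` function are not part of the disclosure.

```python
# Illustrative priority over the location inputs named in the disclosure
# (GPS, Wi-Fi location, cell tower triangulation); ordering is an assumption.
SOURCE_PRIORITY = ["gps", "wifi", "cell_tower"]

def resolve_geolocation(estimates: dict) -> tuple:
    """Return a (lat, lon) fix from the highest-priority source available."""
    for source in SOURCE_PRIORITY:
        if source in estimates:
            return estimates[source]
    raise ValueError("no location fix available")

# No GPS fix here, so the Wi-Fi estimate wins over cell tower triangulation.
fix = resolve_geolocation({"wifi": (40.7128, -74.0060), "cell_tower": (40.71, -74.01)})
print(fix)
```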
- mobile application server 140 may include one or more processors, memory and network communication capabilities.
- mobile application server 140 may be a hardware server connected to network 103 , using wired connections, such as Ethernet, coaxial cable, fiber-optic cable, etc., or wireless connections, such as Wi-Fi, Bluetooth, or other wireless technology.
- mobile application server 140 may transmit data between one or more of chat interface server 120 , client computing device 104 , and/or other components via network 103 .
- mobile application server 140 may include one or more distributed mobile applications (e.g., mobile application 146 ) implemented on client computing device 104 as client mobile application 148 .
- user 150 may instruct mobile application server 140 to download mobile application 146 onto client computing device 104 as client mobile application 148 .
- the mobile application server 140 may transmit the data to client computing device 104 to execute client mobile application 148 on client computing device 104 .
- mobile application 146 may communicate and interface with a framework implemented by distributed chat application 126 using an application program interface (API) that provides a set of predefined protocols and other tools to enable the communication.
- the API can be used to communicate particular data from chat application 126 used to connect to and synchronize with client mobile application 148 that user 150 is operating via the chat interface of client chat application 127 .
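A message sent over such an API might resemble the envelope below. The disclosure only says the API provides predefined protocols for synchronizing the chat application with the mobile application; the field names and JSON encoding here are illustrative assumptions.

```python
import json

def build_sync_message(app_id: str, command: str, params: dict) -> str:
    """Hypothetical envelope the chat application might send to drive a
    mobile application over the API; all field names are assumptions."""
    return json.dumps({
        "target_app": app_id,   # which mobile application to control
        "command": command,     # action elicited from the user in the chat
        "params": params,       # structured arguments for the action
        "protocol_version": 1,  # predefined protocol identifier
    })

msg = build_sync_message("rides", "request_ride", {"destination": "airport"})
decoded = json.loads(msg)
print(decoded["command"])
```

Because the envelope is application-agnostic, the same chat interface could address many different mobile applications through one protocol, as the disclosure later suggests.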
- mobile application server 140 may include a database 142 .
- database 142 may store user data associated with user 150 , and/or other information.
- user data may include user account information such as login name, password, preferences, and so on.
- user data may include historic information indicating previous interactions between user 150 and client mobile application 148 .
- historic information may include purchase transaction data or travel reservation data previously made by user 150 .
- user data including user account data and historic data may be communicated from mobile application server 140 to chat application 126 .
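The user data described above (account information plus historic interactions such as prior purchases or travel reservations) might be packaged for the chat application as sketched below. The `build_user_context` helper and its field names are illustrative assumptions, not the disclosed format.

```python
def build_user_context(account: dict, history: list) -> dict:
    """Hypothetical bundle of account data and historic data that mobile
    application server 140 could share with the chat application."""
    return {
        "login_name": account["login_name"],
        "preferences": account.get("preferences", {}),
        # Most recent interactions first, so an assistant sees fresh context.
        "recent_history": sorted(history, key=lambda h: h["timestamp"], reverse=True),
    }

context = build_user_context(
    {"login_name": "user150", "preferences": {"seat": "aisle"}},
    [{"timestamp": 1, "action": "purchase"}, {"timestamp": 2, "action": "reservation"}],
)
print(context["recent_history"][0]["action"])
```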
- providing communication between distributed chat application 126 and/or client chat application 127 and client mobile application 148 results in efficient control of mobile application functions. For example, user commands received as input to the chat interface can be input to client mobile application 148 , allowing mobile application 146 to respond to commands and messages from the chat interface of client chat application 127 .
- user data obtained from mobile application server 140 may be used by chat application 126 .
- user data may be used by chat assistants or by automated software agents or bots when helping user 150 to operate client mobile application 148 .
- a chat assistant may utilize user data to determine user preferences based on prior interactions with client mobile application 148 . Using this data eliminates potentially redundant exchanges between the chat assistant and user 150 .
- client mobile application 148 may send events or notifications directly to client chat application 127 to be displayed via the chat interface.
- client chat application 127 allows users who are not currently actively interacting with client mobile application 148 to stay informed of any changes and/or events.
- user 150 searching for airline tickets may receive a notification that the price of an airline ticket decreased, prompting user 150 to initiate a purchase transaction. This may improve users' engagement, retention, and awareness of events and changes within a mobile application relevant to user 150 .
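The price-drop notification scenario above can be sketched as a simple event filter. The `should_notify` function and its fields are illustrative assumptions about how a mobile-application event might be matched against what the user is watching before it is forwarded to the chat interface.

```python
def should_notify(event: dict, watch: dict) -> bool:
    """Hypothetical filter: forward a mobile-application event to the chat
    interface when it matches an item the user is watching and the price
    has dropped below the price the user last saw."""
    return (event["item"] == watch["item"]
            and event["price"] < watch["last_seen_price"])

# A fare the user was watching drops from 220 to 180, so a notification fires.
event = {"item": "SFO->JFK", "price": 180}
watch = {"item": "SFO->JFK", "last_seen_price": 220}
print(should_notify(event, watch))
```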
- a single chat interface may be used to control multiple mobile applications simultaneously. Using one chat interface with several mobile applications allows users to interact with those applications with reduced user input and in less time. That is, a single chat interface reduces the consumption of device resources that would otherwise be needed to process, display, and receive user input in multiple mobile applications. Further, using a single chat application with multiple mobile applications may reduce or eliminate the time required to switch between displays of different interfaces for different mobile applications, reduce copying and pasting of data displayed in one mobile application into another mobile application and its interface, and reduce the repetition of commands.
- a standard API can be used between distributed chat application 126 and/or client chat application 127 and client mobile application 148 , allowing user 150 to control a large variety of mobile applications provided by many different providers.
- FIG. 2 illustrates an example chat interface server 120 configured in accordance with one embodiment.
- chat interface server 120 may include a distributed chat application 126 and a corresponding client chat application 127 running on one or more client computing devices 104 .
- the corresponding client chat application 127 may comprise a chat interface 129 and may be configured to provide client functionality to enable user 150 to operate mobile application 148 provided on client computing device 104 via natural language commands entered via chat interface 129 rather than a GUI 149 associated with mobile application 148 .
- a user can interact with a particular mobile application using text commands in a natural language entered into the chat interface in order to complete a particular task, as alluded to above.
- distributed chat application 126 may be operable by one or more processor(s) 124 configured to execute one or more computer program components.
- the computer program components may include one or more of a chat interface component 106 , a chat component 108 , a chat processing component 110 , a response component 112 , a GUI response component 114 , and/or other such components.
- chat interface component 106 may be configured to initiate client chat application 127 on client computing device 104 .
- chat interface component 106 may be configured to detect one or more user inputs or interactions from one of the client computing devices 104 and interpret the detected input or interaction as a command for initiating client chat application 127 .
- user 150 may initiate the client chat application 127 by interacting with an icon corresponding to client chat application 127 which has been downloaded onto client computing device 104 over network 103 .
- client chat application 127 may be initiated upon receiving input from user 150 (i.e., the user selects the icon).
- user 150 may initiate chat application 127 via one or more haptic commands, voice commands, and/or a combination of haptic and voice commands.
- the haptic commands may include user 150 knocking, tapping, and/or scratching on client computing device 104 .
- user 150 may initiate client chat application 127 by speaking a voice command (e.g., “Start Chat”).
- the haptic commands associated with initiating the client chat application 127 may be selected by the chat application 126 running on the chat interface server 120 .
- the chat application 126 may include a double knocking command used to initiate the client chat application 127 .
- user 150 may modify the haptic command selection to another command available to the user.
- user 150 may indicate that instead of double knocking, the user wants to initiate client chat application 127 by scratching client computing device 104 .
- user 150 may create a new haptic or voice command by recording the user input associated with the command.
- chat interface component 106 may be configured to capture, via the device microphone, an audio signal produced by haptic input (such as knocking, tapping, or scratching) or voice input (such as a command spoken by a user). For example, user 150 may knock twice on the device, resulting in an audio signal.
- the captured audio signal may be obtained by chat interface component 106 to determine whether the audio signal corresponds to the audio signal used to initiate client chat application 127 .
- the audio signal may be obtained from a microphone of client computing device 104 .
- chat interface component 106 may be configured to handle the obtained audio signal by transmitting it to chat interface server 120 .
- chat interface component 106 may be configured to process the audio signal.
- chat interface component 106 may be configured to perform at least one of noise removal, windowing, and spectrum analysis during processing of the audio signal.
- chat interface component 106 may be configured to determine if the audio signal received from the microphone of client computing device 104 is a valid haptic input or a voice command by matching the processed audio signal to a valid audio signal.
- the valid audio signal may be obtained from database 122 .
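The validation steps described above (noise removal, windowing, energy/spectrum analysis, and matching against a valid signal) can be illustrated with a simplified sketch. The frame size and thresholds below are hypothetical, and a real implementation would operate on actual microphone samples and a richer spectral match:

```python
def is_double_knock(samples, frame_size=64, noise_floor=0.1, energy_threshold=1.0):
    """Sketch of validating a 'double knock' haptic input.

    Thresholds are illustrative assumptions, not values from the patent.
    """
    # Noise removal: zero out samples below the noise floor.
    cleaned = [s if abs(s) >= noise_floor else 0.0 for s in samples]
    # Windowing: split into fixed-size frames and compute per-frame energy.
    energies = [
        sum(s * s for s in cleaned[i:i + frame_size])
        for i in range(0, len(cleaned), frame_size)
    ]
    # Count distinct energy bursts: a burst begins when a loud frame
    # follows a quiet stretch.
    bursts, quiet = 0, True
    for e in energies:
        if e >= energy_threshold and quiet:
            bursts += 1
            quiet = False
        elif e < energy_threshold:
            quiet = True
    # A valid double knock produces exactly two bursts.
    return bursts == 2
```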
- upon determining that the audio signal is valid, chat interface component 106 may be configured to initiate client chat application 127 on client computing device 104 .
- FIG. 3 illustrates a flow diagram describing a method for initiating a chat application on a client computing device, in accordance with one embodiment.
- method 300 can be implemented, for example, on a server system, e.g., chat interface server 120 , as illustrated in FIG. 1 .
- chat interface component 106 determines whether a received user input command (i.e., a haptic or voice command) for initiating the chat application is valid.
- chat interface component 106 may process an audio signal obtained from a microphone of the client computing device and compare it to a valid audio signal.
- At operation 315 , upon determining that the received user input for initiating the chat application is valid, chat interface component 106 determines whether the mobile application has already been initialized within the client computing device.
- At operation 320 , upon determining that the mobile application has not been initialized within the client computing device, chat interface component 106 initiates the chat application within the client computing device.
- At operation 325 , upon determining that the mobile application has been initialized within the client computing device, chat interface component 106 may be configured to determine whether the particular mobile application is compatible to be used with the chat interface of the chat application.
- At operation 330 , upon determining that the mobile application initialized within the client computing device is compatible to be used with the chat interface of the chat application, chat interface component 106 initiates the chat application such that the chat application is displayed adjacent to the mobile application on the client computing device.
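The decision flow of method 300 can be summarized in a short sketch. The action names are illustrative, and the fallback when an initialized mobile application is incompatible with the chat interface is an assumption of this sketch, since the passage above does not specify that case:

```python
def handle_initiation(input_is_valid, app_initialized, app_compatible):
    """Return the display action taken by the chat interface component.

    Action names are hypothetical labels for the outcomes of method 300;
    the incompatible-application fallback is an assumption.
    """
    if not input_is_valid:                      # operation 310
        return "ignore_input"
    if not app_initialized:                     # operations 315 and 320
        return "initiate_chat_full_screen"
    if app_compatible:                          # operations 325 and 330
        return "initiate_chat_adjacent_to_app"
    return "initiate_chat_full_screen"          # assumed fallback
```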
- chat interface component 106 may be configured to automatically initiate client chat application 127 upon initiation of client mobile application 148 , i.e., without receiving additional user input.
- the automatic initiation of client chat application 127 may be determined by the one or more initiation settings associated with the chat application 126 running on the chat interface server 120 .
- the chat application 126 may be associated with one or more mobile applications that would cause client chat application 127 to be initiated on client computing device 104 upon initiation of those mobile applications without any additional user input.
- user 150 may modify which mobile applications would trigger automatic initiation of chat application 127 .
- user 150 may indicate that only a particular mobile application would cause the automatic initiation of client chat application 127 .
- chat interface component 106 may be configured to initiate client mobile application 148 upon receiving user input entered into the chat interface 129 of client chat application 127 .
- user 150 may initiate client mobile application 148 within client computing device 104 by entering a corresponding text command within the chat interface of client chat application 127 on client computing device 104 .
- the text command may be explicit (e.g., “Start Uber”).
- chat interface component 106 may be configured to initiate client mobile application 148 upon receiving a text command that is not explicit (e.g., “I need a ride”).
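Distinguishing explicit commands (e.g., "Start Uber") from non-explicit ones (e.g., "I need a ride") might be approximated, for illustration only, by a keyword lookup. The table and names below are hypothetical stand-ins for a real natural-language-understanding component:

```python
# Hypothetical keyword-to-application table; a production system would
# use natural language understanding rather than substring checks.
INTENT_KEYWORDS = {
    "uber": "ridesharing_application",
    "ride": "ridesharing_application",
    "flight": "flight_booking_application",
}

def resolve_application(command: str):
    """Return the mobile application a command refers to, or None when
    no intent can be inferred from the text."""
    text = command.lower()
    for keyword, app in INTENT_KEYWORDS.items():
        if keyword in text:
            return app
    return None
```

With this sketch, both the explicit "Start Uber" and the implicit "I need a ride" resolve to the same ridesharing application.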
- chat component 108 may be configured to obtain, manage, and route user input provided and/or exchanged during a chat session.
- user 150 can enter user input as one or more natural language commands (i.e., chat messages) entered via chat interface 129 of client chat application 127 on client computing device 104 .
- user 150 can provide user input to chat interface 129 via a touchscreen, physical buttons, or a keyboard associated with client computing device 104 .
- user 150 can provide user input comprising voice input or other types of input.
- chat component 108 may be configured to store user input obtained within one or more previously mentioned memory components associated with chat interface server 120 .
- chat component 108 may store the messages within database 122 .
- chat processing component 110 may be configured to process user input obtained by chat component 108 .
- chat processing component 110 may process audio input entered via a microphone of client computing device 104 .
- chat processing component 110 may process user input comprising an audio file by performing one or more operations including, for example, voice recognition, conversion of voice messages into textual format, and/or other such operations.
- chat processing component 110 may convert user input comprising an audio file into a text file according to a voice recognition algorithm implemented by distributed chat application 126 and/or client chat application 127 .
- chat processing component 110 may perform voice recognition by means of a pattern matching method and/or other similar method. For example, when using a pattern matching method to perform voice recognition, a training stage and a recognition stage may be used.
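A pattern matching method with a training stage and a recognition stage can be sketched minimally as nearest-template matching. Real systems would extract acoustic features such as MFCCs rather than compare raw feature vectors; this toy version only illustrates the two stages:

```python
import math

class PatternMatcher:
    """Toy pattern-matching recognizer: the training stage stores one
    feature template per command, and the recognition stage returns the
    command whose template is nearest (Euclidean distance) to the
    incoming features. Illustrative only."""

    def __init__(self):
        self.templates = {}

    def train(self, command, features):
        # Training stage: store a reference template for the command.
        self.templates[command] = features

    def recognize(self, features):
        # Recognition stage: pick the closest stored template.
        def distance(template):
            return math.sqrt(sum((t - f) ** 2 for t, f in zip(template, features)))
        return min(self.templates, key=lambda cmd: distance(self.templates[cmd]))
```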
- user input including natural language commands entered via chat interface 129 of client chat application 127 may be used to operate a mobile application (e.g., mobile application 148 ).
- the user is provided with a “store front” experience. That is, rather than deciding which element of the graphical user interface associated with client mobile application 148 to engage with, user 150 is instead prompted by messages from the chat assistant within chat interface 129 of client chat application 127 .
- response component 112 may be configured to handle responses to commands from user 150 within the chat interface 129 .
- response component 112 may be configured to handle responses to user commands generated by automated chat assistants, as alluded to earlier.
- an automated assistant may be implemented as a computer program or application (e.g., a software application) that is configured to interact with user 150 via client chat application 127 to provide information or to perform specific actions within mobile application 148 .
- the response component 112 may be configured to provide information items relevant to user command (e.g., a flight from Denver to Miami with no layovers at a particular price).
- user 150 may be interested in purchasing direct flight tickets from Denver to Miami, but wants to do so from a provider that has the lowest prices.
- a user would have to visit multiple mobile applications for individual providers and compare prices.
- while a human chat assistant may improve the user experience by eliminating the need for user 150 to visit the provider applications personally, it would still require the human assistant to manually determine which provider offers the best pricing.
- an automated chat assistant may determine which mobile application is best suited for a particular user purpose (i.e., offers the lowest-priced concert tickets) by obtaining information from multiple mobile applications within a reduced time frame, thereby improving the response time.
- response component 112 and/or other components may be configured to use machine learning, i.e., a machine learning model that determines responses to user requests.
- the machine learning model may be trained, e.g., by the expert server or other component, using training data (e.g., message training data).
- the machine learning model can be trained using synthetic data, e.g., data that is automatically generated by a computer, with no use of user information.
- the machine learning model can be trained based on sample data, e.g., sample message data, for which permissions to utilize user data for training have been obtained expressly from users providing the message data.
- sample data may include received messages and responses that were sent to the received messages.
- the model can predict message responses to received messages, which may then be provided as suggested items.
- response component 112 may be configured to use one or more of a deep learning model, a logistic regression model, a Long Short Term Memory (LSTM) network, supervised or unsupervised model, etc.
- response component 112 may utilize a trained machine learning classification model.
- the machine learning may include decision trees and forests, hidden Markov models, statistical models, cache language model, and/or other models.
- the machine learning may be unsupervised, semi-supervised, and/or incorporate deep learning techniques.
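As an illustration only of the response-prediction idea (not the specific models listed above), a toy suggester can return the stored response whose training message is most lexically similar to the incoming message:

```python
def jaccard(a: str, b: str) -> float:
    """Lexical similarity between two messages (word-set overlap)."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

class ResponseSuggester:
    """Stand-in for the learned response models: suggests the response
    paired with the most similar past message. Illustrative only."""

    def __init__(self, samples):
        # samples: list of (received message, response sent) pairs,
        # mirroring the sample data described above.
        self.samples = samples

    def suggest(self, message: str) -> str:
        _, response = max(self.samples, key=lambda s: jaccard(s[0], message))
        return response
```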
- GUI response component 114 may be configured to effectuate actions within mobile interface 149 of client mobile application 148 based on user commands entered via chat interface 129 .
- GUI response component 114 may effectuate presentation of items relevant and/or responsive to users' request within GUI 149 of client mobile application 148 (e.g., available flights at a particular date, tickets at a particular price, etc.)
- user commands may include one or more actions executed by client mobile application 148 .
- the user commands may include booking a flight, making a dinner reservation, requesting to be picked up by a ride share driver, purchasing a pair of shoes, and so on.
- GUI response component 114 may execute one or more actions within mobile interface 149 based on user commands. For example, upon receiving a user command to book a ticket, mobile interface 149 may display the flight reservation information associated with the order included in the user command.
- FIGS. 4A-4D illustrate an example chat application comprising a chat interface initiated by a user to control and/or operate a mobile application displayed by the client computing device.
- a chat interface 429 of a chat application 427 is displayed on a client computing device 404 operated by a user 410 .
- a chat conversation between user 410 and chat assistant 412 has been initiated.
- the chat conversation may occur between user 410 and a human chat assistant or an automated chat assistant.
- user commands entered by user 410 to the chat conversation may be displayed in chat interface 429 by chat application 427 .
- user commands from the user 410 can be entered in a text input field 430 of the chat interface 429 (e.g., via input devices such as a physical keyboard, displayed touchscreen keyboard, voice input, etc.).
- user 410 may initiate a mobile application 448 within client computing device 404 by entering a text command within the chat interface 429 of chat application 427 on client computing device 404 .
- user 410 may enter a text command (e.g., text command 437 illustrated in FIG. 4A ) via a text input field 430 using a keyboard 415 associated with chat interface 429 .
- user 410 may enter voice commands by initiating a voice command interface 420 .
- chat interface 429 of the chat application 427 may display an icon or avatar associated with user 410 and indicate that user 410 is currently engaged in a chat session with chat assistant 412 .
- Chat assistant 412 may be associated with a particular icon or avatar as indicated in FIGS. 4A-4B .
- user 410 may be greeted by chat assistant 412 via a text message 433 : “Hi John! Welcome to Chat Interface. What service would you like to use?”
- user 410 may enter a text command 437 , indicating that they want to initiate interaction with a ride sharing service application, by stating “I need an Uber.”
- user 410 has entered a command in the text input field 430 , where the received command is displayed as message 437 in chat interface 429 after being input.
- This command specifies a mobile application to be displayed in association with the chat application, e.g., in this case a mobile application provided by the ridesharing company Uber Technologies, Inc.
- the user command may not have specified the ridesharing application and may have included only “I need a ride”. In that case, the user may select a mobile application from a list provided by the chat assistant, or the chat assistant may select one based on at least one of user preferences, ridesharing availability, best prices, and/or other such parameters.
- a response message 439 may be displayed in chat interface 429 .
- response message 439 indicates that the selected mobile application is being initiated.
- mobile application 448 may be initiated and displayed within the client computing device 404 , as illustrated in FIG. 4B .
- mobile application interface 449 is displayed based on data received by the client computing device 404 over the network, e.g., from mobile application 448 at least partially executing on a remote session server or other device connected over the network.
- mobile application interface 449 is displayed underneath chat interface 429 .
- mobile interface 449 is displayed such that chat interface 429 is at least partially displayed, e.g., allowing one or more chat messages in chat interface 429 to be simultaneously displayed with the mobile application interface 449 .
- the size of the chat interface 429 may be reduced so as to accommodate the display of both the mobile application interface 449 and the chat interface 429 .
- mobile application interface 449 may be configured to display content data associated with mobile application 448 (e.g., map data, driver information, etc.), and a screen control 460 allowing the user to provide user input to enlarge mobile application interface 449 to fit the entire screen (or other display area) of client computing device 404 .
- user 410 may optionally toggle between the full-screen and reduced-screen chat interface 429 illustrated in FIGS. 4A, 4B , respectively.
- the user may reduce the full-screen chat interface 429 illustrated in FIG. 4A via a screen size button 458 .
- user 410 may enlarge reduced-screen chat interface 429 illustrated in FIG. 4B via a screen size button 459 .
- user 410 may reduce the full-screen chat interface 429 or enlarge the reduced-screen chat interface 429 by entering an appropriate text command via text input field 430 .
- chat assistant 412 may further inquire via a text message 440 : “John, do you want an Uber Expert to help you find a ride?” User 410 may provide a response by entering a command 441 indicating that he indeed would like assistance from an expert assistant familiar with this particular mobile application. In some embodiments, chat assistant 412 may inquire whether the user is in need of additional assistance after an occurrence of an event (e.g., user input not received within mobile interface within a particular time period).
- FIGS. 4C-4D illustrate a chat interface used for a conversation between a user and an expert assistant to control a mobile application.
- chat interface 429 illustrated in FIG. 4C may display an icon or avatar associated with user 410 and indicate that user 410 is currently engaged in a chat session with an expert chat assistant 413 .
- expert chat assistant 413 may include a particular icon or avatar associated with mobile application 448 currently initiated by the user 410 .
- expert chat assistant 413 may further inquire in chat interface 429 via a text message 443 : “When would you like your Uber driver to arrive?” User 410 may provide a response by entering a command 445 indicating that he wants the Uber driver to arrive in 30 minutes.
- expert chat assistant 413 may have the ability to obtain the user's location from client computing device 404 when user 410 permits access to location information. For example, expert chat assistant 413 may inquire in a message 447 whether user 410 needs to be picked up at his current location. User 410 may provide a response by entering a command 449 indicating that he wants the Uber driver to arrive at his brother's house.
- expert chat assistant 413 may have the ability to obtain location information associated with user's contacts from client computing device 404 or from application data stored with mobile application 448 when the user 410 permits access to their contact information. Expert chat assistant 413 may further inquire whether user 410 wants to use a particular service (e.g., Uber X, Uber Pool, or Uber Select) associated with mobile application 448 in a message 450 . User 410 may provide a response by entering a command 451 indicating that he wants to use the Uber X service.
- a particular service e.g., Uber X, Uber Pool, or Uber Select
- expert chat assistant 413 may respond in a message 453 indicating that the Uber driver has been reserved and is arriving at the location specified by user 410 (i.e., the user's brother's house) at the time specified by user 410 (i.e., in 30 minutes).
- mobile interface 449 may be displayed underneath chat interface 429 .
- mobile interface 449 includes information based on data received from mobile application 448 .
- FIGS. 5A-5F illustrate additional example chat applications comprising a chat interface initiated by a user to control and/or operate a mobile application displayed by the client computing device.
- a chat interface 529 of a chat application 527 is displayed on a client computing device 504 operated by a user 510 .
- a chat conversation between user 510 and an expert chat assistant 513 has been initiated.
- user commands from the user to the chat conversation may be entered via client computing device 504 and displayed in chat interface 529 by chat application 527 .
- user commands from user 510 can be entered in a text input field 530 of chat interface 529 or by initiating a voice interface 520 .
- user 510 may initiate mobile application 548 within client computing device 504 by entering a text command within chat interface 529 of chat application 527 on client computing device 504 .
- the user 510 may enter a text command (e.g., text command 537 illustrated in FIG. 5A ) via a text input field 530 .
- user 510 may be greeted by expert chat interface assistant 513 displayed as a text message 535 : “Hi John! Welcome to Chat Interface. What service would you like to use?”
- user 510 may enter a text command 537 indicating that he wants to initiate interaction with a flight booking application by stating: “I need to go from Grand Juncture to San Juan on Saturday.”
- expert chat assistant 513 may obtain additional details associated with user's 510 request. For example, expert chat assistant 513 may inquire in a message 539 whether layovers during the flight are acceptable. User 510 may enter a user command 541 indicating that layovers are not acceptable. Next, expert chat assistant 513 may inquire in a message 543 whether early morning departures are acceptable. User 510 may enter a user command 545 indicating that early morning departures are not acceptable.
- expert chat assistant 513 may respond in a message 547 indicating that the best flight matching user's requirements has been located.
- the details of the flight 550 may be displayed within mobile application interface 549 of mobile application 548 displayed underneath chat interface 529 .
- mobile application interface 549 includes information based on data received from mobile application 548 , which in turn was provided by chat interface 529 of chat application 527 .
- user 510 may modify one or more previously provided user commands. For example, user 510 may locate a previously provided user command (e.g., by using a scrolling motion), modify it, and obtain a new response based on the modified information.
- the response by chat assistant 513 may be provided based on both previously entered user commands and new information.
- the chat interface may generate a new response without requiring user 510 to re-enter all of the previously inputted information, which can be time-consuming and tedious.
- chat application 527 may provide a chat application GUI 569 , in addition to chat interface 529 .
- Chat application GUI 569 may be used to display the original response and a new response generated based on modified information entered by user 510 .
- chat application GUI 569 may display the old response and the new response (i.e., a differential of the old response).
- both old and new user commands and old and new responses generated in response to user commands may be displayed graphically to allow user 510 to visualize the conversation.
- user commands and responses may be displayed hierarchically. That is, a modified user command may be displayed as a branch off of the original user command.
- a corresponding new response generated in response to the modified user command may be displayed alongside the old response.
- user 510 may modify user commands in chat application GUI 569 by directly accessing individual branches corresponding to user commands (e.g., by tapping or pressing the icon representing particular branch).
- user 510 may compare results for different branches by dragging one branch to the other.
- the user may modify previously entered command 537 illustrated in FIG. 5A .
- user command 537 input in FIG. 5A indicated that user 510 wants to travel from Grand Juncture to San Juan on Saturday.
- the modified user command 547 illustrated in FIG. 5C indicates that user 510 wishes to travel from Grand Juncture to San Juan on Sunday.
- Expert chat assistant 513 may respond in a message 549 acknowledging that a modification has been made and asking user 510 to confirm it.
- Chat application GUI 569 may display the conversation between user 510 and chat assistant graphically using a branch visualization or such similar method.
- original (i.e., the oldest) user command input 537 in FIG. 5A may be displayed as element 571 .
- User commands that followed original user command input 537 may be displayed following original user command element 571 as elements 572 , 573 , respectively.
- Modified user command 547 may be displayed as element 571 a , i.e., a branch off of original user command element 571 .
- Original response 550 illustrated in FIG. 5B may be displayed as element 574 .
- a modified response may be displayed under the corresponding user command 571 a as element 574 a .
- Modified response 574 a may be generated based on modified user command 571 a and original user commands 572 , 573 thus allowing user 510 to avoid re-entering the information.
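The branch mechanism described above, where a modified command reuses the original, unmodified commands so the user need not re-enter them, can be sketched with a small data structure (names and structure illustrative, not the patent's implementation):

```python
class ConversationBranches:
    """Sketch of branching user commands: modifying a command records a
    branch at that position, and the effective command list for the
    branch reuses all unmodified original commands."""

    def __init__(self, commands):
        self.commands = list(commands)   # original branch, in order
        self.branches = {}               # position -> modified command

    def modify(self, position, new_command):
        """Branch off the command at `position`, keeping the others."""
        self.branches[position] = new_command

    def effective_commands(self, use_branch=False):
        """Commands used to generate a response for either branch."""
        if not use_branch:
            return list(self.commands)
        return [self.branches.get(i, c) for i, c in enumerate(self.commands)]
```

A usage mirroring FIGS. 5A-5C: modifying only the travel-date command leaves the layover and departure-time commands intact in the branched command list.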
- user 510 may drag element 571 a to element 571 to compare the results for original and modified user commands.
- user 510 may request a comparison by inputting a user command asking for a comparison rather than by dragging or otherwise manipulating elements within chat application GUI 569 .
- user 510 may input a user command 581 asking for a comparison.
- expert chat assistant 513 may generate a comparison as indicated by a response 582 .
- the comparison may include original response 550 (i.e., details of the flight from Grand Juncture to San Juan on Saturday) illustrated in FIG. 5B , and a new response 551 (i.e., details of the flight from Grand Juncture to San Juan on Sunday).
- the modified response to modified user commands may be displayed within mobile interface 549 of mobile application 548 .
- expert chat assistant 513 may respond in a message 583 acknowledging that a modification has been made and asking user 510 to confirm it.
- User 510 may confirm by inputting user command 584 .
- expert chat assistant 513 may respond in a message 585 indicating that the best flight matching the user's requirements has been located.
- the modified responses (i.e., details of the flight from Grand Juncture to San Juan on Sunday rather than Saturday) may be displayed within mobile interface 549 .
- the mobile interface 549 includes information based on data received from the mobile application 548 which in turn was provided from the chat interface 529 of the chat application 527 .
- FIG. 6 illustrates an example computing module 600 , an example of which may be a processor/controller resident on a mobile device, or a processor/controller used to operate a payment transaction device, that may be used to implement various features and/or functionality of the systems and methods disclosed in the present disclosure.
- module might describe a given unit of functionality that can be performed in accordance with one or more embodiments of the present application.
- a module might be implemented utilizing any form of hardware, software, or a combination thereof.
- processors, controllers, ASICs, PLAs, PALs, CPLDs, FPGAs, logical components, software routines or other mechanisms might be implemented to make up a module.
- the various modules described herein might be implemented as discrete modules or the functions and features described can be shared in part or in total among one or more modules.
- One such example computing module is shown in FIG. 6 .
- Various embodiments are described in terms of this example computing module 600 . After reading this description, it will become apparent to a person skilled in the relevant art how to implement the application using other computing modules or architectures.
- computing module 600 may represent, for example, computing or processing capabilities found within desktop, laptop, notebook, and tablet computers; hand-held computing devices (tablets, PDA's, smart phones, cell phones, palmtops, etc.); mainframes, supercomputers, workstations or servers; or any other type of special-purpose or general-purpose computing devices as may be desirable or appropriate for a given application or environment.
- Computing module 600 might also represent computing capabilities embedded within or otherwise available to a given device.
- a computing module might be found in other electronic devices such as, for example, digital cameras, navigation systems, cellular telephones, portable computing devices, modems, routers, WAPs, terminals, and other electronic devices that might include some form of processing capability.
- Computing module 600 might include, for example, one or more processors, controllers, control modules, or other processing devices, such as a processor 604 .
- Processor 604 might be implemented using a general-purpose or special-purpose processing engine such as, for example, a microprocessor, controller, or other control logic.
- processor 604 is connected to a bus 602 , although any communication medium can be used to facilitate interaction with other components of computing module 600 or to communicate externally.
- Computing module 600 might also include one or more memory modules, simply referred to herein as main memory 608 .
- main memory 608 , preferably random access memory (RAM) or other dynamic memory, might be used for storing information and instructions to be executed by processor 604 .
- Main memory 608 might also be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 604 .
- Computing module 600 might likewise include a read only memory (“ROM”) or other static storage device coupled to bus 602 for storing static information and instructions for processor 604 .
- The computing module 600 might also include one or more various forms of information storage devices 610, which might include, for example, a media drive 612 and a storage unit interface 620.
- The media drive 612 might include a drive or other mechanism to support fixed or removable storage media 614.
- A hard disk drive, a floppy disk drive, a magnetic tape drive, an optical disk drive, a CD or DVD drive (R or RW), or other removable or fixed media drive might be provided.
- Storage media 614 might include, for example, a hard disk, a floppy disk, magnetic tape, cartridge, optical disk, a CD or DVD, or other fixed or removable medium that is read by, written to, or accessed by media drive 612.
- The storage media 614 can include a computer usable storage medium having stored therein computer software or data.
- Information storage devices 610 might include other similar instrumentalities for allowing computer programs or other instructions or data to be loaded into computing module 600.
- Such instrumentalities might include, for example, a fixed or removable storage unit 622 and a storage unit interface 620 .
- storage units 622 and storage unit interfaces 620 can include a program cartridge and cartridge interface, a removable memory (for example, a flash memory or other removable memory module) and memory slot, a PCMCIA slot and card, and other fixed or removable storage units 622 and interfaces 620 that allow software and data to be transferred from the storage unit 622 to computing module 600 .
- Computing module 600 might also include a communications interface 624 .
- Communications interface 624 might be used to allow software and data to be transferred between computing module 600 and external devices.
- Examples of communications interface 624 might include a modem or softmodem, a network interface (such as an Ethernet, network interface card, WiMedia, IEEE 802.XX or other interface), a communications port (such as, for example, a USB port, IR port, RS232 port, Bluetooth® interface, or other port), or other communications interface.
- Software and data transferred via communications interface 624 might typically be carried on signals, which can be electronic, electromagnetic (which includes optical) or other signals capable of being exchanged by a given communications interface 624 . These signals might be provided to communications interface 624 via a channel 628 .
- This channel 628 might carry signals and might be implemented using a wired or wireless communication medium.
- Some examples of a channel might include a phone line, a cellular link, an RF link, an optical link, a network interface, a local or wide area network, and other wired or wireless communications channels.
- The terms “computer program medium” and “computer usable medium” are used to generally refer to transitory or non-transitory media such as, for example, memory 608, storage unit interface 620, media 614, and channel 628.
- These and other various forms of computer program media or computer usable media may be involved in carrying one or more sequences of one or more instructions to a processing device for execution.
- Such instructions embodied on the medium are generally referred to as “computer program code” or a “computer program product” (which may be grouped in the form of computer programs or other groupings). When executed, such instructions might enable the computing module 600 to perform features or functions of the present application as discussed herein.
- The term “module” does not imply that the components or functionality described or claimed as part of the module are all configured in a common package. Indeed, any or all of the various components of a module, whether control logic or other components, can be combined in a single package or separately maintained and can further be distributed in multiple groupings or packages or across multiple locations.
Abstract
Description
- This application is a continuation of U.S. patent application Ser. No. 16/565,452 filed on Sep. 9, 2019, the contents of which are incorporated herein by reference in their entirety.
- The present disclosure is generally related to user interfaces. More particularly, the present disclosure is directed to systems and methods for controlling the operation of a mobile application using a conversation interface.
- A user interface is a system by which users interact with a computing device, such as a computer or a mobile electronic device. In general, a user interface allows users to input information to manipulate the computing device. A user interface also allows the computing device to output information as to the effects of the manipulation. In computing, a graphical user interface (GUI) is a type of user interface that allows users to interact with computing devices with images rather than text commands. That is, a GUI represents the information and actions available to a user through graphical icons and visual indicators such as secondary notation, as opposed to text-based interfaces, typed command labels, or text navigation. The actions are usually performed through direct manipulation of the graphical elements.
- An electronic form is a type of GUI view that is specifically designed to allow a user to enter data in a structured manner for processing by a computing device. An electronic form is an electronic version of a physical form, such as a paper document with blank spaces for insertion of required or requested information. An electronic form provides an input template comprising various combinations of checkboxes, radio buttons, form fields, and other GUI elements designed to query and display data.
- While GUI interfaces are intuitive and provide a convenient interface, they may present some challenges for some users, especially unsophisticated or elderly operators. An inexperienced user may often have difficulty locating the correct icon or form within the GUI of an application when attempting to invoke desired functionality. Accordingly, a user may be forced to underutilize the capabilities of an application, or worse, end up with an unsatisfactory result. There is an ongoing need for improved systems and methods to allow users to interact with an application that operates with a GUI interface.
- In accordance with one or more embodiments, various features and functionality can be provided to enable or otherwise facilitate control and operation of a selected mobile application comprising a native GUI via an exchange of natural language commands in a conversation interface.
- In some embodiments, the system for controlling the operation of a selected mobile application may include obtaining a command for initiating a conversation interface. For example, the command may be originating from a computing device operated by a user. In some embodiments, the conversation interface may be displayed adjacent to a mobile application, the mobile application comprising a graphical user interface (GUI).
- In some embodiments, the conversation interface may be configured to receive user input comprising one or more user commands. In some embodiments, the conversation interface may display assistant user input comprising one or more responses generated by an assistant user based on the user input.
- In some embodiments, a first user input comprising a first text command may be obtained. For example, the first user input may be entered by the user via the client computing device and displayed in the conversation interface. In some embodiments, a first response generated by the assistant user in response to the first user input may be obtained. For example, the first response may be displayed in the conversation interface.
- In some embodiments, the mobile application may be updated based on the first user input received from the user. In some embodiments, updating the mobile application may include outputting an output command associated with one or more actions that may occur in the mobile application. For example, the actions that occur within the mobile application may include at least one of a travel reservation, a dining reservation, and a purchase transaction. In some embodiments, updating of the mobile application may include updating the GUI of the mobile application.
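To make this concrete, the structure of such an output command might be sketched as follows (the field names and the Python representation are illustrative assumptions; the disclosure does not define a specific format):

```python
from dataclasses import dataclass, field

@dataclass
class OutputCommand:
    """Hypothetical output command produced when the mobile
    application is updated based on chat-interface input."""
    action: str                 # e.g. "travel_reservation", "purchase_transaction"
    output_data: dict = field(default_factory=dict)  # data produced by the update
    update_gui: bool = True     # whether the mobile application's GUI should refresh

# An update triggered by a dining-reservation request entered in the chat.
cmd = OutputCommand("dining_reservation",
                    {"restaurant": "Example Bistro", "party_size": 2})
```

The `update_gui` flag reflects that updating the mobile application may include updating its GUI, as described above.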
- In some embodiments, the output command may comprise output data associated with the updating of the mobile application based on the user input. For example, the output command may comprise output data associated with the first user input.
- In some embodiments, a modified user input comprising a modified text command may be obtained. For example, the first user input entered via the client computing device and displayed in the conversation interface may be modified by the user.
- In some embodiments, the mobile application may be updated based on the modified user input. In some embodiments, the output command may comprise output data associated with the updating of the mobile application based on the modified user input. In some embodiments, a graphical representation of the output command data associated with the first and modified user input may be generated and displayed in the conversation interface.
- In some embodiments, the command for initiating the conversation interface may be obtained during an active session of the mobile application operated by the user.
- In some embodiments, account information associated with the mobile application operated by the user may be obtained. For example, the account information comprising historic user data indicating commands previously generated by the user and received by the mobile application may be obtained. In some embodiments, a geographic location associated with the computing device operated by the user and indicating a real world location of the user may be obtained.
- In some embodiments, a second response generated by the assistant user may be obtained. For example, the second response may be based on the historic user data and the real world location of the user.
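As an illustrative sketch of how historic user data might inform such a response, the assistant could fall back on the user's most frequent past choice rather than asking again (the record layout and the helper below are hypothetical, not taken from the disclosure):

```python
from collections import Counter

def infer_preference(history, field):
    """Return the user's most frequent past value for a field,
    or None when there is no relevant history."""
    values = [record[field] for record in history if field in record]
    if not values:
        return None
    return Counter(values).most_common(1)[0][0]

# Hypothetical historic data indicating commands previously received
# by the mobile application.
history = [{"airline": "Acme Air"}, {"airline": "Blue Jet"}, {"airline": "Acme Air"}]
preferred = infer_preference(history, "airline")  # → "Acme Air"
```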
- Other features and aspects of the disclosed technology will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, which illustrate, by way of example, the features in accordance with embodiments of the disclosed technology. The summary is not intended to limit the scope of any inventions described herein, which are defined solely by the claims attached hereto.
FIG. 1 illustrates example systems and a network environment, according to an implementation of the disclosure. -
FIG. 2 illustrates an example chat interface server of the example system ofFIG. 1 , according to an implementation of the disclosure. -
FIG. 3 illustrates an example process for initiating a client chat application, according to an implementation of the disclosure. -
FIGS. 4A-4D illustrate an example chat interface used to operate a mobile application, according to an implementation of the disclosure. -
FIGS. 5A-5E illustrate an example chat interface used to operate a mobile application, according to an implementation of the disclosure. -
FIG. 6 illustrates an example computing system that may be used in implementing various features of embodiments of the disclosed technology. - Described herein are systems and methods for controlling the operation of a selected mobile application with a native GUI via an exchange of textual data between users and experts in a conversation interface. The details of some example embodiments of the systems and methods of the present disclosure are set forth in the description below. Other features, objects, and advantages of the disclosure will be apparent to one of skill in the art upon examination of the following description, drawings, examples, and claims. It is intended that all such additional systems, methods, features, and advantages be included within this description, be within the scope of the present disclosure, and be protected by the accompanying claims.
- As alluded to above, users may experience challenges when interacting with a mobile application with a GUI. In particular, users that use mobile applications do so with a particular purpose and often to solve a particular problem (e.g., reserve airline tickets or purchase an item). When using an application to complete a task, users are forced to do so by interacting with an application via an interface, more specifically a GUI. However, all GUI interfaces are essentially artificial creations invented to enable interactions between a user and a device. Accordingly, users have to adapt to interfaces, i.e., learn the rules on how to operate them. Because of these added cognitive demands associated with using a GUI, users focus less on solving the problem and more on learning the GUI.
- Humans use spoken language as a natural interface. Thus, the most comfortable way for humans to solve problems is through a conversation. Allowing users to interact with an application using natural language results in a more user-friendly interaction between users and the application and produces a satisfactory user experience.
- In accordance with various embodiments, a user can interact with a mobile application by invoking a conversation or chat interface of a chat application rather than using the native GUI of the mobile application. For example, a user can interact with a particular mobile application using text commands in a natural language without having to learn new skills to interact with the GUI. Because the chat interface is invoked contemporaneously with and is displayed next to the application, the user can control existing mobile applications, including their underlying functionality, and utilize data associated with those applications without the need to reenter or otherwise provide information necessary to complete their task. Finally, presenting the chat interface next to the mobile application GUI allows the user to visually observe the results of their interactions as if they were interacting with the application via the GUI rather than the chat interface. In other words, seeing the outcome of their interactions allows users to preserve the positive aspects of a GUI (e.g., visual representation of results) while improving their ability to effectively interact with the mobile application.
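As a minimal sketch of the pattern described above (all names and the JSON format are hypothetical; the disclosure does not prescribe a wire format), a natural-language entry in the chat interface might be relayed to the mobile application as a structured command:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ChatCommand:
    """Hypothetical structured command derived from a chat message."""
    app_id: str   # the target mobile application
    action: str   # the operation the user asked for
    params: dict  # arguments extracted from the user's text

def encode_command(cmd: ChatCommand) -> str:
    # Serializing to JSON keeps the message platform-neutral, so the
    # mobile application can consume it without chat-specific code.
    return json.dumps(asdict(cmd))

# "Find me a flight from SFO to JFK" might become:
payload = encode_command(
    ChatCommand("travel_app", "search_flights", {"from": "SFO", "to": "JFK"}))
```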
- Before describing the technology in detail, it is useful to describe an example environment in which the presently disclosed technology can be implemented.
FIG. 1 illustrates one such example environment 100. -
FIG. 1 illustrates an example environment 100 for providing a chat interface for interacting with mobile applications, as described herein. In some embodiments, environment 100 may include a chat interface server 120, one or more expert servers 130, a mobile application server 140, one or more client computing devices 104, and a network 103. A user 150 may be associated with client computing device 104 as described in detail below. - In some embodiments, the various below-described components of
FIG. 1 may be used to initiate a client chat application 127 (i.e., the chat application running on client computing device 104 provided by a distributed chat application 126) within client computing device 104. For example, client chat application 127 may comprise a chat interface (e.g., as illustrated in FIG. 2) and may be configured to control and/or operate a client mobile application 148 running on client computing device 104 (client mobile application 148 may be provided by a distributed mobile application 146 running on mobile application server 140). - In some embodiments,
client chat application 127 may be initiated after client mobile application 148 has been initiated. For example, the various components of FIG. 1 may be configured to initiate client chat application 127 upon receiving user input associated with initiating client chat application 127, as will be described in detail below. For example, user 150 may first initiate client mobile application 148 via one or more user inputs associated with initiating client mobile application 148. Next, user 150 may provide additional user input configured to initiate client chat application 127 to run alongside or next to client mobile application 148 on client computing device 104. In some embodiments, the various components of FIG. 1 may be configured to initiate client chat application 127 upon initiating client mobile application 148 automatically, i.e., without receiving additional user input. - In some embodiments,
client chat application 127 may be initiated upon receiving user input associated with initiating the chat application, as will be described in detail below. For example, user 150 may provide user input (e.g., knocking or tapping) within a GUI associated with client computing device 104. In some embodiments, the various components of FIG. 1 may be configured to initiate client mobile application 148 by user input within the chat interface of client chat application 127. For example, user 150 may provide user input within the chat interface to initiate client mobile application 148. - In some embodiments and as will be described in detail in
FIG. 2, chat interface server 120 may include a processor, a memory, and network communication capabilities. In some embodiments, chat interface server 120 may be a hardware server. In some implementations, chat interface server 120 may be provided in a virtualized environment, e.g., chat interface server 120 may be a virtual machine that is executed on a hardware server that may include one or more other virtual machines. Chat interface server 120 may be communicatively coupled to a network 103. In some embodiments, chat interface server 120 may transmit and receive information to and from one or more of client computing devices 104, mobile application server 140, one or more expert servers 130, and/or other servers via network 103. - In some embodiments,
chat interface server 120 may include chat application 126, as alluded to above. Chat application 126 may be a distributed application implemented on one or more client computing devices 104 as client chat application 127, as described herein. In some embodiments, chat application 126 included in chat interface server 120 may provide client functionality to enable user 150 to operate client mobile application 148 running on client computing device 104. For example, the client chat application 127 may include a chat interface (not illustrated) which allows user 150 to use natural language commands to operate client mobile application 148. In some embodiments, the chat interface may allow user 150 to operate client mobile application 148 by exchanging messages with one or more chat assistants (e.g., human users or automated software agents or bots). For example, these chat assistants may help user 150 to operate the client mobile application 148 by eliciting commands from user 150 intended for client mobile application 148, generating responses, and effectuating client mobile application 148 to generate results associated with the commands received from user 150. By virtue of exchanging messages with an assistant, user 150 can operate client mobile application 148 without having to learn an interface associated with mobile application 148, resulting in a more efficient and streamlined user experience. - In some embodiments, distributed
chat application 126 may be implemented using a combination of hardware and software. In some embodiments, chat application 126 may be a server application, a server module of a client-server application, or a distributed application (e.g., with a corresponding client chat application 127 running on one or more client computing devices 104). - In some embodiments,
chat interface server 120 may also include a database 122. For example, database 122 may store communications or messages exchanged via the chat interface, user data associated with user 150, and/or other information. - In some embodiments, as alluded to above,
chat interface server 120 may comprise computer program components operable by the processor to enable exchange of messages between user 150 and one or more human users or bots. In some embodiments, chat interface server 120 may include one or more human expert users or agents assisting user 150 in operating client mobile application 148 provided on client computing device 104. - In some embodiments, human chat assistants may be selected from a group of specially trained assistants or experts. For example, expert assistants may be skilled in providing assistance to users operating a particular mobile application (e.g., Uber, Expedia, and so on). In some embodiments, the experts may be implemented on
expert server 130. In yet other embodiments, the experts may be implemented on client device 104 and not on chat interface server 120. - In some embodiments, automated software assistants or bots may be provided by distributed
chat application 126. For example, the automated assistant or bot may interact with users through text, e.g., via the chat interface of client chat application 127. In some embodiments, an automated assistant may be implemented by an automated assistant provider that is not the same as the provider of distributed chat application 126. - In some embodiments, as alluded to above, one or
more expert servers 130 may implement chat assistant services, including human expert services, bot services, and/or other similar services as described in further detail below. In some embodiments, one or more expert servers 130 may include one or more processors, memory, and network communication capabilities (not shown). In some embodiments, expert server 130 may be a hardware server connected to network 103, using wired connections, such as Ethernet, coaxial cable, fiber-optic cable, etc., or wireless connections, such as Wi-Fi, Bluetooth, or other wireless technology. In some embodiments, expert server 130 may transmit data between one or more of the chat interface server 120 and client computing device 104 via network 103. In some embodiments, expert server 130 may be managed by the same party that manages chat interface server 120. In other embodiments, expert server 130 may be a third-party server, e.g., controlled by a party different from the party that provides chat interface services (i.e., chat interface server 120). - In some embodiments, as alluded to above, user 150 may exchange messages with one or more assistants within a chat interface of
client chat application 127 provided on client user device 104. For example, user 150 may enter natural language commands and receive responses from the expert agents. - In some embodiments,
client computing device 104 may include a variety of electronic computing devices, such as, for example, a smartphone, tablet, laptop, computer, wearable device, television, virtual reality device, augmented reality device, displays, connected home device, Internet of Things (IOT) device, an enhanced general packet radio service (EGPRS) mobile phone, a media player, a navigation device, a game console, a television, a remote control, or a combination of any two or more of these data processing devices, and/or other devices. In some embodiments, client computing device 104 may present content to a user and receive user input. In some embodiments, client computing device 104 may parse, classify, and otherwise process user input. For example, client computing device 104 may store user input including commands for initiating client chat application 127, as will be described in detail below. - In some embodiments,
client computing device 104 may be equipped with GPS location tracking and may transmit geolocation information via a wireless link and network 103. In some embodiments, chat interface server 120 and/or distributed chat application 126 may use the geolocation information to determine a geographic location associated with user 150. In some embodiments, chat interface server 120 may use signals transmitted by client computing device 104 to determine the geolocation of user 150 based on one or more of signal strength, GPS, cell tower triangulation, Wi-Fi location, or other input. In some embodiments, the geolocation associated with user 150 may be used by one or more computer program components associated with the chat application 126 during user 150's interaction with the chat interface of the client chat application 127. - In some embodiments,
mobile application server 140 may include one or more processors, memory, and network communication capabilities. In some embodiments, mobile application server 140 may be a hardware server connected to network 103, using wired connections, such as Ethernet, coaxial cable, fiber-optic cable, etc., or wireless connections, such as Wi-Fi, Bluetooth, or other wireless technology. In some embodiments, mobile application server 140 may transmit data between one or more of chat interface server 120, client computing device 104, and/or other components via network 103. - In some embodiments,
mobile application server 140 may include one or more distributed mobile applications (e.g., mobile application 146) implemented on client computing device 104 as client mobile application 148. In some embodiments, user 150 may instruct the mobile application server 140 to download mobile application 146 onto client computing device 104 as client mobile application 148. For example, in response to user 150 requesting to download mobile application 146, the mobile application server 140 may transmit the data to client computing device 104 to execute client mobile application 148 on client computing device 104. - In some embodiments,
mobile application 146 may communicate and interface with a framework implemented by distributed chat application 126 using an application program interface (API) that provides a set of predefined protocols and other tools to enable the communication. For example, the API can be used to communicate particular data from chat application 126 used to connect to and synchronize with client mobile application 148 that user 150 is operating via the chat interface of client chat application 127. - In some embodiments,
mobile application server 140 may include a database 142. For example, database 142 may store user data associated with user 150, and/or other information. For example, user data may include user account information such as login name, password, preferences, and so on. In some embodiments, user data may include historic information indicating previous interactions between user 150 and client mobile application 148. For example, historic information may include purchase transaction data or travel reservation data previously made by user 150. In some embodiments, user data including user account data and historic data may be communicated from mobile application server 140 to chat application 126. - By virtue of providing communication between distributed
chat application 126 and/or client chat application 127 and client mobile application 148, mobile application functions can be controlled efficiently. For example, user commands received as input to the chat interface can be input to client mobile application 148, allowing mobile application 146 to respond to commands and messages from the chat interface of client chat application 127. - In some embodiments, user data obtained from
mobile application server 140 may be used by chat application 126. For example, user data may be used by chat assistants or by automated software agents or bots when helping user 150 to operate client mobile application 148. For example, a chat assistant may utilize user data to determine user preferences based on prior interactions with client mobile application 148. Using this user data helps eliminate unnecessary exchanges between the chat assistant and user 150. - In some embodiments, client
mobile application 148 may send events or notifications directly to client chat application 127 to be displayed via the chat interface. Displaying the notifications within client chat application 127 allows users who are not currently actively interacting with client mobile application 148 to stay informed of any changes and/or events. For example, user 150 searching for airline tickets may receive a notification that the price of an airline ticket decreased, prompting user 150 to initiate a purchase transaction. This may improve the engagement, retention, and awareness of users regarding the occurrence of events and changes within the mobile application relevant to user 150.
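A minimal sketch of this notification path (the class and message format below are illustrative assumptions, not part of the disclosure):

```python
class ChatTranscript:
    """Hypothetical chat transcript that a mobile application can
    push event notifications into, even between active sessions."""
    def __init__(self):
        self.messages = []

    def notify(self, app_name, text):
        # Prefix with the source application so the user knows which
        # mobile application raised the event.
        self.messages.append(f"[{app_name}] {text}")

transcript = ChatTranscript()
# e.g. the airfare example above:
transcript.notify("travel_app", "A fare you searched for dropped to $199.")
```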
- In some embodiments, a standard API can be used between distributed
chat application 126 and/orclient chat application 127 and clientmobile application 148, allowing user 150 to control a large variety of mobile applications provided by many different providers. -
FIG. 2 illustrates an example chat interface server 120 configured in accordance with one embodiment. In some embodiments, as alluded to above, chat interface server 120 may include a distributed chat application 126 and a corresponding client chat application 127 running on one or more client computing devices 104. The corresponding client chat application 127 may comprise a chat interface 129 and may be configured to provide client functionality to enable user 150 to operate mobile application 148 provided on client computing device 104 via natural language commands entered via chat interface 129 rather than a GUI 149 associated with mobile application 148. For example, a user can interact with a particular mobile application using text commands in a natural language entered into the chat interface in order to complete a particular task, as alluded to above. - In some embodiments, distributed
chat application 126 may be operable by one or more processor(s) 124 configured to execute one or more computer program components. In some embodiments, the computer program components may include one or more of a chat interface component 106, a chat component 108, a chat processing component 110, a response component 112, a GUI response component 114, and/or other such components. - In some embodiments, as alluded to above, user 150 may access the
chat interface server 120 via client computing device 104. In some embodiments, chat interface component 106 may be configured to initiate client chat application 127 on client computing device 104. For example, chat interface component 106 may be configured to detect one or more user inputs or interactions from one of the client computing devices 104 and interpret the detected input or interaction as a command for initiating client chat application 127. - In some embodiments, user 150 may initiate the
client chat application 127 by interacting with an icon corresponding to client chat application 127 which has been downloaded onto client computing device 104 over network 103. For example, client chat application 127 may be initiated upon receiving input from user 150 (i.e., the user selects the icon). In other embodiments, user 150 may initiate chat application 127 via one or more haptic commands, voice commands, and/or a combination of haptic and voice commands. For example, the haptic commands may include user 150 knocking, tapping, and/or scratching on client computing device 104. Alternatively, user 150 may initiate client chat application 127 by speaking a voice command (e.g., “Start Chat”). - In some embodiments, the haptic commands associated with initiating the
client chat application 127 may be selected by the chat application 126 running on the chat interface server 120. For example, the chat application 126 may include a double knocking command used to initiate the client chat application 127. In some embodiments, user 150 may modify the haptic command selection to another command available to the user. For example, user 150 may indicate that instead of double knocking, the user wants to initiate client chat application 127 by scratching client computing device 104. In some embodiments, user 150 may create a new haptic or voice command by recording the user input associated with the command. - In some embodiments,
chat interface component 106 may be configured to capture an audio signal produced from the haptic input (such as knocking, tapping, or scratching) or voice input (such as a command spoken by a user) by the device microphone. For example, user 150 may knock twice on the device resulting in an audio signal. In some embodiments, the captured audio signal may be obtained by chat interface component 106 to determine whether the audio signal corresponds to the audio signal used to initiate client chat application 127. For example, the audio signal may be obtained from a microphone of client computing device 104. In some embodiments, chat interface component 106 may be configured to manipulate the obtained audio signal by transmitting the audio signal to the chat interface server 120. In some embodiments, chat interface component 106 may be configured to process the audio signal. For example, chat interface component 106 may be configured to perform at least one of noise removal, windowing, and spectrum analysis during processing of the audio signal. In some embodiments, chat interface component 106 may be configured to determine whether the audio signal received from the microphone of client computing device 104 is a valid haptic input or a voice command by matching the processed audio signal to a valid audio signal. In some embodiments, the valid audio signal may be obtained from database 122. - In some embodiments, upon determining that the haptic or voice command is valid,
chat interface component 106 may be configured to initiate client chat application 127 on client computing device 104. FIG. 3 illustrates a flow diagram describing a method for initiating a chat application on a client computing device, in accordance with one embodiment. In some embodiments, method 300 can be implemented, for example, on a server system, e.g., chat interface server 120, as illustrated in FIG. 1. At operation 310, chat interface component 106 determines whether a received user input command (i.e., a haptic or voice command) for initiating the chat application is valid. For example, as alluded to earlier, chat interface component 106 may process an audio signal obtained from a microphone of the client computing device and compare it to a valid audio signal. At operation 315, upon determining that the received user input for initiating the chat application is valid, chat interface component 106 determines whether the mobile application has already been initialized within the client computing device. At operation 320, upon determining that the mobile application has not been initialized within the client computing device, chat interface component 106 initiates the chat application within the client computing device. At operation 325, upon determining that the mobile application has been initialized within the client computing device, chat interface component 106 may be configured to determine whether the particular mobile application is compatible to be used with the chat interface of the chat application. At operation 330, upon determining that the mobile application initialized within the client computing device is compatible to be used with the chat interface of the chat application, chat interface component 106 initiates the chat application such that the chat application is displayed adjacent to the mobile application on the client computing device. - Referring back to
FIG. 2, in some embodiments, chat interface component 106 may be configured to initiate client chat application 127 upon initiating client mobile application 148 automatically, i.e., without receiving additional user input. In some embodiments, the automatic initiation of client chat application 127 may be determined by the one or more initiation settings associated with the chat application 126 running on the chat interface server 120. For example, the chat application 126 may be associated with one or more mobile applications that would cause client chat application 127 to be initiated on client computing device 104 upon initiation of those mobile applications without any additional user input. In some embodiments, user 150 may modify which mobile applications would trigger automatic initiation of chat application 127. For example, user 150 may indicate that only a particular mobile application would cause the automatic initiation of client chat application 127. - In some embodiments,
chat interface component 106 may be configured to initiate client mobile application 148 upon receiving user input entered into the chat interface 129 of client chat application 127. For example, user 150 may initiate client mobile application 148 within client computing device 104 by entering a corresponding text command within the chat interface of client chat application 127 on client computing device 104. In some embodiments, the text command may be explicit (e.g., “Start Uber”). In yet other embodiments, chat interface component 106 may be configured to initiate client mobile application 148 upon receiving a text command that is not explicit (e.g., “I need a ride”). - In some embodiments,
chat component 108 may be configured to obtain, manage, and route user input provided and/or exchanged during a chat session. For example, as alluded to above, user 150 can enter user input as one or more natural language commands (i.e., chat messages) entered via chat interface 129 of client chat application 127 on client computing device 104. In some embodiments, user 150 can provide user input to chat interface 129 via a touchscreen, physical buttons, or a keyboard associated with client computing device 104. In yet other embodiments, user 150 can provide user input comprising voice input or other types of input. In some embodiments, chat component 108 may be configured to store obtained user input within one or more previously mentioned memory components associated with chat interface server 120. In some embodiments, chat component 108 may store the messages within database 122. - In some embodiments,
chat processing component 110 may be configured to process user input obtained by chat component 108. For example, chat processing component 110 may process audio input entered via a microphone of client computing device 104. In some embodiments, chat processing component 110 may process user input comprising an audio file by performing one or more operations including, for example, voice recognition, conversion of voice messages into textual format, and/or other such operations. - In some embodiments,
chat processing component 110 may convert the user input comprising an audio file into a text file by converting the audio file into the text file according to a voice recognition process that may be implemented by the chat application 146. For example, after obtaining the user audio file, chat processing component 110 may convert the audio file to the text file according to the voice recognition process algorithm implemented by distributed chat application 126 and/or client chat application 127. In some embodiments, chat processing component 110 may perform voice recognition by means of a pattern matching method and/or other similar method. For example, when using a pattern matching method to perform voice recognition, a training stage and a recognition stage may be used. - As alluded to earlier, user input including natural language commands entered via
chat interface 129 of client chat application 127 may be used to operate a mobile application (e.g., mobile application 148). By virtue of using commands to complete a particular task rather than using a graphical user interface of a particular mobile application, the user is provided with a “store front” experience. That is, rather than deciding what element of the graphical user interface associated with client mobile application 148 to engage with, user 150 is instead prompted by messages from the chat assistant within chat interface 129 of client chat application 127. - In some embodiments,
response component 112 may be configured to handle responses to commands from user 150 within the chat interface 129. For example, response component 112 may be configured to handle responses to user commands generated by automated chat assistants, as alluded to earlier. In some embodiments, an automated assistant may be implemented as a computer program or application (e.g., a software application) that is configured to interact with user 150 via client chat application 127 to provide information or to perform specific actions within mobile application 148. - In some embodiments, the
response component 112 may be configured to provide information items relevant to a user command (e.g., a flight from Denver to Miami with no layovers at a particular price). Utilizing an automated assistant rather than a human assistant permits the automated assistant to review large sets of data in multiple data sources (i.e., mobile applications) within a short period of time. - For example, user 150 may be interested in purchasing direct flight tickets from Denver to Miami, but wants to do so from a provider that has the lowest prices. Conventionally, a user would have to visit multiple mobile applications for individual providers and compare prices. While using a human chat assistant may improve the user experience, as it would eliminate the need for user 150 to visit the provider applications personally, it would still require the human assistant to manually determine which provider offers the best pricing. In contrast, an automated chat assistant may determine which mobile application is best suited for a particular user purpose (i.e., offers the lowest ticket prices) by obtaining information from multiple mobile applications within a reduced time frame, thereby improving the response time.
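The provider-comparison behavior described in this example can be sketched as a small aggregation routine. The sketch below is a hypothetical Python illustration, assuming each provider's offers have already been retrieved from its respective mobile application; the provider names, record fields, and prices are invented for the example and are not part of the disclosure.

```python
# Hypothetical provider data; in the described system each entry would
# come from querying a separate mobile application.
providers = [
    {"provider": "AirA", "route": ("DEN", "MIA"), "layovers": 0, "price": 310},
    {"provider": "AirB", "route": ("DEN", "MIA"), "layovers": 1, "price": 240},
    {"provider": "AirC", "route": ("DEN", "MIA"), "layovers": 0, "price": 280},
]

def best_offer(offers, origin, dest, max_layovers=0):
    """Pick the cheapest offer that satisfies the user's constraints."""
    eligible = [o for o in offers
                if o["route"] == (origin, dest) and o["layovers"] <= max_layovers]
    return min(eligible, key=lambda o: o["price"]) if eligible else None

offer = best_offer(providers, "DEN", "MIA")
assert offer["provider"] == "AirC"   # cheapest direct flight
assert offer["price"] == 280
```

Relaxing the layover constraint changes the result (here, the cheaper one-stop AirB offer would win), which mirrors how the assistant's follow-up questions narrow or widen the candidate set.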
- In some embodiments,
response component 112 and/or other components (e.g., the one or more expert servers 130) of the environment 100 illustrated in FIG. 1 may be configured to use machine learning, i.e., a machine learning model that utilizes machine learning to determine responses to user requests. For example, in a training stage, the expert server (or other component) can be trained using training data (e.g., message training data) of actual or generated messages in a messaging application context, and then at an inference stage can determine suggested items for new messages or other data it receives. For example, the machine learning model can be trained using synthetic data, e.g., data that is automatically generated by a computer, with no use of user information. In some embodiments, the machine learning model can be trained based on sample data, e.g., sample message data, for which permissions to utilize user data for training have been obtained expressly from users providing the message data. For example, sample data may include received messages and responses that were sent to the received messages. Based on the sample data, the model can predict message responses to received messages, which may then be provided as suggested items. - In some embodiments,
response component 112 may be configured to use one or more of a deep learning model, a logistic regression model, a Long Short Term Memory (LSTM) network, a supervised or unsupervised model, etc. In some embodiments, response component 112 may utilize a trained machine learning classification model. For example, the machine learning may include decision trees and forests, hidden Markov models, statistical models, a cache language model, and/or other models. In some embodiments, the machine learning may be unsupervised, semi-supervised, and/or incorporate deep learning techniques. - In some embodiments,
GUI response component 114 may be configured to effectuate actions within mobile interface 149 of client mobile application 148 based on user commands entered via chat interface 129. For example, GUI response component 114 may effectuate presentation of items relevant and/or responsive to a user's request within GUI 149 of client mobile application 148 (e.g., available flights at a particular date, tickets at a particular price, etc.). In some embodiments, user commands may include one or more actions executed by client mobile application 148. For example, the user commands may include booking a flight, making a dinner reservation, requesting to be picked up by a ride share driver, purchasing a pair of shoes, and so on. In some embodiments, GUI response component 114 may execute one or more actions within mobile interface 149 based on user commands. For example, upon receiving a user command to book a ticket, mobile interface 149 may display the flight reservation information associated with the order included in the user command. -
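The command-to-action mapping performed by a component like GUI response component 114 can be sketched as a dispatch table. This is a minimal, hypothetical sketch: the command names, handler signatures, and returned display strings are assumptions for illustration and are not defined by the disclosure.

```python
# Hypothetical dispatch table mapping parsed chat commands to actions a
# GUI response component might effectuate in the mobile interface.

def book_flight(params):
    # Stand-in for updating the mobile interface with reservation info.
    return f"Displaying reservation for flight {params['flight']}"

def request_ride(params):
    # Stand-in for triggering a pickup request in the ride-share app.
    return f"Requesting pickup at {params['pickup']}"

ACTIONS = {
    "book_flight": book_flight,
    "request_ride": request_ride,
}

def effectuate(command, params):
    """Execute the mobile-interface action matching the user command."""
    handler = ACTIONS.get(command)
    if handler is None:
        return "No matching action; prompting user for clarification"
    return handler(params)

assert effectuate("book_flight", {"flight": "DEN-MIA 550"}) == \
    "Displaying reservation for flight DEN-MIA 550"
assert effectuate("dance", {}).startswith("No matching action")
```

A table-driven design like this keeps the chat layer decoupled from the individual mobile applications: supporting a new action only adds an entry, without touching the dispatcher.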
FIGS. 4A-4D illustrate an example chat application comprising a chat interface initiated by a user to control and/or operate a mobile application displayed by the client computing device. For example, in FIG. 4A, a chat interface 429 of a chat application 427 is displayed on a client computing device 404 operated by a user 410. In this example, a chat conversation between user 410 and chat assistant 412 has been initiated. In some embodiments, the chat conversation may occur between user 410 and a human chat assistant or an automated chat assistant. In some embodiments, user commands entered by user 410 to the chat conversation may be displayed in chat interface 429 by chat application 427. For example, user commands from the user 410 can be entered in a text input field 430 of the chat interface 429 (e.g., via input devices such as a physical keyboard, displayed touchscreen keyboard, voice input, etc.). - In this example,
user 410 may initiate a mobile application 448 within client computing device 404 by entering a text command within the chat interface 429 of chat application 427 on client computing device 404. For example, user 410 may enter a text command (e.g., text command 437 illustrated in FIG. 4A) via a text input field 430 using a keyboard 415 associated with chat interface 429. In some embodiments, user 410 may enter voice commands by initiating a voice command interface 420. - In some embodiments, upon initiating
chat application 427, the chat interface 429 of the chat application 427 may display an icon or avatar associated with user 410 and indicate that user 410 is currently engaged in a chat session with chat assistant 412. Chat assistant 412 may be associated with a particular icon or avatar as indicated in FIGS. 4A-4B. - In some embodiments, upon initiating
chat application 427, user 410 may be greeted by chat assistant 412 displayed as a text message 433: “Hi John! Welcome to Chat Interface. What service would you like to use?” In response, user 410 may enter a text command 437, indicating that they want to initiate interaction with a ride sharing service application, by stating “I need an Uber.” - In this example,
user 410 has entered a command in the text input field 430, where the received command is displayed as message 437 in chat interface 429 after being input. This command specifies a mobile application to be displayed in association with the chat application, e.g., in this case a mobile application provided by the ridesharing company Uber Technologies, Inc. In some embodiments, the user command may not have specified the ridesharing application and included only “I need a ride”. In that case, the user may select a mobile application from a list provided by the chat assistant or selected by the chat assistant based on at least one of user preferences, ridesharing availability, best prices, and/or other such parameters. - In response to the
command 437, a response message 439 may be displayed in chat interface 429. For example, response message 439 indicates that the selected mobile application is being initiated. - In response to
user command 437, mobile application 448 may be initiated and displayed within the client computing device 404, as illustrated in FIG. 4B. In some embodiments, mobile application interface 449 is displayed based on data received by the client computing device 404 over the network, e.g., from mobile application 448 at least partially executing on a remote session server or other device connected over the network. - In some embodiments,
mobile application interface 449 is displayed underneath chat interface 429. In this example, mobile interface 449 is displayed such that chat interface 429 is at least partially displayed, e.g., allowing one or more chat messages in the chat interface 429 to be simultaneously displayed with the mobile application interface 449. - In some embodiments, upon initiating
mobile application 448, the size of the chat interface 429 may be reduced so as to accommodate the display of both the mobile application interface 449 and the chat interface 429. In some embodiments, mobile application interface 449 may be configured to display content data associated with mobile application 448 (e.g., map data, driver information, etc.), and a screen control 460 allowing the user to provide user input to enlarge mobile application interface 449 to fit the entire screen (or other display area) of client computing device 404. - In some embodiments,
user 410 may optionally toggle between the full-screen and reduced-screen chat interface 429 illustrated in FIGS. 4A, 4B, respectively. For example, the user may reduce the full-screen chat interface 429 illustrated in FIG. 4A via a screen size button 458. Alternatively, user 410 may enlarge the reduced-screen chat interface 429 illustrated in FIG. 4B via a screen size button 459. In some embodiments, user 410 may reduce the full-screen chat interface 429 or enlarge the reduced-screen chat interface 429 by entering an appropriate text command via text input field 430. - In some embodiments, upon initiating the
mobile application 448, the user may choose to interact with the graphical user interface 449 provided by the mobile application 448 or continue their interaction within the chat interface 429. As illustrated in FIG. 4B, chat assistant 412 may further inquire via a text message 440: “John, do you want an Uber Expert to help you find a ride?” User 410 may provide a response by entering a command 441 indicating that he indeed would like assistance from an expert assistant familiar with this particular mobile application. In some embodiments, chat assistant 412 may inquire whether the user is in need of additional assistance after an occurrence of an event (e.g., user input not received within the mobile interface within a particular time period). -
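The event-triggered assistance check mentioned above (offering help when no user input reaches the mobile interface for some period) can be sketched as a simple inactivity test. The timeout value and the one-offer-only policy below are assumptions made for the sketch, not specified by the disclosure.

```python
# Sketch of an inactivity-triggered assistance prompt. The 60-second
# threshold is an illustrative assumption.
ASSIST_TIMEOUT_SECONDS = 60

def should_offer_help(last_input_time, now, already_offered):
    """Offer expert assistance once after a period of user inactivity."""
    if already_offered:
        return False
    return (now - last_input_time) >= ASSIST_TIMEOUT_SECONDS

assert should_offer_help(last_input_time=0, now=90, already_offered=False)
assert not should_offer_help(last_input_time=0, now=30, already_offered=False)
assert not should_offer_help(last_input_time=0, now=90, already_offered=True)
```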
FIGS. 4C-4D illustrate a chat interface used for a conversation between a user and an expert assistant to control a mobile application. Upon receiving user command 441 (illustrated in FIG. 4B) indicating that user 410 is interested in receiving help from an expert assistant, chat interface 429, illustrated in FIG. 4C, may display an icon or avatar associated with user 410 and indicate that user 410 is currently engaged in a chat session with an expert chat assistant 413. In this example, expert chat assistant 413 may include a particular icon or avatar associated with mobile application 448 currently initiated by the user 410. - As illustrated in
FIG. 4C, expert chat assistant 413 may further inquire in chat interface 429 via a text message 443: “When would you like your Uber driver to arrive?” User 410 may provide a response by entering a command 445 indicating that he wants the Uber driver to arrive in 30 minutes. In some embodiments, expert chat assistant 413 may have the ability to obtain the user's location from client computing device 404 when user 410 permits access to location information. For example, expert chat assistant 413 may inquire whether user 410 needs to be picked up at his current location in a message 447. User 410 may provide a response by entering a command 449 indicating that he wants the Uber driver to arrive at his brother's house. In some embodiments, expert chat assistant 413 may have the ability to obtain location information associated with the user's contacts from client computing device 404 or from application data stored with mobile application 448 when the user 410 permits access to their contact information. Expert chat assistant 413 may further inquire whether user 410 wants to use a particular service (e.g., Uber X, Uber Pool, or Uber Select) associated with mobile application 448 in a message 450. User 410 may provide a response by entering a command 451 indicating that he wants to use the Uber X service. - In
FIG. 4D, expert chat assistant 413 may respond in a message 453 indicating that the Uber driver has been reserved and is arriving at a location specified by user 410 (i.e., the user's brother's house) at a time specified by user 410 (i.e., in 30 minutes). Additionally, mobile interface 449 may be displayed underneath chat interface 429. In this example, mobile interface 449 includes information based on data received from mobile application 448. -
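The exchange shown in FIGS. 4C-4D, in which the expert assistant asks for each missing booking detail in turn and then confirms, follows a slot-filling pattern that can be sketched as below. The slot names and prompt wording are illustrative assumptions, not quotations from the disclosure.

```python
# Sketch of a slot-filling dialog: ask for the first unfilled slot
# (pickup time, location, service tier), then confirm the booking.
RIDE_SLOTS = {
    "pickup_time": "When would you like your driver to arrive?",
    "pickup_location": "Do you need to be picked up at your current location?",
    "service": "Which service would you like to use?",
}

def next_prompt(filled):
    """Return the question for the first unfilled slot, or a confirmation."""
    for slot, question in RIDE_SLOTS.items():
        if slot not in filled:
            return question
    return (f"Your driver is reserved, arriving at {filled['pickup_location']} "
            f"in {filled['pickup_time']} ({filled['service']}).")

booking = {}
assert next_prompt(booking) == RIDE_SLOTS["pickup_time"]
booking["pickup_time"] = "30 minutes"
booking["pickup_location"] = "brother's house"
assert next_prompt(booking) == RIDE_SLOTS["service"]
booking["service"] = "Uber X"
assert "reserved" in next_prompt(booking)
```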
FIGS. 5A-5F illustrate additional example chat applications comprising a chat interface initiated by a user to control and/or operate a mobile application displayed by the client computing device. For example, in FIG. 5A, a chat interface 529 of a chat application 527 is displayed on a client computing device 504 operated by a user 510. In this example, a chat conversation between user 510 and an expert chat assistant 513 has been initiated. In some embodiments, user commands from the user to the chat conversation may be entered via client computing device 504 and displayed in chat interface 529 by chat application 527. For example, user commands from user 510 can be entered in a text input field 530 of chat interface 529 or by initiating a voice interface 520. - In this example,
user 510 may initiate mobile application 548 within client computing device 504 by entering a text command within chat interface 529 of chat application 527 on client computing device 504. For example, the user 510 may enter a text command (e.g., text command 537 illustrated in FIG. 5A) via a text input field 530. - In some embodiments, upon initiating
chat application 527, user 510 may be greeted by expert chat assistant 513 displayed as a text message 535: “Hi John! Welcome to Chat Interface. What service would you like to use?” In response, user 510 may enter a text command 537 by indicating that he wants to initiate interaction with a flight booking application by stating: “I need to go from Grand Juncture to San Juan on Saturday.” Prior to initiating mobile interface 549, expert chat assistant 513 may obtain additional details associated with the request from user 510. For example, expert chat assistant 513 may inquire in a message 539 whether layovers during the flight are acceptable. User 510 may enter a user command 541 indicating that layovers are not acceptable. Next, expert chat assistant 513 may inquire in a message 543 whether early morning departures are acceptable. User 510 may enter a user command 545 indicating that early morning departures are not acceptable. - In
FIG. 5B, expert chat assistant 513 may respond in a message 547 indicating that the best flight matching the user's requirements has been located. The details of the flight 550 may be displayed within mobile application interface 549 of mobile application 548 displayed underneath chat interface 529. In this example, mobile application interface 549 includes information based on data received from mobile application 548, which in turn was provided by chat interface 529 of chat application 527. - In some embodiments,
user 510 may modify one or more previously provided user commands. For example, user 510 may locate a previously provided user command (e.g., by using a scrolling motion), modify it, and obtain a new response based on the modified information. In some embodiments, the response by chat assistant 513 may be provided based on both previously entered user commands and new information. By virtue of user 510 modifying one or more user commands, the chat interface may generate a new response without requiring user 510 to re-enter all of the previously inputted information, which can be time consuming and tedious. - In some embodiments,
chat application 527 may provide a chat application GUI 569, in addition to chat interface 529. Chat application GUI 569 may be used to display the original response and a new response generated based on modified information entered by user 510. In some embodiments, the old response and the new response (i.e., a differential of the old response) may be graphically displayed within chat application GUI 569. In some embodiments, both old and new user commands and old and new responses generated in response to user commands may be displayed graphically to allow user 510 to visualize the conversation. For example, user commands and responses may be displayed hierarchically. That is, a modified user command may be displayed as a branch off of the original user command. Similarly, a corresponding new response generated in response to the modified user command may be displayed alongside the old response. Displaying user commands and responses using branch visualization allows user 510 to view the conversation overview easily. In some embodiments, user 510 may modify user commands in chat application GUI 569 by directly accessing individual branches corresponding to user commands (e.g., by tapping or pressing the icon representing a particular branch). In yet other embodiments, user 510 may compare results for different branches by dragging one branch to the other. - For example, as illustrated in
FIG. 5C, the user may modify previously entered command 537 illustrated in FIG. 5A. In this example, user command 537 input in FIG. 5A indicated that user 510 wants to travel from Grand Juncture to San Juan on Saturday. However, the modified user command 547 illustrated in FIG. 5C indicates that user 510 wishes to travel from Grand Juncture to San Juan on Sunday. Expert chat assistant 513 may respond in a message 549 acknowledging that a modification has been made and asking user 510 to confirm it. Chat application GUI 569 may display the conversation between user 510 and the chat assistant graphically using a branch visualization or other similar method. For example, the original (i.e., the oldest) user command input 537 in FIG. 5A may be displayed as element 571. User commands that followed original user command input 537, e.g., user commands 541, 545, may be displayed following original user command element 571 as elements 572, 573. Modified user command 547 may be displayed as element 571a, i.e., a branch off of original user command element 571. Original response 550 illustrated in FIG. 5B may be displayed as element 574. A modified response may be displayed under the corresponding user command 571a as element 574a. Modified response 574a may be generated based on modified user command 571a and original user commands 572, 573, thus allowing user 510 to avoid re-entering the information. - In some embodiments, as alluded above,
user 510 may drag element 571a to element 571 to compare the results for the original and modified user commands. In other embodiments, user 510 may request a comparison by inputting a user command asking for a comparison rather than by dragging or otherwise manipulating elements within chat application GUI 569. For example, as illustrated in FIG. 5D, user 510 may input a user command 581 asking for a comparison. In response, expert chat assistant 513 may generate a comparison as indicated by a response 582. The comparison may include original response 550 (i.e., details of the flight from Grand Juncture to San Juan on Saturday) illustrated in FIG. 5B, and a new response 551 (i.e., details of the flight from Grand Juncture to San Juan on Sunday). - In some embodiments, the modified response to modified user commands may be displayed within
mobile interface 549 of mobile application 548. For example, in FIG. 5E, expert chat assistant 513 may respond in a message 583 acknowledging that a modification has been made and asking user 510 to confirm it. User 510 may confirm by inputting user command 584. In response, expert chat assistant 513 may respond in a message 585 indicating that the best flight matching the user's requirements has been located. Modified responses (i.e., details of the flight from Grand Juncture to San Juan on Sunday rather than Saturday) may be displayed within the mobile interface 549 underneath the chat interface 529. In this example, the mobile interface 549 includes information based on data received from the mobile application 548, which in turn was provided from the chat interface 529 of the chat application 527. -
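The hierarchical command-and-response model underlying the branch visualization of FIGS. 5C-5E can be sketched as a small tree structure. This is a minimal sketch, assuming each user command is a node that may carry a response and that modifying a command adds a sibling branch (the relationship of elements 571, 571a, 574, and 574a); the class and field names are illustrative.

```python
# Minimal sketch of the branch-visualization data model: each user
# command is a node, a modified command branches off the original, and
# a response attaches to the command that produced it.
class CommandNode:
    def __init__(self, text, parent=None):
        self.text = text
        self.parent = parent
        self.branches = []   # modified versions of this command
        self.response = None

    def modify(self, new_text):
        """Branch off a modified command (cf. element 571 -> 571a)."""
        branch = CommandNode(new_text, parent=self.parent)
        self.branches.append(branch)
        return branch

original = CommandNode("Grand Juncture to San Juan on Saturday")
original.response = "flight 550 (Saturday)"

modified = original.modify("Grand Juncture to San Juan on Sunday")
modified.response = "flight 551 (Sunday)"

# Comparing a branch against the original pairs their responses,
# mirroring the side-by-side comparison of responses 550 and 551.
comparison = (original.response, modified.response)
assert comparison == ("flight 550 (Saturday)", "flight 551 (Sunday)")
assert original.branches[0] is modified
```

Because the modified command keeps a link to the original, a new response can reuse the unmodified slots (the layover and departure-time answers) without the user re-entering them.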
FIG. 6 illustrates an example computing module 600, an example of which may be a processor/controller resident on a mobile device, or a processor/controller used to operate a payment transaction device, that may be used to implement various features and/or functionality of the systems and methods disclosed in the present disclosure. - As used herein, the term module might describe a given unit of functionality that can be performed in accordance with one or more embodiments of the present application. As used herein, a module might be implemented utilizing any form of hardware, software, or a combination thereof. For example, one or more processors, controllers, ASICs, PLAs, PALs, CPLDs, FPGAs, logical components, software routines or other mechanisms might be implemented to make up a module. In implementation, the various modules described herein might be implemented as discrete modules or the functions and features described can be shared in part or in total among one or more modules. In other words, as would be apparent to one of ordinary skill in the art after reading this description, the various features and functionality described herein may be implemented in any given application and can be implemented in one or more separate or shared modules in various combinations and permutations. Even though various features or elements of functionality may be individually described or claimed as separate modules, one of ordinary skill in the art will understand that these features and functionality can be shared among one or more common software and hardware elements, and such description shall not require or imply that separate hardware or software components are used to implement such features or functionality.
- Where components or modules of the application are implemented in whole or in part using software, in one embodiment, these software elements may be implemented to operate with a computing or processing module capable of carrying out the functionality described with respect thereto. One such example computing module is shown in
FIG. 6. Various embodiments are described in terms of this example computing module 600. After reading this description, it will become apparent to a person skilled in the relevant art how to implement the application using other computing modules or architectures. - Referring now to
FIG. 6, computing module 600 may represent, for example, computing or processing capabilities found within desktop, laptop, notebook, and tablet computers; hand-held computing devices (tablets, PDAs, smart phones, cell phones, palmtops, etc.); mainframes, supercomputers, workstations or servers; or any other type of special-purpose or general-purpose computing devices as may be desirable or appropriate for a given application or environment. Computing module 600 might also represent computing capabilities embedded within or otherwise available to a given device. For example, a computing module might be found in other electronic devices such as, for example, digital cameras, navigation systems, cellular telephones, portable computing devices, modems, routers, WAPs, terminals, and other electronic devices that might include some form of processing capability. -
Computing module 600 might include, for example, one or more processors, controllers, control modules, or other processing devices, such as a processor 604. Processor 604 might be implemented using a general-purpose or special-purpose processing engine such as, for example, a microprocessor, controller, or other control logic. In the illustrated example, processor 604 is connected to a bus 602, although any communication medium can be used to facilitate interaction with other components of computing module 600 or to communicate externally. -
Computing module 600 might also include one or more memory modules, simply referred to herein as main memory 608. For example, preferably random access memory (RAM) or other dynamic memory might be used for storing information and instructions to be executed by processor 604. Main memory 608 might also be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 604. Computing module 600 might likewise include a read only memory ("ROM") or other static storage device coupled to bus 602 for storing static information and instructions for processor 604. - The
computing module 600 might also include one or more various forms of information storage devices 610, which might include, for example, a media drive 612 and a storage unit interface 620. The media drive 612 might include a drive or other mechanism to support fixed or removable storage media 614. For example, a hard disk drive, a floppy disk drive, a magnetic tape drive, an optical disk drive, a CD or DVD drive (R or RW), or other removable or fixed media drive might be provided. Accordingly, storage media 614 might include, for example, a hard disk, a floppy disk, magnetic tape, cartridge, optical disk, a CD or DVD, or other fixed or removable medium that is read by, written to, or accessed by media drive 612. As these examples illustrate, the storage media 614 can include a computer usable storage medium having stored therein computer software or data. - In alternative embodiments,
information storage devices 610 might include other similar instrumentalities for allowing computer programs or other instructions or data to be loaded into computing module 600. Such instrumentalities might include, for example, a fixed or removable storage unit 622 and a storage unit interface 620. Examples of such storage units 622 and storage unit interfaces 620 can include a program cartridge and cartridge interface, a removable memory (for example, a flash memory or other removable memory module) and memory slot, a PCMCIA slot and card, and other fixed or removable storage units 622 and interfaces 620 that allow software and data to be transferred from the storage unit 622 to computing module 600. -
Computing module 600 might also include a communications interface 624. Communications interface 624 might be used to allow software and data to be transferred between computing module 600 and external devices. Examples of communications interface 624 might include a modem or softmodem, a network interface (such as an Ethernet, network interface card, WiMedia, IEEE 802.XX or other interface), a communications port (such as, for example, a USB port, IR port, RS232 port, Bluetooth® interface, or other port), or other communications interface. Software and data transferred via communications interface 624 might typically be carried on signals, which can be electronic, electromagnetic (which includes optical) or other signals capable of being exchanged by a given communications interface 624. These signals might be provided to communications interface 624 via a channel 628. This channel 628 might carry signals and might be implemented using a wired or wireless communication medium. Some examples of a channel might include a phone line, a cellular link, an RF link, an optical link, a network interface, a local or wide area network, and other wired or wireless communications channels. - In this document, the terms "computer program medium" and "computer usable medium" are used to generally refer to transitory or non-transitory media such as, for example,
memory 608, storage unit interface 620, media 614, and channel 628. These and other various forms of computer program media or computer usable media may be involved in carrying one or more sequences of one or more instructions to a processing device for execution. Such instructions, embodied on the medium, are generally referred to as "computer program code" or a "computer program product" (which may be grouped in the form of computer programs or other groupings). When executed, such instructions might enable the computing module 600 to perform features or functions of the present application as discussed herein. - Various embodiments have been described with reference to specific exemplary features thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the various embodiments as set forth in the appended claims. The specification and figures are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
- Although described above in terms of various exemplary embodiments and implementations, it should be understood that the various features, aspects, and functionality described in one or more of the individual embodiments are not limited in their applicability to the particular embodiment with which they are described, but instead can be applied, alone or in various combinations, to one or more of the other embodiments of the present application, whether or not such embodiments are described and whether or not such features are presented as being a part of a described embodiment. Thus, the breadth and scope of the present application should not be limited by any of the above-described exemplary embodiments.
- Terms and phrases used in the present application, and variations thereof, unless otherwise expressly stated, should be construed as open ended as opposed to limiting. As examples of the foregoing: the term “including” should be read as meaning “including, without limitation” or the like; the term “example” is used to provide exemplary instances of the item in discussion, not an exhaustive or limiting list thereof; the terms “a” or “an” should be read as meaning “at least one,” “one or more” or the like; and adjectives such as “conventional,” “traditional,” “normal,” “standard,” “known” and terms of similar meaning should not be construed as limiting the item described to a given time period or to an item available as of a given time, but instead should be read to encompass conventional, traditional, normal, or standard technologies that may be available or known now or at any time in the future. Likewise, where this document refers to technologies that would be apparent or known to one of ordinary skill in the art, such technologies encompass those apparent or known to the skilled artisan now or at any time in the future.
- The presence of broadening words and phrases such as “one or more,” “at least,” “but not limited to” or other like phrases in some instances shall not be read to mean that the narrower case is intended or required in instances where such broadening phrases may be absent. The use of the term “module” does not imply that the components or functionality described or claimed as part of the module are all configured in a common package. Indeed, any or all of the various components of a module, whether control logic or other components, can be combined in a single package or separately maintained and can further be distributed in multiple groupings or packages or across multiple locations.
- Additionally, the various embodiments set forth herein are described in terms of exemplary block diagrams, flow charts and other illustrations. As will become apparent to one of ordinary skill in the art after reading this document, the illustrated embodiments and their various alternatives can be implemented without confinement to the illustrated examples. For example, block diagrams and their accompanying description should not be construed as mandating a particular architecture or configuration.
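The specification above describes program instructions that, when executed by a processing device such as processor 604, operate a mobile application through a conversation interface. As a minimal illustrative sketch only (not the patented method — all function and variable names below are hypothetical and not drawn from the claims), such instructions might map a chat utterance to an application action roughly as follows:

```python
# Hypothetical sketch: routing a chat utterance to a mobile-application
# action. The intent matcher and action names are illustrative assumptions,
# not the patent's actual implementation.

def parse_intent(utterance: str) -> str:
    """Tiny keyword-based intent matcher (illustrative only)."""
    text = utterance.lower()
    if "balance" in text:
        return "show_balance"
    if "transfer" in text or "send" in text:
        return "start_transfer"
    return "fallback"

def handle_utterance(utterance: str, actions: dict) -> str:
    """Dispatch the parsed intent to the matching application action."""
    intent = parse_intent(utterance)
    return actions.get(intent, actions["fallback"])()

# Example action table a mobile application might expose to the chat layer.
ACTIONS = {
    "show_balance": lambda: "Your balance is $120.00",
    "start_transfer": lambda: "Who would you like to send money to?",
    "fallback": lambda: "Sorry, I didn't understand that.",
}
```

In practice, the intent matcher would be replaced by a natural-language-understanding component, and the action table by calls into the mobile application's own screens and services; the sketch only shows the dispatch shape.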
Claims (18)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/567,870 US10963218B1 (en) | 2019-09-09 | 2019-09-11 | Systems and methods for operating a mobile application using a conversation interface |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/565,452 US10992605B2 (en) | 2019-09-09 | 2019-09-09 | Systems and methods for operating a mobile application using a conversation interface |
US16/567,870 US10963218B1 (en) | 2019-09-09 | 2019-09-11 | Systems and methods for operating a mobile application using a conversation interface |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/565,452 Continuation US10992605B2 (en) | 2019-09-09 | 2019-09-09 | Systems and methods for operating a mobile application using a conversation interface |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210072952A1 true US20210072952A1 (en) | 2021-03-11 |
US10963218B1 US10963218B1 (en) | 2021-03-30 |
Family
ID=74849729
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/565,452 Active US10992605B2 (en) | 2019-09-09 | 2019-09-09 | Systems and methods for operating a mobile application using a conversation interface |
US16/567,870 Active US10963218B1 (en) | 2019-09-09 | 2019-09-11 | Systems and methods for operating a mobile application using a conversation interface |
US17/242,153 Active US11474783B2 (en) | 2019-09-09 | 2021-04-27 | Systems and methods for operating a mobile application using a conversation interface |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/565,452 Active US10992605B2 (en) | 2019-09-09 | 2019-09-09 | Systems and methods for operating a mobile application using a conversation interface |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/242,153 Active US11474783B2 (en) | 2019-09-09 | 2021-04-27 | Systems and methods for operating a mobile application using a conversation interface |
Country Status (5)
Country | Link |
---|---|
US (3) | US10992605B2 (en) |
EP (1) | EP4028881A4 (en) |
CA (1) | CA3150298A1 (en) |
GB (1) | GB2605015A (en) |
WO (1) | WO2021050513A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11599302B2 (en) * | 2019-09-11 | 2023-03-07 | Samsung Electronics Co., Ltd. | Storage device and method of operating storage device |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7351233B2 (en) * | 2020-02-13 | 2023-09-27 | トヨタ自動車株式会社 | Program, control device, and control method |
US11605187B1 (en) * | 2020-08-18 | 2023-03-14 | Corel Corporation | Drawing function identification in graphics applications |
WO2023205103A1 (en) * | 2022-04-18 | 2023-10-26 | Celligence International Llc | Method and computing apparatus for operating a form-based interface |
US20230393872A1 (en) * | 2022-06-03 | 2023-12-07 | Apple Inc. | Digital assistant integration with system interface |
US20230419301A1 (en) * | 2022-06-24 | 2023-12-28 | Celligence International Llc | Using a conversation interface to transfer digital assets |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9830589B2 (en) | 2002-10-01 | 2017-11-28 | Zhou Tian Xing | Systems and methods for mobile application, wearable application, transactional messaging, calling, digital multimedia capture, payment transactions, and one touch payment, one tap payment, and one touch service |
CN101588888A (en) * | 2007-01-17 | 2009-11-25 | 住友电气工业株式会社 | Laser processing device and its processing method |
US9177284B2 (en) * | 2007-10-29 | 2015-11-03 | International Business Machines Corporation | Instant conversation in a thread of an online discussion forum |
US20110251954A1 (en) | 2008-05-17 | 2011-10-13 | David H. Chin | Access of an online financial account through an applied gesture on a mobile device |
US9666185B2 (en) * | 2014-10-06 | 2017-05-30 | Nuance Communications, Inc. | Automatic data-driven dialog discovery system |
US9585159B2 (en) | 2014-12-19 | 2017-02-28 | Qualcomm Incorporated | Opportunistic dual-band relay |
US10516690B2 (en) | 2015-02-10 | 2019-12-24 | Cequence Security, Inc. | Physical device detection for a mobile application |
US10200824B2 (en) | 2015-05-27 | 2019-02-05 | Apple Inc. | Systems and methods for proactively identifying and surfacing relevant content on a touch-sensitive device |
US9578173B2 (en) | 2015-06-05 | 2017-02-21 | Apple Inc. | Virtual assistant aided communication with 3rd party service in a communication session |
US10686738B2 (en) * | 2015-07-24 | 2020-06-16 | Facebook, Inc. | Providing personal assistant service via messaging |
US10225471B2 (en) | 2016-03-18 | 2019-03-05 | Kenneth L. Poindexter, JR. | System and method for autonomously recording a visual media |
US10158593B2 (en) * | 2016-04-08 | 2018-12-18 | Microsoft Technology Licensing, Llc | Proactive intelligent personal assistant |
US10474946B2 (en) * | 2016-06-24 | 2019-11-12 | Microsoft Technology Licensing, Llc | Situation aware personal assistant |
US10546586B2 (en) * | 2016-09-07 | 2020-01-28 | International Business Machines Corporation | Conversation path rerouting in a dialog system based on user sentiment |
DE102016011654A1 (en) | 2016-09-27 | 2017-04-06 | Daimler Ag | Method for controlling an access authorization and / or driving authorization for a vehicle |
WO2019118399A1 (en) * | 2017-12-15 | 2019-06-20 | Walmart Apollo, Llc | Systems and methods for conserving user device resources during an online or virtual shopping session |
DK180639B1 (en) | 2018-06-01 | 2021-11-04 | Apple Inc | DISABILITY OF ATTENTION-ATTENTIVE VIRTUAL ASSISTANT |
US10504518B1 (en) | 2018-06-03 | 2019-12-10 | Apple Inc. | Accelerated task performance |
CA3050378C (en) | 2018-07-25 | 2022-09-13 | Accenture Global Solutions Limited | Natural language control of web browsers |
2019
- 2019-09-09 US US16/565,452 patent/US10992605B2/en active Active
- 2019-09-11 US US16/567,870 patent/US10963218B1/en active Active

2020
- 2020-09-09 GB GB2203311.2A patent/GB2605015A/en active Pending
- 2020-09-09 CA CA3150298A patent/CA3150298A1/en active Pending
- 2020-09-09 EP EP20862680.4A patent/EP4028881A4/en active Pending
- 2020-09-09 WO PCT/US2020/049891 patent/WO2021050513A1/en unknown

2021
- 2021-04-27 US US17/242,153 patent/US11474783B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
WO2021050513A1 (en) | 2021-03-18 |
GB2605015A (en) | 2022-09-21 |
CA3150298A1 (en) | 2021-03-18 |
GB202203311D0 (en) | 2022-04-20 |
US20210075748A1 (en) | 2021-03-11 |
US10992605B2 (en) | 2021-04-27 |
EP4028881A1 (en) | 2022-07-20 |
EP4028881A4 (en) | 2023-10-18 |
US20210247959A1 (en) | 2021-08-12 |
US11474783B2 (en) | 2022-10-18 |
US10963218B1 (en) | 2021-03-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10963218B1 (en) | Systems and methods for operating a mobile application using a conversation interface | |
US11099867B2 (en) | Virtual assistant focused user interfaces | |
US20240073167A1 (en) | Determining contextually relevant application templates associated with electronic message content | |
US20220405986A1 (en) | Virtual image generation method, device, terminal and storage medium | |
US11803415B2 (en) | Automating tasks for a user across their mobile applications | |
US10088972B2 (en) | Virtual assistant conversations | |
US7895530B2 (en) | User definable interface system, method, support tools, and computer program product | |
US11003315B2 (en) | Terminal device and sharing method thereof | |
US11651162B2 (en) | Composite entity for rule driven acquisition of input data to chatbots | |
CN115917512A (en) | Artificial intelligence request and suggestion card | |
US11848012B2 (en) | System and method for providing voice assistant service | |
US20240069706A1 (en) | Method and apparatus for displaying co-hosting, electronic device and computer readable medium | |
CN110519155B (en) | Information processing method and system | |
US20230297961A1 (en) | Operating system facilitation of content sharing | |
US11797529B2 (en) | Rendering interactive subsidiary application(s) in response to a search request | |
US20190138329A1 (en) | User interface for efficient user-software interaction | |
CN114008590B (en) | Providing an auxiliary user interface using execution blocks | |
EP4270186A1 (en) | Image display method and apparatus, device, and medium | |
CN117742832A (en) | Page guiding configuration method, page guiding method and equipment | |
CN117853600A (en) | Image generation method, device, electronic equipment and storage medium | |
CN115248655A (en) | Method and apparatus for displaying information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
AS | Assignment |
Owner name: PAG FINANCIAL INTERNATIONAL LLC, PUERTO RICO Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:AGARWAL, PAVAN;SANCHEZ, GABRIEL ALBORS;RIVERA, JONATHAN ORTIZ;SIGNING DATES FROM 20200107 TO 20210106;REEL/FRAME:055439/0295 |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
AS | Assignment |
Owner name: CELLIGENCE INTERNATIONAL LLC, PUERTO RICO Free format text: CHANGE OF NAME;ASSIGNOR:PAG FINANCIAL INTERNATIONAL LLC;REEL/FRAME:057923/0614 Effective date: 20210629 |