US20200120165A1 - Using Voice Commands From A Device To Remotely Access And Control A Computer - Google Patents
- Publication number
- US20200120165A1
- Authority
- US
- United States
- Prior art keywords
- mobile device
- command
- computer
- data
- voice
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
- H04L67/125—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/46—Multiprogramming arrangements
- G06F9/48—Program initiating; Program switching, e.g. by interrupt
- G06F9/4806—Task transfer initiation or dispatching
- G06F9/4843—Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/08—Network architectures or network communication protocols for network security for authentication of entities
- H04L63/083—Network architectures or network communication protocols for network security for authentication of entities using passwords
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M11/00—Telephonic communication systems specially adapted for combination with other electrical systems
- H04M11/007—Telephonic communication systems specially adapted for combination with other electrical systems with remote control systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M3/00—Automatic or semi-automatic exchanges
- H04M3/42—Systems providing special services or facilities to subscribers
- H04M3/487—Arrangements for providing information services, e.g. recorded voice services or time announcements
- H04M3/493—Interactive information services, e.g. directory enquiries ; Arrangements therefor, e.g. interactive voice response [IVR] systems or voice portals
- H04M3/4938—Interactive information services, e.g. directory enquiries ; Arrangements therefor, e.g. interactive voice response [IVR] systems or voice portals comprising a voice browser which renders and interprets, e.g. VoiceXML
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/26—Speech to text systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2203/00—Aspects of automatic or semi-automatic exchanges
- H04M2203/10—Aspects of automatic or semi-automatic exchanges related to the purpose or context of the telephonic communication
- H04M2203/1033—Remote administration, e.g. of web servers
Definitions
- the present invention pertains to the field of computer operating systems, and more specifically to a system for using voice commands from a mobile device to remotely access and control a computer, and a method of use that allows a person to use voice commands from a mobile device to remotely access and control a computer.
- a system for using voice commands from a mobile device to remotely access and control a computer, and a method of use, are provided that allow a person to use voice commands from a mobile device to remotely access and control a computer.
- a system and method of use are provided that allow a person to use voice commands from a mobile device to remotely interact with a computer.
- a method of using voice commands from a mobile device to remotely access and control a computer includes receiving audio data from the mobile device at the computer.
- the audio data is decoded into a command.
- a software program that the command was provided for is determined.
- At least one process is executed at the computer in response to the command.
- Output data is generated at the computer in response to executing at least one process at the computer.
- the output data is transmitted to the mobile device.
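The claimed sequence, receive audio, decode it into a command, determine which software program the command was provided for, execute a process, and return output, can be sketched as follows. This is a minimal illustration only, not the patented implementation; all names (`decode_audio`, `determine_target`, `handle_voice_command`) are hypothetical, and real speech recognition is stubbed out.

```python
# Hypothetical sketch of the claimed method flow; not the patent's implementation.

def decode_audio(audio_data):
    """Stand-in for speech recognition: map raw audio to a command string."""
    # A real system would run a speech-to-text engine here.
    return audio_data.strip().lower()

def determine_target(command, os_commands, app_commands):
    """Decide whether the command was provided for the operating system or a native application."""
    if command in os_commands:
        return "operating_system"
    if command in app_commands:
        return "native_application"
    return None

def handle_voice_command(audio_data, os_commands, app_commands, execute):
    """Receive audio data, decode it, dispatch it, and return output data for the device."""
    command = decode_audio(audio_data)
    target = determine_target(command, os_commands, app_commands)
    if target is None:
        return {"error": f"unrecognized command: {command!r}"}
    output = execute(target, command)            # at least one process executed at the computer
    return {"target": target, "output": output}  # output data transmitted back to the device
```

The `execute` callback stands in for whatever operating-system or native-application process the computer would actually run.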
- the present invention provides many important technical advantages.
- One important technical advantage of the present invention is a system that allows a person to use voice commands from a mobile device to remotely access and control a computer and see and hear the images and sounds generated at the computer, in response to the voice commands, in a manner similar to what the person would see and hear, if the person were directly and locally accessing and controlling the computer using a voice command system.
- FIG. 1 is a diagram of a system for using voice commands from a mobile device to remotely access and control a computer in accordance with an exemplary embodiment of the present invention
- FIG. 2 is a diagram of a system for providing a mobile device interface in accordance with an exemplary embodiment of the present invention
- FIG. 3 is a diagram of a system for providing an audio command interface in accordance with an exemplary embodiment of the present invention.
- FIG. 4 is a diagram of a method for using voice commands from a mobile device to remotely access and control a general purpose computer in accordance with an exemplary embodiment of the present invention.
- FIG. 1 is a diagram of system 100 for using voice commands from a mobile device to remotely access and control a computer in accordance with an exemplary embodiment of the present invention.
- System 100 allows a person to use voice commands from a mobile device to remotely access and control a computer, whereby the person can operate the operating system at the computer, operate native applications at the computer, and see and hear the images and sounds generated at the computer in response to the voice commands.
- System 100 includes mobile device 102 , which can be a suitable mobile device such as a cellular phone, smart phone, touch-screen device, personal digital assistant, tablet device, notebook device, laptop device, or other suitable mobile device that allows communication with a computer via a wireless or wire-line network or a suitable combination of a wireless and wire-line network.
- System 100 also includes general purpose computer 104 , which can be a general purpose processing platform or other suitable processing platforms.
- General purpose computer 104 includes mobile device interface 106 , audio command interface 108 , operating system interface 110 , and native applications 112 , each of which can be implemented in hardware, software, or a suitable combination of hardware and software, which can be one or more software systems operating on a general purpose processing platform.
- “hardware” can include a combination of discrete components, an integrated circuit, an application-specific integrated circuit, a field programmable gate array, or other suitable hardware.
- software can include one or more objects, agents, threads, lines of code, subroutines, separate software applications, two or more lines of code or other suitable software structures operating in two or more software applications or on two or more processors, or other suitable software structures.
- software can include one or more lines of code or other suitable software structures operating in a general purpose software application, such as an operating system; and one or more lines of code or other suitable software structures operating in a specific purpose software application.
- Mobile device 102 is coupled to general purpose computer 104 via communications medium 114 .
- “communications medium” can include a wire-line communications medium, a wireless communications medium, an optical communications medium, an analog communications medium, a digital communications medium, other suitable communications media or a suitable combination of communications media.
- the term “coupled” and its cognate terms such as “couples” or “couple,” can include a physical connection (such as a wire, optical fiber, or a telecommunications medium), a virtual connection (such as through randomly assigned memory locations of a data memory device or a hypertext transfer protocol (HTTP) link), a logical connection (such as through one or more semiconductor devices in an integrated circuit), or other suitable connections.
- a communications medium can be a network or other suitable communications media.
- Mobile device interface 106 receives voice or data information from mobile device 102 .
- mobile device interface 106 is configured to monitor communications medium 114 , such as through modems, interface devices, or other suitable mechanisms for interfacing with communications medium 114 .
- mobile device interface 106 can monitor a public switched telephone network (PSTN) modem that responds to ring signals when a call is being made from mobile device 102 to general purpose computer 104 , such as through a dedicated telephone number.
- Mobile device interface 106 can answer or monitor the answering of the PSTN modem and determine whether mobile device 102 is calling or whether some other device, such as a fax machine or other computer is calling.
- mobile device interface 106 can operate or monitor a network connection, such as over a local area network, DSL modem, cable modem or other suitable internet or network connections and can determine whether mobile device 102 has transmitted data to general purpose computer 104 .
- mobile device interface 106 can receive addressing data, such as through the use of keypad entries or other data control devices that can transmit network addressed data to general purpose computer 104 at a predetermined network address.
- Mobile device interface 106 can receive such network addressed data and determine whether it has been transmitted by mobile device 102 or other suitable devices. If it is determined by mobile device interface 106 that mobile device 102 has transmitted data to general purpose computer 104 , mobile device interface 106 establishes a session with mobile device 102 to allow mobile device 102 to interact with general purpose computer 104 .
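The interface's decision to establish a session only when the sender is a recognized mobile device (rather than, say, a fax machine or another computer) might be sketched as below. The packet shape, the `device_id` field, and the session bookkeeping are assumptions for illustration, not details from the patent.

```python
# Hypothetical sketch of session establishment by a mobile device interface.

def is_mobile_device(packet, known_device_ids):
    """Decide whether incoming data was transmitted by a registered mobile device."""
    return packet.get("device_id") in known_device_ids

def establish_session(packet, known_device_ids, sessions):
    """Open a session only for a recognized mobile device; reject other callers."""
    if not is_mobile_device(packet, known_device_ids):
        return None                      # e.g. a fax machine or other computer is calling
    session_id = len(sessions) + 1
    sessions[session_id] = packet["device_id"]
    return session_id
```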
- Audio command interface 108 receives audio commands from mobile device interface 106 .
- audio command interface 108 can receive data from mobile device interface 106 and detect audio commands in the data.
- mobile device interface 106 may receive data from mobile device 102 without determining whether the data includes audio data, keypad entry data, keyboard entry data, or other suitable data.
- mobile device interface 106 can perform voice recognition and other suitable processing and can provide voice data to audio command interface 108 .
- Audio command interface 108 determines whether voice data corresponds to an audio command. In one exemplary embodiment, audio command interface 108 can determine whether voice data corresponds to one of two or more predetermined audio commands. Audio command interface 108 can also execute such detected commands. Audio command interface 108 can also provide a list of available commands to the person using mobile device 102 , such as by presenting prompts to the person, by allowing the person to request a list of available audio commands, or in other suitable manners. Likewise, audio command interface 108 can include one or more states, such that certain audio commands are available depending upon the state of audio command interface 108 . In this exemplary embodiment, when audio command interface 108 is being used to access and control a native application 112 or operating system interface 110 , audio command interface 108 may change states to provide different audio commands to the person using mobile device 102 .
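The state-dependent vocabulary described above, where the available commands change as the person enters an application, and a person can request the list of available commands, can be illustrated with a small state machine. The class name, the state labels, and the "list" keyword are hypothetical stand-ins, not the patent's design.

```python
# Hypothetical sketch of a stateful audio command interface.

class AudioCommandInterface:
    """The commands on offer depend on the current state (e.g. which application is active)."""

    def __init__(self, commands_by_state):
        # commands_by_state maps each state to {spoken command: next state}.
        self.commands_by_state = commands_by_state
        self.state = "top"

    def available_commands(self):
        return sorted(self.commands_by_state[self.state])

    def handle(self, spoken):
        if spoken == "list":                     # person asks what they can say
            return self.available_commands()
        if spoken in self.commands_by_state[self.state]:
            # Entering an application changes the state, and thus the vocabulary.
            self.state = self.commands_by_state[self.state][spoken]
            return f"ok: {spoken}"
        return "not recognized"
```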
- Operating system interface 110 allows audio command interface 108 to activate various operating system commands.
- audio command interface 108 can include a file of available operating system commands that can be matched with voice data, such as operating system commands that would otherwise be available to a person directly and locally accessing and controlling general purpose computer 104 using a voice command system.
- an operating system command may include a file search command that can be activated through audio command interface 108 and operating system interface 110 .
- the person using mobile device 102 , after establishing a session with mobile device interface 106 , could state the command “search”, and audio command interface 108 could cause the search functionality of operating system interface 110 to be activated. The person could then further identify the information to be searched for, such as documents, pictures, videos, or all files and folders, and other suitable processes could then be performed.
- Native applications 112 can include one or more native applications accessed and controlled at general purpose computer 104 .
- native applications 112 can be loaded into or interface with audio command interface 108 , such as by installing an applications program interface (API) or other suitable data into audio command interface 108 that identifies native applications 112 and provides available commands for audio command interface 108 to interface with native applications 112 .
- certain predetermined commands for native applications 112 can be provided, such as commands that one person would be authorized to use, whereas commands another person would not be authorized to use can be inhibited or blocked for that person (such as a print command, a command allowing the person to turn off general purpose computer 104 , or other suitable commands).
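Per-person command blocking of the kind described, inhibiting commands such as "print" or a power-off command for persons not authorized to use them, reduces to a simple filter. The function names and the blocklist structure below are illustrative assumptions.

```python
# Hypothetical sketch of per-person command authorization.

def allowed_commands(all_commands, blocked_for_person):
    """Return the commands this person is authorized to use."""
    return {c for c in all_commands if c not in blocked_for_person}

def authorize(command, person, blocklist):
    """True if the person may run the command; blocked commands are inhibited."""
    return command not in blocklist.get(person, set())
```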
- system 100 allows a person to use voice commands from mobile device 102 to remotely access and control general purpose computer 104 over communications medium 114 .
- system 100 allows mobile device 102 to remotely access and control both operating system functions and native applications at general purpose computer 104 .
- the person using mobile device 102 can see and hear the images and sounds generated at general purpose computer 104 , regardless of the native application or operating system function that is used to generate such images and sounds.
- each native application 112 or operating system of general purpose computer 104 is not required to be configured to remotely interact with mobile device 102 .
- system 100 allows mobile device 102 to switch between native applications that are configured and operating system functions and native applications that are not configured for interaction with mobile device 102 .
- FIG. 2 is a diagram of system 200 for providing a mobile device interface in accordance with an exemplary embodiment of the present invention.
- System 200 includes network interface 202 , native application video output conversion 204 , and native application audio output conversion 206 , each of which can be implemented in hardware, software, or a suitable combination of hardware and software, which can be one or more software systems operating on a general purpose processing platform.
- Network interface 202 provides an interface between general purpose computer 104 and communications medium 114 or other suitable networks.
- network interface 202 can monitor a modem, such as a PSTN modem, cable modem, DSL modem, or other suitable modems for incoming data traffic that indicates that mobile device 102 or other suitable devices are attempting to interface with general purpose computer 104 .
- network interface 202 can monitor a network card, such as a local area network connection, network interface card, or other suitable devices.
- Network interface 202 receives the incoming mobile device 102 data and performs voice or data recognition on the data.
- Native application video output conversion 204 receives native application video output from general purpose computer 104 , such as video data that has been generated by a native application, and converts the video data into a format for transmission to mobile device 102 or other suitable mobile devices.
- native application video output conversion 204 can receive screen information from general purpose computer 104 operating under a native application or operating system command and can convert the screen information into a format for transmission to mobile device 102 .
- native application video output conversion 204 can convert the native application video output into a different format for viewing at mobile device 102 , such as a format that excludes data that would not provide additional functionality, but would otherwise require excessive bandwidth requirements for transfer to mobile device 102 .
- Native application audio output conversion 206 receives native application audio output from general purpose computer 104 , such as audio data that has been generated by a native application, and converts the audio data into a format for transmission to mobile device 102 or other suitable mobile devices.
- Native application audio output conversion 206 is coordinated with native application video output conversion 204 , so as to provide audio output that is correlated with video output, allowing video conferencing, playback of audio-visual data, or other suitable processes.
- native application audio output conversion 206 can convert the native application audio output into a different format for hearing at mobile device 102 , such as a format that excludes data that would not provide additional functionality, but would otherwise require excessive bandwidth requirements for transfer to mobile device 102 .
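One crude way to realize the coordinated bandwidth reduction described for the audio and video output is to decimate both streams by the same factor, so the audio stays correlated with the video. This sketch assumes list-based streams and invented function names; it is purely illustrative.

```python
# Hypothetical sketch of coordinated audio/video conversion for a mobile device.

def decimate(samples, factor):
    """Keep every factor-th element: a crude stand-in for bandwidth reduction."""
    return samples[::factor]

def convert_for_mobile(video_frames, audio_samples, factor):
    """Reduce both streams by the same factor so audio output stays correlated with video output."""
    return decimate(video_frames, factor), decimate(audio_samples, factor)
```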
- system 200 allows a person using mobile device 102 to remotely provide commands to general purpose computer 104 to cause native applications or operating system functions to change state or otherwise perform suitable functions at general purpose computer 104 , while at the same time allowing the person using mobile device 102 to see and hear the images and sounds generated by the native applications or operating system functions at general purpose computer 104 .
- FIG. 3 is a diagram of system 300 for providing an audio command interface in accordance with an exemplary embodiment of the present invention.
- System 300 allows voice commands to be provided to the operating system or native applications operating at general purpose computer 104 .
- System 300 includes voice to command conversion 302 , operating system command system 304 , native application command system 306 and functionality limitation system 308 , each of which can be implemented in hardware, software, or a suitable combination of hardware and software, which can be one or more software systems operating on a general purpose processing platform.
- Voice to command conversion 302 receives voice data and determines whether the voice data matches one or more predetermined commands. In one exemplary embodiment, voice to command conversion 302 can determine if the voice data is in a proper sequence to be received as a command, has occurred at a point in time to be interpreted as a command, and can perform confirmatory functions such as repeating the command to the person or other suitable functions.
- voice to command conversion 302 can include one or more states, such as states based on an operating system function, a native application, or other function being accessed and controlled at a present time.
- voice to command conversion 302 can limit the number of commands to a predetermined set of commands relevant to the operating system function or native application.
- voice to command conversion 302 can allow the person using mobile device 102 to obtain a list of available commands, such as by stating “list”, can allow the person to request confirmation of commands, such as by requesting “confirm,” or can perform other suitable functions.
- Operating system command system 304 allows a person to use voice commands from mobile device 102 to remotely access and control the operating system at general purpose computer 104 .
- operating system command system 304 can allow the person to perform predetermined operating system commands such as a search command, a run command, a program list command, or other suitable commands.
- the person can speak a command that causes the operating system to generate a display of a predetermined number of last programs that were run by the operating system.
- operating system command system 304 can interact with the person, such as by allowing the person to obtain a list of available operating system commands, allowing the person to query the operating system to obtain a list of native applications that are available for operation, or in other suitable manners.
- Native application command system 306 allows a person to use voice commands from mobile device 102 to remotely access and control the native applications at general purpose computer 104 .
- native application command system 306 can include one or more application programming interfaces having a predetermined set of commands that can be used to operate a native application.
- native application command system 306 can allow the person to request a list of available native application commands, can confirm whether a spoken command received from the person was meant to be one of two or more similar native application commands, or can perform other suitable functions.
- native application command system 306 can request the person to repeat a command that was not understood, can ask the person to choose between one of two similar commands, can allow the person to navigate backwards a predetermined number of command steps or reset, or can provide other suitable functions.
- Functionality limitation system 308 interacts with network interface 202 to perform password or other authorization processes, requiring the person using mobile device 102 to be authenticated before being allowed access and control of general purpose computer 104 in whole or in part.
- speech or image data can be received and compared with stored speech or image data.
- the speech or image data can also include a plurality of sets of speech or image data.
- the speech data can be selected from a set that causes one of a plurality of predetermined processes to be performed, such as a first phrase or term that causes a first native application to be executed, a second phrase or term that causes a second native application to be executed, and so forth.
- certain terms or phrases can be used in public to provide limited functionality to native applications and other terms or phrases can be used in private to provide access and control of additional native applications or functions of such native applications.
- files of image data showing different authorized persons or authorized persons in different states can also or alternatively be provided; and an image of the person using mobile device 102 can be transmitted from mobile device 102 to general purpose computer 104 to authorize access and control of general purpose computer 104 .
- the transmitted image data can be compared to the stored image data and it can be determined whether the data matches within a predetermined tolerance, such as by using traditional image data comparison algorithms or processes that identify a plurality of points or features in the facial images for comparison.
- Combinations of audio and video data can also or alternatively be used for password or other authorization processes and other suitable functions.
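The tolerance-based comparison of feature points described above, including matching against a plurality of stored images of authorized persons in different states, might look like the following sketch. The mean-distance metric and all names are assumptions for illustration; the patent does not specify a particular comparison algorithm.

```python
# Hypothetical sketch of image authentication by feature-point comparison.

def features_match(stored, candidate, tolerance):
    """Match if the mean distance between corresponding feature points is within tolerance."""
    dists = [abs(a - b) for a, b in zip(stored, candidate)]
    return sum(dists) / len(dists) <= tolerance

def authenticate(candidate, stored_sets, tolerance):
    """Accept if the transmitted image matches any stored image of an authorized person."""
    return any(features_match(s, candidate, tolerance) for s in stored_sets)
```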
- system 300 provides an audio command interface that allows a person to use voice commands from mobile device 102 to access and control native applications or operating system functions at general purpose computer 104 .
- Audio command interface 108 provides the person with the ability to access and control both native applications that are configured for direct interaction with a mobile device as well as native applications that are not configured for direct interaction with a mobile device, allowing a person to start applications, to switch between applications, to shut down applications, or to perform other suitable functions.
- System 300 can also limit the access and control the person using mobile device 102 can have of general purpose computer 104 .
- FIG. 4 is a diagram of method 400 for using voice commands from a mobile device to remotely access and control a general purpose computer in accordance with an exemplary embodiment of the present invention.
- Method 400 begins at 402 where a call is received from a mobile device.
- the call can be received over a public switched telephone network, a wireless network, the internet, or other suitable networks.
- the call can be encrypted or coded in other suitable manners. The method then proceeds to 404 .
- a session is established with the mobile device.
- a session can be established utilizing mobile device identification and confirmation, utilizing a process such as a text password and person ID, a spoken or visual password and person ID where the person using the mobile device transmits audio or image data of themselves, which is confirmed using suitable processes at the general purpose processor, or in other suitable manners.
- the image of the person using the mobile device can be compared with a plurality of their images using known image comparison processes to determine whether a match exists within a predetermined tolerance, allowing the person using the mobile device to be recognized in different states, such as with long hair, short hair, unkempt hair, with or without facial hair or make-up, or in other suitable states.
- the method then proceeds to 406 .
- a prompt can be provided to the person using the mobile device to enter a voice or data command, or other suitable processes are performed.
- the prompt can confirm that a session has been established between the mobile device and the general purpose processing platform or other suitable computer, and can provide the person with a list of available voice commands.
- other suitable processes can also or alternatively be performed, such as receipt of data commands entered by a keypad entry or keyboard entry. If it is determined at 406 that a voice or data command has been received, the method then proceeds to 410 .
- the voice or data command is decoded.
- a voice command can be decoded by determining the equivalent word for a spoken word, whereas a data command can be decoded by determining whether predetermined control data precedes the data that identifies it as a data command such as a key entry or other suitable data. The method then proceeds to 412 .
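Distinguishing a data command by predetermined control data preceding it, as described, can be sketched as below. The prefix bytes are invented for illustration; the patent does not define a particular encoding.

```python
# Hypothetical sketch of separating data commands from voice commands.

CONTROL_PREFIX = b"\x02KEY"   # invented marker identifying keypad/keyboard data commands

def decode_command(raw):
    """Return ('data', payload) for control-prefixed input, else ('voice', payload)."""
    if raw.startswith(CONTROL_PREFIX):
        return ("data", raw[len(CONTROL_PREFIX):].decode())
    return ("voice", raw.decode())
```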
- At 412 , it is determined whether an operating system command has been received. If so, the method proceeds to 414 , where the operating system command is executed. In one exemplary embodiment, an operating system command can be used to generate audio and video output data, such as a command to play a movie file using a video player of the operating system, or other suitable processes.
- the operating system command can include a request to list available operating system commands, such that the operating system command executed at 414 is selected from a list of available operating system commands.
- operating system commands such as “find” or “run” can be utilized in conjunction with spelling commands, such as where the person spells an alphanumeric file identifier or program name.
- a document, spreadsheet, photographic image, audio recording, video recording, web page or other suitable data viewer/player can be used to find and view/play a document, spreadsheet, photographic image, audio recording, video recording, web page or other suitable data; and navigation commands such as “back” or “forward” can be used to navigate through a set of files of documents, spreadsheets, photographic images, audio recordings, video recordings, web pages, search results of such data, or other suitable data.
- the method then proceeds to 422 .
- the audio output data, video output data, or audio and video output data, generated at the computer is converted to a mobile device format.
- the audio output data and video output data can be converted from a format generated at the general purpose processing platform to a format compatible with the mobile device, such as one having a reduced amount of data, a reduced number of pixels, reduced definition, reduced audio content, or other suitable formats that are optimized for the mobile device.
- the audio output data and video output data can be converted into a format that excludes data that would not provide additional functionality, but would otherwise require excessive bandwidth requirements for transfer to mobile device 102 .
- the audio output data and video output data can be encrypted or coded in other suitable manners. The method then proceeds to 424 .
- the audio output data, video output data, or audio and video output data, converted to a mobile device format is transmitted to the mobile device.
- the audio output data and video output data are coordinated with each other, so as to provide audio output that is correlated with video output, allowing video conferencing, playback of audio-visual data, or other suitable processes.
- the audio output data can be transmitted without the video output data being transmitted.
- the video output data can be transmitted without the audio output data being transmitted.
- audio output data and video output data can be converted into a format that allows data transmission to the mobile device in accordance with the bandwidth available at the time of the transmission. The method then proceeds to 426 .
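Selecting an output format in accordance with the bandwidth available at the time of transmission might be sketched as a threshold table. The profile names and kbit/s thresholds below are illustrative assumptions only.

```python
# Hypothetical sketch of bandwidth-dependent format selection.

PROFILES = [               # (minimum kbit/s needed, profile name) - illustrative values
    (1000, "high"),
    (300, "medium"),
    (0, "audio_only"),     # e.g. audio output transmitted without video output
]

def pick_profile(available_kbps):
    """Choose the richest output format the currently available bandwidth can carry."""
    for min_kbps, name in PROFILES:
        if available_kbps >= min_kbps:
            return name
```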
- If no response is received, the method proceeds to 428 and terminates. Likewise, the person using the mobile device can be prompted to confirm whether or not they have received the converted audio output data, the converted video output data, or whether they are still active. A “ping” command or other suitable operations can also or alternatively be performed to determine whether a connection still exists with the mobile device. If a response is received at 426 , the method then returns to 406 .
- a native application command can include a request to list available native applications, a command to request a list of available native application commands (such as from an API), or other suitable native application commands. If it is determined at 416 that a native application command has been received, the method then proceeds to 418 where the command is executed. The method then proceeds to 422 . Otherwise, if it is determined at 416 that the native application command has not been received, the method proceeds to 420 where an error message is generated. In one exemplary embodiment, the error message can include feedback to the person using the mobile device that the voice or data command received was not recognized as an available operating system or native application command. The method then returns to 406 .
- method 400 allows a person using a mobile device to remotely interact with a computer, whereby the person is able to command the computer from the mobile device and see and hear the images and sounds generated at the computer, in response to the commands received from the mobile device, in a manner similar to what the person would see and hear, if the person were directly and locally interacting with the computer.
Abstract
Description
- This application is a continuation of U.S. application Ser. No. 16/677,351 filed on Nov. 7, 2019, which is a continuation of U.S. application Ser. No. 16/655,061 filed on Oct. 16, 2019, which is a continuation of U.S. Pat. No. 10,491,676 issued on Nov. 26, 2019, which is a continuation of U.S. Pat. No. 9,794,348 issued on Oct. 17, 2017, the entire disclosures of which are incorporated by reference.
- Not applicable.
- The present invention pertains to the field of computer operating systems, and more specifically to a system for using voice commands from a mobile device to remotely access and control a computer, and a method of use that allows a person to use voice commands from a mobile device to remotely access and control a computer.
- Systems for allowing a person to use voice commands from a mobile device to remotely access and control a computer are known in the art. However, such prior art systems are application-specific, meaning they are configured to allow the person to use voice commands from a mobile device to remotely access and control a specific application at a computer. Therefore, the prior art systems require the person to have multiple mobile devices and/or systems to remotely access and control the different applications at a computer. Additionally, the prior art systems limit the audible and visible feedback the person can receive from a computer while using voice commands from a mobile device to remotely access and control the computer.
- In accordance with the present invention, a system is provided for using voice commands from a mobile device to remotely access and control a computer, and a method of use is provided that allows a person to use voice commands from a mobile device to remotely access and control a computer.
- In particular, a system and method of use are provided that allow a person to use voice commands from a mobile device to remotely interact with a computer.
- In accordance with an exemplary embodiment of the present invention, a method of using voice commands from a mobile device to remotely access and control a computer is provided. The method includes receiving audio data from the mobile device at the computer. The audio data is decoded into a command. The software program for which the command was intended is determined. At least one process is executed at the computer in response to the command. Output data is generated at the computer in response to executing at least one process at the computer. The output data is transmitted to the mobile device.
- The present invention provides many important technical advantages. One important technical advantage of the present invention is a system that allows a person to use voice commands from a mobile device to remotely access and control a computer and see and hear the images and sounds generated at the computer, in response to the voice commands, in a manner similar to what the person would see and hear, if the person were directly and locally accessing and controlling the computer using a voice command system.
- Those skilled in the art will further appreciate the advantages and superior features of the invention together with other important aspects thereof on reading the detailed description that follows in conjunction with the drawings.
- For a detailed description of the preferred embodiments of the invention, reference will now be made to the accompanying drawings in which:
- FIG. 1 is a diagram of a system for using voice commands from a mobile device to remotely access and control a computer in accordance with an exemplary embodiment of the present invention;
- FIG. 2 is a diagram of a system for providing a mobile device interface in accordance with an exemplary embodiment of the present invention;
- FIG. 3 is a diagram of a system for providing an audio command interface in accordance with an exemplary embodiment of the present invention; and
- FIG. 4 is a diagram of a method for using voice commands from a mobile device to remotely access and control a general purpose computer in accordance with an exemplary embodiment of the present invention.
- In the description that follows, like parts are marked throughout the specification and drawings with the same reference numerals, respectively. The drawing figures might not be to scale, and certain components can be shown in generalized or schematic form and identified by commercial designations in the interest of clarity and conciseness.
- FIG. 1 is a diagram of system 100 for using voice commands from a mobile device to remotely access and control a computer in accordance with an exemplary embodiment of the present invention. System 100 allows a person to use voice commands from a mobile device to remotely access and control a computer, whereby the person can operate the operating system at the computer, operate native applications at the computer, and see and hear the images and sounds generated at the computer in response to the voice commands.
- System 100 includes mobile device 102, which can be a suitable mobile device such as a cellular phone, smart phone, touch-screen device, personal digital assistant, tablet device, notebook device, laptop device, or other suitable mobile device that allows communication with a computer via a wireless or wire-line network or a suitable combination of a wireless and wire-line network.
- System 100 also includes general purpose computer 104, which can be a general purpose processing platform or other suitable processing platforms. General purpose computer 104 includes mobile device interface 106, audio command interface 108, operating system interface 110, and native applications 112, each of which can be implemented in hardware, software, or a suitable combination of hardware and software, which can be one or more software systems operating on a general purpose processing platform. As used herein, “hardware” can include a combination of discrete components, an integrated circuit, an application-specific integrated circuit, a field programmable gate array, or other suitable hardware. As used herein, “software” can include one or more objects, agents, threads, lines of code, subroutines, separate software applications, two or more lines of code or other suitable software structures operating in two or more software applications or on two or more processors, or other suitable software structures. In one exemplary embodiment, software can include one or more lines of code or other suitable software structures operating in a general purpose software application, such as an operating system, and one or more lines of code or other suitable software structures operating in a specific purpose software application.
- Mobile device 102 is coupled to general purpose computer 104 via communications medium 114. As used herein, “communications medium” can include a wire-line communications medium, a wireless communications medium, an optical communications medium, an analog communications medium, a digital communications medium, other suitable communications media, or a suitable combination of communications media. As used herein, the term “coupled” and its cognate terms, such as “couples” or “couple,” can include a physical connection (such as a wire, optical fiber, or a telecommunications medium), a virtual connection (such as through randomly assigned memory locations of a data memory device or a hypertext transfer protocol (HTTP) link), a logical connection (such as through one or more semiconductor devices in an integrated circuit), or other suitable connections. In one exemplary embodiment, a communications medium can be a network or other suitable communications media.
- Mobile device interface 106 receives voice or data information from mobile device 102. In one exemplary embodiment, mobile device interface 106 is configured to monitor communications medium 114, interface devices, or other suitable mechanisms for interfacing with communications medium 114. In this exemplary embodiment, mobile device interface 106 can monitor a public switched telephone network (PSTN) modem that responds to ring signals when a call is being made from mobile device 102 to general purpose computer 104, such as through a dedicated telephone number. Mobile device interface 106 can answer or monitor the answering of the PSTN modem and determine whether mobile device 102 is calling or whether some other device, such as a fax machine or other computer, is calling. Likewise, mobile device interface 106 can operate or monitor a network connection, such as over a local area network, DSL modem, cable modem, or other suitable internet or network connections, and can determine whether mobile device 102 has transmitted data to general purpose computer 104. In one exemplary embodiment, mobile device interface 106 can receive addressing data, such as through the use of keypad entries or other data control devices that can transmit network addressed data to general purpose computer 104 at a predetermined network address. Mobile device interface 106 can receive such network addressed data and determine whether it has been transmitted by mobile device 102 or other suitable devices. If it is determined by mobile device interface 106 that mobile device 102 has transmitted data to general purpose computer 104, mobile device interface 106 establishes a session with mobile device 102 to allow mobile device 102 to interact with general purpose computer 104.
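For illustration only, the session-establishment behavior described above can be sketched as follows. The class name, the device-type header, and the session bookkeeping are hypothetical stand-ins, not part of the disclosure; a real interface would monitor a modem or network connection as described.

```python
# Hypothetical sketch: inspect an incoming connection, decide whether the
# caller is a mobile device (here via an assumed device-type header), and
# establish a session only for mobile devices.

DEVICE_HEADER = "X-Device-Type"  # illustrative header name

class MobileDeviceInterface:
    def __init__(self):
        self.sessions = {}       # device_id -> session state
        self._next_session = 1

    def handle_incoming(self, headers, device_id):
        """Answer an incoming connection; open a session for mobile devices only."""
        if headers.get(DEVICE_HEADER) != "mobile":
            return None          # e.g. a fax machine or other computer: no session
        session_id = self._next_session
        self._next_session += 1
        self.sessions[device_id] = {"id": session_id, "state": "established"}
        return session_id

iface = MobileDeviceInterface()
sid = iface.handle_incoming({"X-Device-Type": "mobile"}, device_id="phone-1")
rejected = iface.handle_incoming({"X-Device-Type": "fax"}, device_id="fax-1")
```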
- Audio command interface 108 receives audio commands from mobile device interface 106. In one exemplary embodiment, audio command interface 108 can receive data from mobile device interface 106 and detect audio commands in the data. For example, mobile device interface 106 may receive data from mobile device 102 without determining whether the data includes audio data, keypad entry data, keyboard entry data, or other suitable data. Likewise, mobile device interface 106 can perform voice recognition and other suitable processing and can provide voice data to audio command interface 108.
- Audio command interface 108 determines whether voice data corresponds to an audio command. In one exemplary embodiment, audio command interface 108 can determine whether voice data corresponds to one of two or more predetermined audio commands. Audio command interface 108 can also execute such detected commands. Audio command interface 108 can also provide a list of available commands to the person using mobile device 102, such as by presenting prompts to the person, by allowing the person to request a list of available audio commands, or in other suitable manners. Likewise, audio command interface 108 can include one or more states, such that certain audio commands are available depending upon the state of audio command interface 108. In this exemplary embodiment, when audio command interface 108 is being used to access and control a native application 112 or operating system interface 110, audio command interface 108 may change states to provide different audio commands to the person using mobile device 102.
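The state-dependent command availability described above might be modeled as in this sketch; the states and command vocabularies below are invented for illustration and are not taken from the disclosure.

```python
# Illustrative sketch of a stateful audio command interface: which commands
# are recognized depends on the interface's current state.

COMMANDS_BY_STATE = {                       # hypothetical states and commands
    "operating_system": {"search", "run", "list"},
    "media_player":     {"play", "pause", "stop", "list"},
}

class AudioCommandInterface:
    def __init__(self, state="operating_system"):
        self.state = state

    def available_commands(self):
        """List the commands available in the current state."""
        return sorted(COMMANDS_BY_STATE[self.state])

    def match(self, voice_text):
        """Return the command if the recognized text is valid in this state."""
        word = voice_text.strip().lower()
        return word if word in COMMANDS_BY_STATE[self.state] else None

aci = AudioCommandInterface()
assert aci.match("search") == "search"      # valid in the operating_system state
aci.state = "media_player"                  # state change alters availability
```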
- Operating system interface 110 allows audio command interface 108 to activate various operating system commands. In one exemplary embodiment, audio command interface 108 can include a file of available operating system commands that can be matched with voice data, such as operating system commands that would otherwise be available to a person directly and locally accessing and controlling general purpose computer 104 using a voice command system. For example, an operating system command may include a file search command that can be activated through audio command interface 108 and operating system interface 110. In this exemplary embodiment, the person using mobile device 102, after establishing a session with mobile device interface 106, could state the command “search”, and audio command interface 108 could cause the search functionality of operating system interface 110 to be activated. The person could then further identify the information to be searched for, such as documents, pictures, videos, or all files and folders, or other suitable processes can be performed.
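A minimal sketch of matching a decoded voice command such as “search” against a table of available operating system commands follows. The command table and the stand-in search function are hypothetical; no real operating system call is made here.

```python
# Illustrative mapping from recognized voice commands to operating-system
# actions. os_search is a stand-in for the OS file-search functionality.

def os_search(query):
    # Hypothetical file list standing in for an operating-system search.
    files = ["report.doc", "vacation.jpg", "notes.txt"]
    return [f for f in files if query in f]

OS_COMMANDS = {"search": os_search}         # file of available OS commands

def execute_os_command(command, argument):
    """Run the matched operating system command, or None if unavailable."""
    action = OS_COMMANDS.get(command)
    if action is None:
        return None                         # not an available OS command
    return action(argument)
```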
- Native applications 112 can include one or more native applications accessed and controlled at general purpose computer 104. In one exemplary embodiment, native applications 112 can be loaded into or interface with audio command interface 108, such as by installing an application programming interface (API) or other suitable data into audio command interface 108 that identifies native applications 112 and provides available commands for audio command interface 108 to interface with native applications 112. In this exemplary embodiment, certain predetermined commands for native applications 112 can be provided, such as commands that one person would be authorized to use, whereas commands another person would not be authorized to use can be inhibited or blocked for that person (such as a print command, a command allowing the person to turn off general purpose computer 104, or other suitable commands).
- In operation, system 100 allows a person to use voice commands from mobile device 102 to remotely access and control general purpose computer 104 over communications medium 114. Unlike prior art systems that require voice command interoperability to be provided for each separate native application, system 100 allows mobile device 102 to remotely access and control both operating system functions and native applications at general purpose computer 104. In this manner, the person using mobile device 102 can see and hear the images and sounds generated at general purpose computer 104, regardless of the native application or operating system function that is used to generate such images and sounds. As such, each native application 112 or operating system of general purpose computer 104 is not required to be configured to remotely interact with mobile device 102. Likewise, where native applications at general purpose computer 104 are configured to remotely interact with mobile device 102, system 100 allows mobile device 102 to switch between native applications that are so configured and operating system functions and native applications that are not configured for interaction with mobile device 102.
- FIG. 2 is a diagram of system 200 for providing a mobile device interface in accordance with an exemplary embodiment of the present invention. System 200 includes network interface 202, native application video output conversion 204, and native application audio output conversion 206, each of which can be implemented in hardware, software, or a suitable combination of hardware and software, which can be one or more software systems operating on a general purpose processing platform.
- Network interface 202 provides an interface between general purpose computer 104 and communications medium 114 or other suitable networks. In one exemplary embodiment, network interface 202 can monitor a modem, such as a PSTN modem, cable modem, DSL modem, or other suitable modems, for incoming data traffic that indicates that mobile device 102 or other suitable devices are attempting to interface with general purpose computer 104. Likewise, network interface 202 can monitor a network card, such as a local area network connection, network interface card, or other suitable devices. Network interface 202 receives the incoming mobile device 102 data and performs voice or data recognition on the data.
- Native application video output conversion 204 receives native application video output from general purpose computer 104, such as video data that has been generated by a native application, and converts the video data into a format for transmission to mobile device 102 or other suitable mobile devices. In one exemplary embodiment, native application video output conversion 204 can receive screen information from general purpose computer 104 operating under a native application or operating system command and can convert the screen information into a format for transmission to mobile device 102. Likewise, native application video output conversion 204 can convert the native application video output into a different format for viewing at mobile device 102, such as a format that excludes data that would not provide additional functionality but would otherwise impose excessive bandwidth requirements for transfer to mobile device 102.
- Native application audio output conversion 206 receives native application audio output from general purpose computer 104, such as audio data that has been generated by a native application, and converts the audio data into a format for transmission to mobile device 102 or other suitable mobile devices. Native application audio output conversion 206 is coordinated with native application video output conversion 204, so as to provide audio output that is correlated with video output, allowing video conferencing, playback of audio-visual data, or other suitable processes. Likewise, native application audio output conversion 206 can convert the native application audio output into a different format for hearing at mobile device 102, such as a format that excludes data that would not provide additional functionality but would otherwise impose excessive bandwidth requirements for transfer to mobile device 102.
- In operation, system 200 allows a person using mobile device 102 to remotely provide commands to general purpose computer 104 to cause native applications or operating system functions to change state or otherwise perform suitable functions at general purpose computer 104, while at the same time allowing the person using mobile device 102 to see and hear the images and sounds generated by the native applications or operating system functions at general purpose computer 104.
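One simple way to realize the pixel-reduction conversion described for the video output path is naive 2x spatial downsampling. This toy sketch assumes a frame represented as a nested list of pixels; a real converter would negotiate a target format and codec with the mobile device.

```python
# Illustrative sketch of reducing the number of pixels in a frame before
# transmission to the mobile device: keep every other pixel in each
# dimension, quartering the data volume.

def downconvert_frame(frame):
    """Return a frame with half the rows and half the columns."""
    return [row[::2] for row in frame[::2]]

# A hypothetical 4x4 "frame" whose pixels are (row, col) tuples.
frame = [[(r, c) for c in range(4)] for r in range(4)]
small = downconvert_frame(frame)            # now a 2x2 frame
```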
- FIG. 3 is a diagram of system 300 for providing an audio command interface in accordance with an exemplary embodiment of the present invention. System 300 allows voice commands to be provided to the operating system or native applications operating at general purpose computer 104.
- System 300 includes voice to command conversion 302, operating system command system 304, native application command system 306, and functionality limitation system 308, each of which can be implemented in hardware, software, or a suitable combination of hardware and software, which can be one or more software systems operating on a general purpose processing platform. Voice to command conversion 302 receives voice data and determines whether the voice data matches one or more predetermined commands. In one exemplary embodiment, voice to command conversion 302 can determine whether the voice data is in a proper sequence to be received as a command and has occurred at a point in time to be interpreted as a command, and can perform confirmatory functions such as repeating the command to the person or other suitable functions. In one exemplary embodiment, voice to command conversion 302 can include one or more states, such as states based on an operating system function, a native application, or other function being accessed and controlled at a present time. In this exemplary embodiment, when the person using mobile device 102 is accessing and controlling a certain operating system function or native application at general purpose computer 104, voice to command conversion 302 can limit the number of commands to a predetermined set of commands relevant to the operating system function or native application. Likewise, voice to command conversion 302 can allow the person using mobile device 102 to obtain a list of available commands, such as by stating “list”, can allow the person to request confirmation of commands, such as by stating “confirm”, or can perform other suitable functions.
- Operating system command system 304 allows a person to use voice commands from mobile device 102 to remotely access and control the operating system at general purpose computer 104. In one exemplary embodiment, operating system command system 304 can allow the person to perform predetermined operating system commands such as a search command, a run command, a program list command, or other suitable commands. In this exemplary embodiment, the person can speak a command that causes the operating system to generate a display of a predetermined number of the last programs that were run by the operating system. Likewise, operating system command system 304 can interact with the person, such as by allowing the person to obtain a list of available operating system commands, allowing the person to query the operating system to obtain a list of native applications that are available for operation, or in other suitable manners.
- Native application command system 306 allows a person to use voice commands from mobile device 102 to remotely access and control the native applications at general purpose computer 104. In one exemplary embodiment, native application command system 306 can include one or more application programming interfaces having a predetermined set of commands that can be used to operate a native application. In this exemplary embodiment, native application command system 306 can allow the person to request a list of available native application commands, can confirm whether a spoken command received from the person was meant to be one of two or more similar native application commands, or can perform other suitable functions. For example, native application command system 306 can request the person to repeat a command that was not understood, can ask the person to choose between one of two similar commands, can allow the person to navigate backwards a predetermined number of command steps or reset, or can provide other suitable functions.
- Functionality limitation system 308 interacts with network interface 202 to perform password or other authorization processes, requiring the person using mobile device 102 to be authenticated before being allowed access to and control of general purpose computer 104 in whole or in part. In one exemplary embodiment, speech or image data can be received and compared with stored speech or image data. The speech or image data can also include a plurality of sets of speech or image data. In this exemplary embodiment, the speech data can be selected from a set that causes one of a plurality of predetermined processes to be performed, such as a first phrase or term that causes a first native application to be executed, a second phrase or term that causes a second native application to be executed, and so forth. Likewise, certain terms or phrases can be used in public to provide limited functionality to native applications, and other terms or phrases can be used in private to provide access to and control of additional native applications or functions of such native applications.
- In another exemplary embodiment, files of image data showing different authorized persons, or authorized persons in different states (such as with long hair, short hair, unkempt hair, with or without facial hair or make-up, or in other states), can also or alternatively be provided, and an image of the person using mobile device 102 can be transmitted from mobile device 102 to general purpose computer 104 to authorize access to and control of general purpose computer 104. The transmitted image data can be compared to the stored image data, and it can be determined whether the data matches within a predetermined tolerance, such as by using traditional image data comparison algorithms or processes that identify a plurality of points or features in the facial images for comparison. Combinations of audio and video data can also or alternatively be used for password or other authorization processes and other suitable functions.
- In operation, system 300 provides an audio command interface that allows a person to use voice commands from mobile device 102 to access and control native applications or operating system functions at general purpose computer 104. Audio command interface 108 provides the person with the ability to access and control both native applications that are configured for direct interaction with a mobile device and native applications that are not, allowing the person to start applications, to switch between applications, to shut down applications, or to perform other suitable functions. System 300 can also limit the access to and control of general purpose computer 104 that the person using mobile device 102 can have.
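Under simplifying assumptions, the tolerance-based image comparison described for authorization could look like the following sketch, which compares small sets of facial feature points against stored sets for the same person in different states. The point representation, distance measure, and tolerance value are all illustrative, not taken from the disclosure.

```python
# Hypothetical sketch: accept a transmitted image if its feature points match
# any stored set (e.g. clean-shaven vs. with facial hair) within a
# predetermined tolerance on mean point-to-point distance.

def points_match(candidate, stored, tolerance=2.0):
    """Mean Euclidean distance between paired points must be within tolerance."""
    total = sum(((cx - sx) ** 2 + (cy - sy) ** 2) ** 0.5
                for (cx, cy), (sx, sy) in zip(candidate, stored))
    return total / len(candidate) <= tolerance

def authorize(candidate, stored_sets, tolerance=2.0):
    """Accept if any stored state of the authorized person matches."""
    return any(points_match(candidate, s, tolerance) for s in stored_sets)

stored = [[(10, 10), (20, 10), (15, 18)],   # one stored state
          [(10, 11), (20, 11), (15, 20)]]   # another state of the same person
assert authorize([(10, 10), (20, 10), (15, 18)], stored)
```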
- FIG. 4 is a diagram of method 400 for using voice commands from a mobile device to remotely access and control a general purpose computer in accordance with an exemplary embodiment of the present invention. Method 400 begins at 402, where a call is received from a mobile device. In one exemplary embodiment, the call can be received over a public switched telephone network, a wireless network, the internet, or other suitable networks. In another exemplary embodiment, the call can be encrypted or coded in other suitable manners. The method then proceeds to 404.
- At 404, a session is established with the mobile device. In one exemplary embodiment, a session can be established utilizing mobile device identification and confirmation, utilizing a process such as a text password and person ID, or a spoken or visual password and person ID where the person using the mobile device transmits audio or image data of themselves, which is confirmed using suitable processes at the general purpose processor, or in other suitable manners. In another exemplary embodiment, the image of the person using the mobile device can be compared with a plurality of their images using known image comparison processes to determine whether a match exists within a predetermined tolerance, allowing the person using the mobile device to be recognized in different states, such as with long hair, short hair, unkempt hair, with or without facial hair or make-up, or in other suitable states. The method then proceeds to 406.
- At 406, it is determined whether a voice or data command has been received. If it is determined that a voice or data command has not been received after a predetermined time, the method then proceeds to 408, where a prompt can be provided to the person using the mobile device to enter a voice or data command, or other suitable processes are performed. In one exemplary embodiment, the prompt can confirm that a session has been established between the mobile device and the general purpose processing platform or other suitable computer, and can provide the person with a list of available voice commands. Likewise, other suitable processes can also or alternatively be performed, such as receipt of data commands entered by a keypad entry or keyboard entry. If it is determined at 406 that a voice or data command has been received, the method then proceeds to 410.
- At 410, the voice or data command is decoded. In one exemplary embodiment, a voice command can be decoded by determining the equivalent word for a spoken word, whereas a data command can be decoded by determining whether the data is preceded by predetermined control data that identifies it as a data command, such as a key entry or other suitable data. The method then proceeds to 412.
- At 412, it is determined whether the command is an operating system command. If it is determined that an operating system command has been received, the method then proceeds to 414 where the operating system command is executed. In this exemplary embodiment, an operating system command can be used to generate audio and video output data, such as a command to play a movie file using a video player of the operating system, or other suitable processes. In another exemplary embodiment, the operating system command can include a request to list available operating system commands, such that the operating system command executed at 414 is selected from a list of available operating system commands. In another exemplary embodiment, operating system commands such as “find” or “run” can be utilized in conjunction with spelling commands, such as where the person spells an alphanumeric file identifier or program name. The spelled name can then be repeated back to the person for confirmation, or other suitable processes can be used. In another exemplary embodiment, a document, spreadsheet, photographic image, audio recording, video recording, web page or other suitable data viewer/player can be used to find and view/play a document, spreadsheet, photographic image, audio recording, video recording, web page or other suitable data; and navigation commands such as “back” or “forward” can be used to navigate through a set of files of documents, spreadsheets, photographic images, audio recordings, video recordings, web pages, search results of such data, or other suitable data. The method then proceeds to 422.
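The branching at 412 through 420 can be summarized in code. The command tables below are hypothetical examples; only the control flow mirrors the method steps described above.

```python
# Illustrative dispatch for method 400: try the decoded command as an
# operating system command first (412/414), then as a native application
# command (416/418), else generate an error message (420).

OS_COMMANDS = {"search", "run", "list programs"}        # hypothetical
NATIVE_COMMANDS = {"open spreadsheet", "play movie"}    # hypothetical

def dispatch(command):
    if command in OS_COMMANDS:
        return ("operating_system", command)    # step 414: execute OS command
    if command in NATIVE_COMMANDS:
        return ("native_application", command)  # step 418: execute app command
    return ("error", "command not recognized")  # step 420: error message
```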
- At 422, the audio output data, video output data, or audio and video output data generated at the computer is converted to a mobile device format. In one exemplary embodiment, the audio output data and video output data can be converted from a format generated at the general purpose processing platform to a format compatible with the mobile device, such as one having a reduced amount of data, a reduced number of pixels, reduced definition, reduced audio content, or other suitable formats that are optimized for the mobile device. In another exemplary embodiment, the audio output data and video output data can be converted into a format that excludes data that would not provide additional functionality but would otherwise impose excessive bandwidth requirements for transfer to mobile device 102. In another exemplary embodiment, the audio output data and video output data can be encrypted or coded in other suitable manners. The method then proceeds to 424.
- At 424, the audio output data, video output data, or audio and video output data converted to a mobile device format is transmitted to the mobile device. In one exemplary embodiment, the audio output data and video output data are coordinated with each other, so as to provide audio output that is correlated with video output, allowing video conferencing, playback of audio-visual data, or other suitable processes. In another exemplary embodiment, the audio output data can be transmitted without the video output data being transmitted. In another exemplary embodiment, the video output data can be transmitted without the audio output data being transmitted. In another exemplary embodiment, the audio output data and video output data can be converted into a format that allows data transmission to the mobile device in accordance with the bandwidth available at the time of the transmission. The method then proceeds to 426.
- At 426, it is determined whether a response has been received from the mobile device. If no response has been received, the method proceeds to 428 and terminates. Alternatively, the person using the mobile device can be prompted to confirm whether they have received the converted audio output data or the converted video output data, or whether they are still active. A "ping" command or other suitable operations can also or alternatively be performed to determine whether a connection still exists with the mobile device. If a response is received at 426, the method then returns to 406.
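The liveness check at 426 can be sketched as a retried ping, assuming a hypothetical `send_ping` callback that returns `True` while the connection to the mobile device is still active.

```python
def connection_active(send_ping, retries: int = 3) -> bool:
    """Return True if the mobile device answers a ping within `retries` attempts (step 426)."""
    for _ in range(retries):
        if send_ping():
            return True
    return False

# When connection_active(...) is False the method terminates (step 428);
# otherwise it returns to 406 to await the next command.
```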
- If it is determined at 412 that an operating system command has not been received, the method then proceeds to 416 where it is determined whether a native application command has been received. In one exemplary embodiment, a native application command can include a request to list available native applications, a command to request a list of available native application commands (such as from an API), or other suitable native application commands. If it is determined at 416 that a native application command has been received, the method then proceeds to 418 where the command is executed. The method then proceeds to 422. Otherwise, if it is determined at 416 that a native application command has not been received, the method proceeds to 420 where an error message is generated. In one exemplary embodiment, the error message can include feedback to the person using the mobile device that the voice or data command received was not recognized as an available operating system or native application command. The method then returns to 406.
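The overall routing across 412, 416, and 420 reduces to a two-stage dispatch: try the operating system commands first, fall back to native application commands, and otherwise report an unrecognized command. The sets `os_commands` and `native_commands` below are illustrative stand-ins for the command lists the method maintains.

```python
def dispatch(command: str, os_commands: set, native_commands: set) -> str:
    """Route a recognized voice or data command per steps 412/416/420."""
    if command in os_commands:       # 412 -> 414: execute operating system command
        return f"executed OS command: {command}"
    if command in native_commands:   # 416 -> 418: execute native application command
        return f"executed native application command: {command}"
    # 420: feedback that the command was not recognized as available
    return f"error: '{command}' is not an available command"
```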
- In operation, method 400 allows a person using a mobile device to remotely interact with a computer, whereby the person is able to command the computer from the mobile device and to see and hear the images and sounds generated at the computer in response to those commands, in a manner similar to what the person would see and hear if the person were directly and locally interacting with the computer.
- Although exemplary embodiments of the system and method of the present invention have been described in detail herein, those skilled in the art will also recognize that various substitutions and modifications can be made to the systems and methods without departing from the scope and spirit of the appended claims.
Claims (10)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/710,692 US20200120165A1 (en) | 2007-06-04 | 2019-12-11 | Using Voice Commands From A Device To Remotely Access And Control A Computer |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/809,998 US9794348B2 (en) | 2007-06-04 | 2007-06-04 | Using voice commands from a mobile device to remotely access and control a computer |
US15/704,871 US10491679B2 (en) | 2007-06-04 | 2017-09-14 | Using voice commands from a mobile device to remotely access and control a computer |
US16/655,047 US20200053157A1 (en) | 2007-06-04 | 2019-10-16 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
US16/677,351 US20200076900A1 (en) | 2007-06-04 | 2019-11-07 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
US16/710,692 US20200120165A1 (en) | 2007-06-04 | 2019-12-11 | Using Voice Commands From A Device To Remotely Access And Control A Computer |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/677,351 Continuation US20200076900A1 (en) | 2007-06-04 | 2019-11-07 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
Publications (1)
Publication Number | Publication Date |
---|---|
US20200120165A1 true US20200120165A1 (en) | 2020-04-16 |
Family
ID=40089238
Family Applications (15)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/809,998 Active 2031-04-25 US9794348B2 (en) | 2007-06-04 | 2007-06-04 | Using voice commands from a mobile device to remotely access and control a computer |
US15/704,871 Expired - Fee Related US10491679B2 (en) | 2007-06-04 | 2017-09-14 | Using voice commands from a mobile device to remotely access and control a computer |
US16/655,054 Abandoned US20200053158A1 (en) | 2007-06-04 | 2019-10-16 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
US16/655,047 Abandoned US20200053157A1 (en) | 2007-06-04 | 2019-10-16 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
US16/655,061 Abandoned US20200053159A1 (en) | 2007-06-04 | 2019-10-16 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
US16/677,369 Abandoned US20200076901A1 (en) | 2007-06-04 | 2019-11-07 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
US16/677,332 Abandoned US20200076899A1 (en) | 2007-06-04 | 2019-11-07 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
US16/677,351 Abandoned US20200076900A1 (en) | 2007-06-04 | 2019-11-07 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
US16/710,692 Abandoned US20200120165A1 (en) | 2007-06-04 | 2019-12-11 | Using Voice Commands From A Device To Remotely Access And Control A Computer |
US16/710,539 Abandoned US20200120164A1 (en) | 2007-06-04 | 2019-12-11 | Using Voice Commands From A Device To Remotely Access And Control A Computer |
US16/896,673 Active US11128714B2 (en) | 2007-06-04 | 2020-06-09 | Using voice commands from a mobile device to remotely access and control a computer |
US16/896,743 Abandoned US20200382601A1 (en) | 2007-06-04 | 2020-06-09 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
US16/896,693 Abandoned US20200336546A1 (en) | 2007-06-04 | 2020-06-09 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
US17/477,278 Active 2027-08-29 US11778032B2 (en) | 2007-06-04 | 2021-09-16 | Using voice commands from a mobile device to remotely access and control a computer |
US18/238,875 Pending US20230412684A1 (en) | 2007-06-04 | 2023-08-28 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
Family Applications Before (8)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/809,998 Active 2031-04-25 US9794348B2 (en) | 2007-06-04 | 2007-06-04 | Using voice commands from a mobile device to remotely access and control a computer |
US15/704,871 Expired - Fee Related US10491679B2 (en) | 2007-06-04 | 2017-09-14 | Using voice commands from a mobile device to remotely access and control a computer |
US16/655,054 Abandoned US20200053158A1 (en) | 2007-06-04 | 2019-10-16 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
US16/655,047 Abandoned US20200053157A1 (en) | 2007-06-04 | 2019-10-16 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
US16/655,061 Abandoned US20200053159A1 (en) | 2007-06-04 | 2019-10-16 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
US16/677,369 Abandoned US20200076901A1 (en) | 2007-06-04 | 2019-11-07 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
US16/677,332 Abandoned US20200076899A1 (en) | 2007-06-04 | 2019-11-07 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
US16/677,351 Abandoned US20200076900A1 (en) | 2007-06-04 | 2019-11-07 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
Family Applications After (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/710,539 Abandoned US20200120164A1 (en) | 2007-06-04 | 2019-12-11 | Using Voice Commands From A Device To Remotely Access And Control A Computer |
US16/896,673 Active US11128714B2 (en) | 2007-06-04 | 2020-06-09 | Using voice commands from a mobile device to remotely access and control a computer |
US16/896,743 Abandoned US20200382601A1 (en) | 2007-06-04 | 2020-06-09 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
US16/896,693 Abandoned US20200336546A1 (en) | 2007-06-04 | 2020-06-09 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
US17/477,278 Active 2027-08-29 US11778032B2 (en) | 2007-06-04 | 2021-09-16 | Using voice commands from a mobile device to remotely access and control a computer |
US18/238,875 Pending US20230412684A1 (en) | 2007-06-04 | 2023-08-28 | Using Voice Commands From A Mobile Device To Remotely Access And Control A Computer |
Country Status (1)
Country | Link |
---|---|
US (15) | US9794348B2 (en) |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9794348B2 (en) | 2007-06-04 | 2017-10-17 | Todd R. Smith | Using voice commands from a mobile device to remotely access and control a computer |
US8111247B2 (en) * | 2009-03-27 | 2012-02-07 | Sony Ericsson Mobile Communications Ab | System and method for changing touch screen functionality |
US9998580B2 (en) * | 2010-04-26 | 2018-06-12 | Hu-Do Ltd. | Computing device operable to work in conjunction with a companion electronic device |
KR101330671B1 (en) * | 2012-09-28 | 2013-11-15 | 삼성전자주식회사 | Electronic device, server and control methods thereof |
KR20140128814A (en) * | 2013-04-29 | 2014-11-06 | 인포뱅크 주식회사 | A portable terminal and a method for operating it |
US20160372112A1 (en) * | 2015-06-18 | 2016-12-22 | Amgine Technologies (Us), Inc. | Managing Interactions between Users and Applications |
WO2018045303A1 (en) * | 2016-09-02 | 2018-03-08 | Bose Corporation | Application-based messaging system using headphones |
US10528414B2 (en) * | 2017-09-13 | 2020-01-07 | Toshiba Memory Corporation | Centralized error handling in application specific integrated circuits |
US10782986B2 (en) | 2018-04-20 | 2020-09-22 | Facebook, Inc. | Assisting users with personalized and contextual communication content |
US11307880B2 (en) | 2018-04-20 | 2022-04-19 | Meta Platforms, Inc. | Assisting users with personalized and contextual communication content |
US10621983B2 (en) * | 2018-04-20 | 2020-04-14 | Spotify Ab | Systems and methods for enhancing responsiveness to utterances having detectable emotion |
US11676220B2 (en) | 2018-04-20 | 2023-06-13 | Meta Platforms, Inc. | Processing multimodal user input for assistant systems |
US11886473B2 (en) | 2018-04-20 | 2024-01-30 | Meta Platforms, Inc. | Intent identification for agent matching by assistant systems |
US11715042B1 (en) | 2018-04-20 | 2023-08-01 | Meta Platforms Technologies, Llc | Interpretability of deep reinforcement learning models in assistant systems |
CN109903763B (en) * | 2019-01-11 | 2022-02-22 | 百度在线网络技术(北京)有限公司 | Service control method, device and equipment |
US11200896B2 (en) * | 2019-09-03 | 2021-12-14 | Bose Corporation | Multi-home shared media |
US11861674B1 (en) | 2019-10-18 | 2024-01-02 | Meta Platforms Technologies, Llc | Method, one or more computer-readable non-transitory storage media, and a system for generating comprehensive information for products of interest by assistant systems |
US11567788B1 (en) | 2019-10-18 | 2023-01-31 | Meta Platforms, Inc. | Generating proactive reminders for assistant systems |
CN111243587A (en) * | 2020-01-08 | 2020-06-05 | 北京松果电子有限公司 | Voice interaction method, device, equipment and storage medium |
CN112669839B (en) * | 2020-12-17 | 2023-08-08 | 阿波罗智联(北京)科技有限公司 | Voice interaction method, device, equipment and storage medium |
US11563706B2 (en) | 2020-12-29 | 2023-01-24 | Meta Platforms, Inc. | Generating context-aware rendering of media contents for assistant systems |
US11809480B1 (en) | 2020-12-31 | 2023-11-07 | Meta Platforms, Inc. | Generating dynamic knowledge graph of media contents for assistant systems |
US11861315B2 (en) | 2021-04-21 | 2024-01-02 | Meta Platforms, Inc. | Continuous learning for natural-language understanding models for assistant systems |
Family Cites Families (122)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7181692B2 (en) * | 1994-07-22 | 2007-02-20 | Siegel Steven H | Method for the auditory navigation of text |
US6362897B1 (en) * | 1995-03-10 | 2002-03-26 | The Standard Register Company | Printing system and method for printing documents and forms |
US6169789B1 (en) | 1996-12-16 | 2001-01-02 | Sanjay K. Rao | Intelligent keyboard system |
US5960399A (en) | 1996-12-24 | 1999-09-28 | Gte Internetworking Incorporated | Client/server speech processor/recognizer |
IL119948A (en) * | 1996-12-31 | 2004-09-27 | News Datacom Ltd | Voice activated communication system and program guide |
US6109153A (en) | 1997-04-01 | 2000-08-29 | Dueck; Raymond | Wire driven cutter for carpet dispenser |
US5974413A (en) * | 1997-07-03 | 1999-10-26 | Activeword Systems, Inc. | Semantic user interface |
US6535243B1 (en) | 1998-01-06 | 2003-03-18 | Hewlett- Packard Company | Wireless hand-held digital camera |
WO1999044363A1 (en) | 1998-02-27 | 1999-09-02 | Ridgeway Systems And Software Ltd. | Audio-video packet synchronisation at network gateway |
US6233559B1 (en) * | 1998-04-01 | 2001-05-15 | Motorola, Inc. | Speech control of multiple applications using applets |
US6144938A (en) | 1998-05-01 | 2000-11-07 | Sun Microsystems, Inc. | Voice user interface with personality |
US9037451B2 (en) * | 1998-09-25 | 2015-05-19 | Rpx Corporation | Systems and methods for multiple mode voice and data communications using intelligently bridged TDM and packet buses and methods for implementing language capabilities using the same |
US7003463B1 (en) * | 1998-10-02 | 2006-02-21 | International Business Machines Corporation | System and method for providing network coordinated conversational services |
IL142363A0 (en) * | 1998-10-02 | 2002-03-10 | Ibm | System and method for providing network coordinated conversational services |
US6965863B1 (en) | 1998-11-12 | 2005-11-15 | Microsoft Corporation | Speech recognition user interface |
US6606599B2 (en) * | 1998-12-23 | 2003-08-12 | Interactive Speech Technologies, Llc | Method for integrating computing processes with an interface controlled by voice actuated grammars |
US6651095B2 (en) * | 1998-12-14 | 2003-11-18 | International Business Machines Corporation | Methods, systems and computer program products for management of preferences in a heterogeneous computing environment |
US6404859B1 (en) | 1999-03-16 | 2002-06-11 | Lockheed Martin Corporation | Voice enabled system for remote access of information |
US8648692B2 (en) * | 1999-07-23 | 2014-02-11 | Seong Sang Investments Llc | Accessing an automobile with a transponder |
US7203721B1 (en) * | 1999-10-08 | 2007-04-10 | At Road, Inc. | Portable browser device with voice recognition and feedback capability |
US6970915B1 (en) | 1999-11-01 | 2005-11-29 | Tellme Networks, Inc. | Streaming content over a telephone interface |
US7376586B1 (en) | 1999-10-22 | 2008-05-20 | Microsoft Corporation | Method and apparatus for electronic commerce using a telephone interface |
US6633846B1 (en) | 1999-11-12 | 2003-10-14 | Phoenix Solutions, Inc. | Distributed realtime speech recognition system |
US6339706B1 (en) | 1999-11-12 | 2002-01-15 | Telefonaktiebolaget L M Ericsson (Publ) | Wireless voice-activated remote control device |
US6895558B1 (en) * | 2000-02-11 | 2005-05-17 | Microsoft Corporation | Multi-access mode electronic personal assistant |
US6687734B1 (en) | 2000-03-21 | 2004-02-03 | America Online, Incorporated | System and method for determining if one web site has the same information as another web site |
US7096185B2 (en) | 2000-03-31 | 2006-08-22 | United Video Properties, Inc. | User speech interfaces for interactive media guidance applications |
AU2001272009A1 (en) * | 2000-06-16 | 2001-12-24 | Healthetech, Inc. | Speech recognition capability for a personal digital assistant |
US6714222B1 (en) * | 2000-06-21 | 2004-03-30 | E2 Home Ab | Graphical user interface for communications |
US7487112B2 (en) | 2000-06-29 | 2009-02-03 | Barnes Jr Melvin L | System, method, and computer program product for providing location based services and mobile e-commerce |
US6934756B2 (en) | 2000-11-01 | 2005-08-23 | International Business Machines Corporation | Conversational networking via transport, coding and control conversational protocols |
US6901270B1 (en) * | 2000-11-17 | 2005-05-31 | Symbol Technologies, Inc. | Apparatus and method for wireless communication |
US6944679B2 (en) | 2000-12-22 | 2005-09-13 | Microsoft Corp. | Context-aware systems and methods, location-aware systems and methods, context-aware vehicles and methods of operating the same, and location-aware vehicles and methods of operating the same |
US7027987B1 (en) | 2001-02-07 | 2006-04-11 | Google Inc. | Voice interface for a search engine |
US7409349B2 (en) | 2001-05-04 | 2008-08-05 | Microsoft Corporation | Servers for web enabled speech recognition |
US20020180798A1 (en) * | 2001-05-31 | 2002-12-05 | Poor Graham V. | System and method for extending a wireless device platform to multiple applications |
US20030007609A1 (en) | 2001-07-03 | 2003-01-09 | Yuen Michael S. | Method and apparatus for development, deployment, and maintenance of a voice software application for distribution to one or more consumers |
JP2003016008A (en) * | 2001-07-03 | 2003-01-17 | Sony Corp | Program, system and method for processing information |
US8121649B2 (en) * | 2001-09-05 | 2012-02-21 | Vocera Communications, Inc. | Voice-controlled communications system and method having an access device |
US20030101054A1 (en) | 2001-11-27 | 2003-05-29 | Ncc, Llc | Integrated system and method for electronic speech recognition and transcription |
US6889191B2 (en) | 2001-12-03 | 2005-05-03 | Scientific-Atlanta, Inc. | Systems and methods for TV navigation with compressed voice-activated commands |
US7493259B2 (en) | 2002-01-04 | 2009-02-17 | Siebel Systems, Inc. | Method for accessing data via voice |
AU2002235321B2 (en) | 2002-01-16 | 2006-03-16 | Vedanti Systems Limited | Optimized data transmission system and method |
US9374451B2 (en) * | 2002-02-04 | 2016-06-21 | Nokia Technologies Oy | System and method for multimodal short-cuts to digital services |
CA2477962C (en) | 2002-03-01 | 2013-07-16 | Enterasys Networks, Inc. | Location aware data network |
JP3715584B2 (en) | 2002-03-28 | 2005-11-09 | 富士通株式会社 | Device control apparatus and device control method |
AU2003222132A1 (en) * | 2002-03-28 | 2003-10-13 | Martin Dunsmuir | Closed-loop command and response system for automatic communications between interacting computer systems over an audio communications channel |
US7120255B2 (en) | 2002-04-04 | 2006-10-10 | International Business Machines Corporation | Java applications for secured palm held cellular communications |
EP1359536A3 (en) * | 2002-04-27 | 2005-03-23 | Samsung Electronics Co., Ltd. | Face recognition method and apparatus using component-based face descriptor |
US7047200B2 (en) | 2002-05-24 | 2006-05-16 | Microsoft, Corporation | Voice recognition status display |
US20060100879A1 (en) * | 2002-07-02 | 2006-05-11 | Jens Jakobsen | Method and communication device for handling data records by speech recognition |
US7693720B2 (en) | 2002-07-15 | 2010-04-06 | Voicebox Technologies, Inc. | Mobile systems and methods for responding to natural language speech utterance |
US20040027392A1 (en) * | 2002-08-08 | 2004-02-12 | Dunn Loren S. | System and method for quick access of computer resources to control and configure a computer |
US7421390B2 (en) * | 2002-09-13 | 2008-09-02 | Sun Microsystems, Inc. | Method and system for voice control of software applications |
US20050180464A1 (en) | 2002-10-01 | 2005-08-18 | Adondo Corporation | Audio communication with a computer |
US20060276230A1 (en) | 2002-10-01 | 2006-12-07 | Mcconnell Christopher F | System and method for wireless audio communication with a computer |
US20040086120A1 (en) * | 2002-11-06 | 2004-05-06 | Akins Glendon L. | Selecting and downloading content to a portable player |
EP2017828A1 (en) * | 2002-12-10 | 2009-01-21 | Kirusa, Inc. | Techniques for disambiguating speech input using multimodal interfaces |
US7822612B1 (en) * | 2003-01-03 | 2010-10-26 | Verizon Laboratories Inc. | Methods of processing a voice command from a caller |
WO2004066125A2 (en) | 2003-01-14 | 2004-08-05 | V-Enable, Inc. | Multi-modal information retrieval system |
JP4107093B2 (en) | 2003-01-30 | 2008-06-25 | 株式会社日立製作所 | Interactive terminal device and interactive application providing method |
US7392188B2 (en) | 2003-07-31 | 2008-06-24 | Telefonaktiebolaget Lm Ericsson (Publ) | System and method enabling acoustic barge-in |
US7349845B2 (en) * | 2003-09-03 | 2008-03-25 | International Business Machines Corporation | Method and apparatus for dynamic modification of command weights in a natural language understanding system |
US7539619B1 (en) * | 2003-09-05 | 2009-05-26 | Spoken Translation Ind. | Speech-enabled language translation system and method enabling interactive user supervision of translation and speech recognition accuracy |
CA2537977A1 (en) * | 2003-09-05 | 2005-03-17 | Stephen D. Grody | Methods and apparatus for providing services using speech recognition |
US7194259B2 (en) | 2003-09-05 | 2007-03-20 | Sony Ericsson Mobile Communications Ab | Remote control device having wireless phone interface |
KR100547445B1 (en) | 2003-11-11 | 2006-01-31 | 주식회사 코스모탄 | Shifting processing method of digital audio signal and audio / video signal and shifting reproduction method of digital broadcasting signal using the same |
US6993166B2 (en) | 2003-12-16 | 2006-01-31 | Motorola, Inc. | Method and apparatus for enrollment and authentication of biometric images |
DE10360656A1 (en) * | 2003-12-23 | 2005-07-21 | Daimlerchrysler Ag | Operating system for a vehicle |
US7565139B2 (en) | 2004-02-20 | 2009-07-21 | Google Inc. | Image-based search engine for mobile phones with camera |
US20050206513A1 (en) | 2004-03-17 | 2005-09-22 | Fallon Kenneth T | Voice remote command and control of a mapping security system |
US20050223101A1 (en) * | 2004-03-22 | 2005-10-06 | International Business Machines Corporation | Computer-implemented method, system and program product for resolving prerequisites for native applications utilizing an open service gateway initiative ( OSGi) framework |
US7751535B2 (en) * | 2004-04-28 | 2010-07-06 | Nuance Communications, Inc. | Voice browser implemented as a distributable component |
US20060041926A1 (en) | 2004-04-30 | 2006-02-23 | Vulcan Inc. | Voice control of multimedia content |
AU2005246437B2 (en) | 2004-05-21 | 2011-10-06 | Voice On The Go Inc. | Remote access system and method and intelligent agent therefor |
KR100982197B1 (en) | 2004-06-21 | 2010-09-14 | 구글 인코포레이티드 | Face recognition method, face recognition apparatus, and computer readable storage medium |
JP2008504607A (en) | 2004-06-22 | 2008-02-14 | ヴォイス シグナル テクノロジーズ インコーポレーティッド | Extensible voice commands |
CA2575581C (en) * | 2004-07-30 | 2013-02-12 | Research In Motion Limited | System and method for providing a communications client on a host device |
US20080275704A1 (en) | 2004-08-06 | 2008-11-06 | Koninklijke Philips Electronics, N.V. | Method for a System of Performing a Dialogue Communication with a User |
US20070061488A1 (en) * | 2004-09-20 | 2007-03-15 | Trilibis Inc. | System and method for flexible user interfaces |
US8942985B2 (en) * | 2004-11-16 | 2015-01-27 | Microsoft Corporation | Centralized method and system for clarifying voice commands |
US7298873B2 (en) | 2004-11-16 | 2007-11-20 | Imageware Systems, Inc. | Multimodal biometric platform |
US7895594B2 (en) * | 2005-03-28 | 2011-02-22 | Freescale Semiconductor, Inc. | Virtual machine extended capabilities using application contexts in a resource-constrained device |
US7721301B2 (en) * | 2005-03-31 | 2010-05-18 | Microsoft Corporation | Processing files from a mobile device using voice commands |
DE102005016853A1 (en) * | 2005-04-12 | 2006-10-19 | Siemens Ag | Voice-operated applications controlling method for use in medical device, involves activating or deactivating application assigned to key term upon determining key term in recorded voice data stream, which is assigned to authorized user |
US7516478B2 (en) * | 2005-06-03 | 2009-04-07 | Microsoft Corporation | Remote management of mobile devices |
US20060277043A1 (en) | 2005-06-06 | 2006-12-07 | Edward Tomes | Voice authentication system and methods therefor |
CA2654867C (en) | 2005-06-13 | 2018-05-22 | E-Lane Systems Inc. | Vehicle immersive communication system |
US20070005370A1 (en) | 2005-06-30 | 2007-01-04 | Scott Elshout | Voice-activated control system |
US7640160B2 (en) * | 2005-08-05 | 2009-12-29 | Voicebox Technologies, Inc. | Systems and methods for responding to natural language speech utterance |
EP1922717A4 (en) * | 2005-08-09 | 2011-03-23 | Mobile Voice Control Llc | Use of multiple speech recognition software instances |
US7949529B2 (en) | 2005-08-29 | 2011-05-24 | Voicebox Technologies, Inc. | Mobile systems and methods of supporting natural language human-machine interactions |
US20070047719A1 (en) | 2005-09-01 | 2007-03-01 | Vishal Dhawan | Voice application network platform |
US7539472B2 (en) * | 2005-09-13 | 2009-05-26 | Microsoft Corporation | Type-ahead keypad input for an input device |
US20070123191A1 (en) * | 2005-11-03 | 2007-05-31 | Andrew Simpson | Human-machine interface for a portable electronic device |
KR100632400B1 (en) * | 2005-11-11 | 2006-10-11 | 한국전자통신연구원 | Apparatus and method for input/output using voice recognition |
US20070184899A1 (en) * | 2006-02-03 | 2007-08-09 | Nokia Corporation | Gaming device, method, and computer program product for modifying input to a native application to present modified output |
US20090222270A2 (en) | 2006-02-14 | 2009-09-03 | Ivc Inc. | Voice command interface device |
JP4997796B2 (en) * | 2006-03-13 | 2012-08-08 | 株式会社デンソー | Voice recognition device and navigation system |
US8478860B2 (en) * | 2006-03-14 | 2013-07-02 | Strong Bear L.L.C. | Device detection system for monitoring use of removable media in networked computers |
WO2007117626A2 (en) | 2006-04-05 | 2007-10-18 | Yap, Inc. | Hosted voice recognition system for wireless devices |
US7747605B2 (en) * | 2006-04-17 | 2010-06-29 | Perry J. Narancic | Organizational data analysis and management |
US20070249365A1 (en) * | 2006-04-20 | 2007-10-25 | Sony Ericsson Mobile Communications Ab | Device, method and computer program for connecting a mobile device to a wireless network |
EP2044804A4 (en) | 2006-07-08 | 2013-12-18 | Personics Holdings Inc | Personal audio assistant device and method |
US8073697B2 (en) * | 2006-09-12 | 2011-12-06 | International Business Machines Corporation | Establishing a multimodal personality for a multimodal application |
US8214208B2 (en) * | 2006-09-28 | 2012-07-03 | Reqall, Inc. | Method and system for sharing portable voice profiles |
US7944357B2 (en) * | 2006-12-18 | 2011-05-17 | Cummings Engineering Consultants, Inc. | Method and system for a grass roots intelligence program |
US20080155637A1 (en) | 2006-12-20 | 2008-06-26 | General Instrument Corporation | Method and System for Acquiring Information on the Basis of Media Content |
US20080153465A1 (en) | 2006-12-26 | 2008-06-26 | Voice Signal Technologies, Inc. | Voice search-enabled mobile device |
US8601515B2 (en) * | 2006-12-28 | 2013-12-03 | Motorola Mobility Llc | On screen alert to indicate status of remote recording |
US8495510B2 (en) * | 2006-12-29 | 2013-07-23 | Sap Ag | System and method for managing browser extensions |
US7725594B2 (en) | 2006-12-29 | 2010-05-25 | Verizon Patent And Licensing Inc. | Assigning priority to network traffic at customer premises |
US8731146B2 (en) | 2007-01-04 | 2014-05-20 | At&T Intellectual Property I, L.P. | Call re-directed based on voice command |
WO2008091727A1 (en) | 2007-01-23 | 2008-07-31 | Johnson Controls Technology Company | Mobile device gateway systems and methods |
US7751807B2 (en) * | 2007-02-12 | 2010-07-06 | Oomble, Inc. | Method and system for a hosted mobile management service architecture |
US8650030B2 (en) * | 2007-04-02 | 2014-02-11 | Google Inc. | Location based responses to telephone requests |
US8114828B2 (en) | 2007-04-16 | 2012-02-14 | Honeywell International Inc. | Azeotrope-like compositions of tetrafluoropropene and alcohols |
WO2008144638A2 (en) * | 2007-05-17 | 2008-11-27 | Redstart Systems Inc. | Systems and methods of a structured grammar for a speech recognition command system |
US9794348B2 (en) | 2007-06-04 | 2017-10-17 | Todd R. Smith | Using voice commands from a mobile device to remotely access and control a computer |
US9576572B2 (en) * | 2012-06-18 | 2017-02-21 | Telefonaktiebolaget Lm Ericsson (Publ) | Methods and nodes for enabling and producing input to an application |
US9548066B2 (en) * | 2014-08-11 | 2017-01-17 | Amazon Technologies, Inc. | Voice application architecture |
US10258295B2 (en) * | 2017-05-09 | 2019-04-16 | LifePod Solutions, Inc. | Voice controlled assistance for monitoring adverse events of a user and/or coordinating emergency actions such as caregiver communication |
2007
- 2007-06-04 US US11/809,998 patent/US9794348B2/en active Active
2017
- 2017-09-14 US US15/704,871 patent/US10491679B2/en not_active Expired - Fee Related
2019
- 2019-10-16 US US16/655,054 patent/US20200053158A1/en not_active Abandoned
- 2019-10-16 US US16/655,047 patent/US20200053157A1/en not_active Abandoned
- 2019-10-16 US US16/655,061 patent/US20200053159A1/en not_active Abandoned
- 2019-11-07 US US16/677,369 patent/US20200076901A1/en not_active Abandoned
- 2019-11-07 US US16/677,332 patent/US20200076899A1/en not_active Abandoned
- 2019-11-07 US US16/677,351 patent/US20200076900A1/en not_active Abandoned
- 2019-12-11 US US16/710,692 patent/US20200120165A1/en not_active Abandoned
- 2019-12-11 US US16/710,539 patent/US20200120164A1/en not_active Abandoned
2020
- 2020-06-09 US US16/896,673 patent/US11128714B2/en active Active
- 2020-06-09 US US16/896,743 patent/US20200382601A1/en not_active Abandoned
- 2020-06-09 US US16/896,693 patent/US20200336546A1/en not_active Abandoned
2021
- 2021-09-16 US US17/477,278 patent/US11778032B2/en active Active
2023
- 2023-08-28 US US18/238,875 patent/US20230412684A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US20200076900A1 (en) | 2020-03-05 |
US20200053159A1 (en) | 2020-02-13 |
US11778032B2 (en) | 2023-10-03 |
US20200336545A1 (en) | 2020-10-22 |
US20200336546A1 (en) | 2020-10-22 |
US20200120164A1 (en) | 2020-04-16 |
US20230412684A1 (en) | 2023-12-21 |
US10491679B2 (en) | 2019-11-26 |
US20180020059A1 (en) | 2018-01-18 |
US20200076901A1 (en) | 2020-03-05 |
US20200053158A1 (en) | 2020-02-13 |
US11128714B2 (en) | 2021-09-21 |
US20220006866A1 (en) | 2022-01-06 |
US20080300884A1 (en) | 2008-12-04 |
US20200382601A1 (en) | 2020-12-03 |
US20200053157A1 (en) | 2020-02-13 |
US9794348B2 (en) | 2017-10-17 |
US20200076899A1 (en) | 2020-03-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11778032B2 (en) | Using voice commands from a mobile device to remotely access and control a computer | |
US20050261890A1 (en) | Method and apparatus for providing language translation | |
US7653183B2 (en) | Method and apparatus to provide data to an interactive voice response (IVR) system | |
US9530415B2 (en) | System and method of providing speech processing in user interface | |
KR20060077988A (en) | System and method for information providing service through retrieving of context in multimedia communication system | |
US20020095294A1 (en) | Voice user interface for controlling a consumer media data storage and playback device | |
US9110888B2 (en) | Service server apparatus, service providing method, and service providing program for providing a service other than a telephone call during the telephone call on a telephone | |
US20080043418A1 (en) | Video communication apparatus using VoIP and method of operating the same | |
US11967336B2 (en) | Method for providing speech video and computing device for executing the method | |
WO2015023138A1 (en) | System and method for providing speech recognition-based messaging interpretation service | |
US9343065B2 (en) | System and method for processing a keyword identifier | |
US9277051B2 (en) | Service server apparatus, service providing method, and service providing program | |
JP2004356896A (en) | Automatic answering machine and automatic answering system using same, and telephone banking system | |
US20220199096A1 (en) | Information processing apparatus and information processing method | |
KR20210032643 (en) | Image Checking system of Smart Terminal by Using Voice Data and Method Thereof |
WO2011125066A1 (en) | A cost effective communication device | |
JP2021078037A (en) | Automatic voice response device, server device, automatic voice response method, page transmission method, and program | |
JP2001145082A (en) | Voice response type information service system | |
CN107948696A (en) | A kind of set-top box text entry method, system and set-top box | |
JP2001109687A (en) | Device and method for accessing home page | |
KR20090012603A (en) | Method and device for transferring another data in the process of displaying video data |
Legal Events
Code | Title | Description |
---|---|---|
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
AS | Assignment | Owner name: VOICE TECH CORPORATION, TEXAS; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SMITH, TODD R;REEL/FRAME:052095/0365; Effective date: 20181222 |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |