WO2020165816A1 - System and methods for providing personal assistance to users - Google Patents

System and methods for providing personal assistance to users

Info

Publication number
WO2020165816A1
Authority
WO
WIPO (PCT)
Prior art keywords
personal assistance
users
assistance device
service
computing device
Prior art date
Application number
PCT/IB2020/051176
Other languages
English (en)
Inventor
Rahul Devarakonda
Lakshman Sandep GONDI
Sirisha Gondi
Original Assignee
Woobloo Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Woobloo Inc. filed Critical Woobloo Inc.
Publication of WO2020165816A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 Sound input; Sound output
    • G06F3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 Speech recognition
    • G10L15/22 Procedures used during a speech recognition process, e.g. man-machine dialogue
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/02 Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04L67/125 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/50 Network services
    • H04L67/52 Network services specially adapted for the location of the user terminal

Definitions

  • the disclosed subject matter relates generally to personal assistance systems. More particularly, the present disclosure relates to a system and methods for providing personal assistance to the users.
  • Exemplary embodiments of the present disclosure are directed towards a system and methods for providing personal assistance to users.
  • An objective of the present disclosure is directed towards understanding and communicating in vernacular languages with automated voice recognition and natural language processing techniques.
  • Another objective of the present disclosure is directed towards sending notifications to the emergency contacts in real time.
  • Another objective of the present disclosure is directed towards alerting nearby people and notifying the emergency services.
  • Another objective of the present disclosure is directed towards understanding user requirements and responding appropriately through a smart conversation capability.
  • Another objective of the present disclosure is directed towards requesting the personal assistant to accomplish the service requirements of the users.
  • Another objective of the present disclosure is directed towards providing services to the users in a single click.
  • the system comprises at least one personal assistance device connected with a first computing device and a second computing device via a network to accomplish a plurality of personal assistance services for a plurality of users.
  • the first and second computing devices comprise a service management module configured to accomplish the plurality of personal assistance services for the plurality of users.
  • the at least one personal assistance device is configured to convey user queries and service requests to a plurality of personal assistance service providers to provide the selected personal assistance service.
  • the at least one personal assistance device comprises a first switch configured to enable the at least one personal assistance device to establish the connection with the second computing device through the network, the at least one personal assistance device comprises a second switch configured to generate a plurality of emergency notifications to the plurality of users.
  • the system further comprises at least one cloud server associated with the plurality of personal assistance services, which displays the requested personal assistance services to the plurality of users on the first computing device; the at least one cloud server comprises at least one artificial intelligence engine having a smart conversation capability to understand the user queries and service requests, and a capability to respond appropriately to the plurality of users.
  • FIG. 1 is a diagram depicting a schematic representation of a system for providing personal assistance to the users, in accordance with one or more exemplary embodiments.
  • FIG. 2 is a block diagram depicting a schematic representation of the personal assistance device 102 shown in FIG. 1, in accordance with one or more exemplary embodiments.
  • FIG. 3 is a block diagram depicting a schematic representation of the service management module 114 shown in FIG. 1, in accordance with one or more exemplary embodiments.
  • FIG. 4A-4B are diagrams depicting exemplary embodiments of the personal assistance device 102 shown in FIG. 1, in accordance with one or more exemplary embodiments.
  • FIG. 5 is a flowchart depicting an exemplary method for providing personal assistance to the users, in accordance with one or more exemplary embodiments.
  • FIG. 6 is a flowchart depicting an exemplary method for providing personal assistance to the users by tapping the first switch or second switch, in accordance with one or more exemplary embodiments.
  • FIG. 7 is a flowchart depicting an exemplary method for displaying the media content, in accordance with one or more exemplary embodiments.
  • FIG. 8 is a flowchart depicting an exemplary method for responding to the user’s requests by the personal assistance device, in accordance with one or more exemplary embodiments.
  • FIG. 9 is a block diagram illustrating the details of a digital processing system in which various aspects of the present disclosure are operative by execution of appropriate software instructions.
  • FIG. 1 is a diagram 100 depicting a schematic representation of a system for providing personal assistance to the users, in accordance with one or more exemplary embodiments.
  • the environment 100 depicts a personal assistance device 102, a first computing device 104, a network 106, a second computing device 108, and a cloud server 110.
  • the personal assistance device 102 may include a processing device 112.
  • the processing device 112 may include, but is not limited to, a microcontroller (for example, an ARM7 or ARM11), a Raspberry Pi 3 or a Pine 64 or any other 64-bit processor capable of running a Linux OS, a microprocessor, a digital signal processor, a microcomputer, a field-programmable gate array, a programmable logic device, a state machine or logic circuitry, or a PC board.
  • the first computing device 104 may include a service management module 114, which may be accessed as a mobile application, a web application, or software that offers the functionality of accessing mobile applications and viewing/processing interactive pages, implemented in the computing devices 104, 108, as will be apparent to one skilled in the relevant arts by reading the disclosure provided herein.
  • the service management module 114 may be downloaded from the cloud server 110.
  • the service management module 114 may be any suitable application downloaded from GOOGLE PLAY® (for Google Android devices), Apple Inc.'s APP STORE® (for Apple devices), or any other suitable database.
  • the service management module 114 may be software, firmware, or hardware that is integrated into first and second computing devices 104 and 108.
  • the first and second computing devices 104 and 108 may include, but are not limited to, a computer workstation, an interactive kiosk, a personal mobile computing device such as a digital assistant, a mobile phone, or a laptop, as well as storage devices, backend servers hosting databases and other software, and so forth.
  • the personal assistance device 102 may be situated at the remote location of the users, including, for example, customers, individuals, clients, person, consumers, entities, shoppers, and so forth.
  • the personal assistance device 102 may be connected with the first computing device 104 via a network 106 to accomplish the personal assistance services for the users by the service management module 114.
  • the first computing device 104 may also be referred as an IoT device.
  • the personal assistance services may include, but are not limited to, a food ordering service, a flight ticket booking service, a bus ticket booking service, a train ticket booking service, a movie ticket booking service, a hotel booking service, a pharma service, a grocery delivery service, a medical taxi service, a utility bill payment service, a gift delivery service, a cab or car rental booking service, a driver appointment service, handyman services (electrician, plumber, carpenter, AC repair, etc.), and so forth.
  • the personal assistance device 102 may be powered by ASR (automated speech recognition) and NLP (natural language processing) techniques.
  • the personal assistance device 102 may be capable of understanding and communicating in vernacular languages with the users.
  • the personal assistance device 102 may also be capable of screen mirroring through the projector (not shown).
  • the screen mirroring may include, but is not limited to, AirPlay, Chromecast, and so forth.
  • the network 106 may include but not limited to, an Internet of things (IoT network devices), an Ethernet, a wireless local area network (WLAN), or a wide area network (WAN), a Bluetooth low energy network, a ZigBee network, a WIFI communication network e.g., the wireless high speed internet, or a combination of networks, a cellular service such as a 4G (e.g., LTE, mobile WiMAX) or 5G cellular data service, a RFID module, a NFC module, wired cables, such as the world-wide-web based Internet, or other types of networks may include Transport Control Protocol/Internet Protocol (TCP/IP) or device addresses (e.g.
  • the service management module 114 may be configured to provide the communication channel between the personal assistance device 102 and the second computing device 108 through the network 106.
  • the personal assistance device 102 may be configured to convey the user queries and service requests to the personal assistance service providers to provide the selected personal assistance service.
  • the personal assistance service providers may include but not limited to, operators, workers, employees, agents, and so forth.
  • the users may be able to convey the queries and service requests by a wake word (for example, "Bloo", "WooBloo", or "Hello personal assistant").
  • the queries and service requests may include but not limited to, book a cab or order food or book movie tickets, and so forth.
  • the second computing device 108 may be situated at multiple locations of the personal assistance services and is operated by the personal assistance service providers.
  • the cloud server 110 may refer to a cloud or a physical server located at a remote location; it is associated with the personal assistance services and displays the requested personal assistance service to the users on the first computing device 104.
  • the artificial intelligence engine 116 may have a smart conversation capability to understand user queries and service requests.
  • the artificial intelligence engine 116 may have a capability to respond back appropriately with the users.
  • FIG. 2 is a block diagram 200 depicting a schematic representation of the personal assistance device 102 shown in FIG. 1, in accordance with one or more exemplary embodiments.
  • the personal assistance device 102 may include the processing device 112, a microphone 202, a network module 204, a GPS module 206, Universal Serial Bus (USB) port 208, a battery 210, a voice recognizer 212, and a natural language processor 214.
  • the microphone 202 may be configured to receive the audio signals (e.g., user queries or service requests) of the users. The received audio signals may be delivered to the voice recognizer 212, which in turn delivers the audio signals to the natural language processor 214.
  • the natural language processor 214 may be configured to process the audio signals and recognize whether they are in a natural language context.
  • the recognized audio signals may be allowed to perform the intended actions (for example, perform an action, retrieve content, launch an application, etc.) by the personal assistance device 102.
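The capture-to-action pipeline described above (microphone 202 → voice recognizer 212 → natural language processor 214 → intended action) can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation; the class names, the wake-word list, and the phrase-matching intent table are all assumptions for illustration.

```python
# Hypothetical sketch of the device's audio pipeline: wake-word gate,
# speech recognition stub, and natural-language intent extraction.

WAKE_WORDS = {"bloo", "woobloo", "hello personal assistant"}  # per the disclosure's examples


class VoiceRecognizer:
    """Stands in for voice recognizer 212: turns captured audio into text."""

    def transcribe(self, audio: str) -> str:
        # A real device would run ASR here; the sketch passes text through.
        return audio.lower().strip()


class NaturalLanguageProcessor:
    """Stands in for natural language processor 214: maps text to an intent."""

    INTENTS = {
        "book a cab": "cab_booking",
        "order food": "food_ordering",
        "book movie tickets": "movie_ticket_booking",
    }

    def recognize(self, text: str) -> str:
        for phrase, intent in self.INTENTS.items():
            if phrase in text:
                return intent
        return "unknown"


def handle_utterance(audio: str, asr: VoiceRecognizer,
                     nlp: NaturalLanguageProcessor) -> str:
    """Microphone -> recognizer -> NLP -> intended action label."""
    text = asr.transcribe(audio)
    # Only act when the utterance begins with a wake word.
    if not any(text.startswith(w) for w in WAKE_WORDS):
        return "ignored"
    return nlp.recognize(text)
```

In use, an utterance such as "Bloo book a cab" passes the wake-word gate and resolves to the cab-booking intent, while speech without a wake word is ignored.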
  • the network module 204 may be coupled to the processing device 112 and may be configured to connect personal assistance device 102 with the first computing device 104 and the second computing device 108.
  • the network module 204, for example a subscriber identity module, may be placed in a slot of the wireless terminal to establish the unique identity of the subscriber to the telecom network.
  • the subscriber identity module may include, but is not limited to, a GSM module, a CDMA module, a TDMA module, or any other type of module.
  • the network module 204 may be configured to send emergency notifications to the registered emergency contacts.
  • the emergency contacts may include, but not limited to, friends, family members, relatives, neighbours, emergency service providers, and so forth.
  • the emergency notifications may include but not limited to, SMS, alerts, email, warnings, and so forth.
  • the network module 204 may be allowed to communicate with the second computing device 108 through the network 106.
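The emergency fan-out that the network module performs (one trigger event, all registered contacts, every configured channel) can be rendered as a small sketch. The data shapes, the channel tuple, and the `send` stub are assumptions for illustration; a real network module 204 would hand each message to a GSM/CDMA modem or an HTTP endpoint.

```python
# Illustrative emergency-notification fan-out: one event notifies every
# registered contact over every configured channel (e.g., SMS and email).
from dataclasses import dataclass, field


@dataclass
class EmergencyNotifier:
    contacts: list                         # registered emergency contacts
    channels: tuple = ("SMS", "email")     # notification types per the disclosure
    sent: list = field(default_factory=list)

    def send(self, contact, channel, location):
        # Stub: record the dispatch instead of transmitting it.
        self.sent.append((contact, channel, location))

    def trigger(self, location):
        """Fan one emergency event out to all contacts on all channels."""
        for contact in self.contacts:
            for channel in self.channels:
                self.send(contact, channel, location)
        return len(self.sent)
```

The location argument stands in for the GPS fix that module 206 would supply alongside the alert.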
  • the GPS module 206 may be coupled to the processing device 112 and may be configured to identify the remote location of the personal assistance device 102.
  • the universal serial bus (USB) port 208 may be coupled to the processing device 112 and may be configured to charge the battery 210 positioned in the personal assistance device 102 by using electrical cables.
  • FIG. 3 is a block diagram 300 depicting a schematic representation of the service management module 114 shown in FIG. 1, in accordance with one or more exemplary embodiments.
  • the service management module 114 may include a bus 301, a voice recognition module 302, a natural language processing module 304, an operator assistance module 306, and a central database 308.
  • the service management module 114 may be activated by tapping the first computing device 104 or by stating a wake word.
  • the voice recognition module 302 may be configured to detect or receive the audio signals (e.g., user queries or service requests) and transmit them to the second computing device 108 through the network 106.
  • the natural language processing module 304 may be configured to determine the user intent based on the detected audio signals received from the first computing device 104.
  • the operator assistance module 306 may be configured to provide the communication channel between the personal assistance device 102 and the second computing device 108 for providing the personal assistance services to the users.
  • the central database 308 may be configured to store, retrieve, process, analyse, and/or generate data to be provided to the personal assistance device 102.
  • the bus 301 may include a path that permits communication among the modules of the service management module 114.
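The arrangement above — modules 302, 304, and 306 exchanging messages over bus 301 — can be sketched as a toy message bus. The `Bus` API, the topic names, and the handlers are all illustrative assumptions; the disclosure only states that bus 301 permits communication among the modules.

```python
# Toy rendering of bus 301: each module registers a handler for a topic,
# and the bus routes a published message to the handler for that topic.


class Bus:
    def __init__(self):
        self._handlers = {}

    def register(self, topic, handler):
        self._handlers[topic] = handler

    def publish(self, topic, payload):
        # Route the payload to the module that owns this topic.
        return self._handlers[topic](payload)


bus = Bus()
# voice recognition module 302: audio -> text
bus.register("voice", lambda audio: ("text", audio.lower()))
# natural language processing module 304: text -> intent (stubbed)
bus.register("nlp", lambda text: ("intent", "food_ordering"))
# operator assistance module 306: intent -> channel to device 108
bus.register("operator", lambda intent: f"routed {intent} to device 108")
```

Publishing to the `"voice"` topic returns the recognized text, and publishing the resulting intent to `"operator"` models opening the communication channel to the second computing device.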
  • FIG. 4A-4B are diagrams 400a-400b depicting exemplary embodiments of the personal assistance device 102 shown in FIG. 1, in accordance with one or more exemplary embodiments.
  • the personal assistance device 102 includes a first switch 402, a second switch 404, a microphone 406, a universal serial bus (USB) port 408, a projector (a type of display device) 410, and audio buttons 412a, 412b, 412c, 412d.
  • the first switch 402 may be configured to enable the personal assistance device 102 to establish the connection with the second computing device 108 through the network 106.
  • the second switch 404 may be configured to generate the emergency notifications to the registered emergency contacts.
  • the second switch 404 may also be configured to trigger an audible sound to alert the nearby people.
  • the audible sound may include but not limited to, alarm, siren, alert, bell, and so forth.
  • the microphone 406 may be configured to receive the audio signals (e.g., user queries or service requests) of the users.
  • the universal serial bus port 408 may be configured to charge the battery positioned in the personal assistance device 102 by using electrical cables (not shown).
  • the projector 410 may be configured to display the output images/videos of intended personal service selected by the users.
  • the intended personal service images may be displayed on a wall, screen, or surfaces, or in air without the requirement of any surface and so forth.
  • the projector 410 may include a cathode ray tube projector, an IXD projector, a digital light processing projector, a liquid crystal display projector, laser display projectors, and so forth.
  • the audio buttons 412a, 412b, 412c, and 412d may be configured to decrease the audio volume, mute the audio, pause the audio, and increase the audio volume, respectively.
  • FIG. 5 is a flowchart 500 depicting an exemplary method for providing personal assistance to the users, in accordance with one or more exemplary embodiments.
  • the method 500 is carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, and FIG. 4.
  • the method 500 is carried out in any desired environment. Further, the aforementioned definitions are equally applied to the description below.
  • the exemplary method 500 commences at step 502, activating a personal assistance device by detecting a wake word or tapping the first switch. Thereafter, at step 504, requesting the intended personal assistance service using the personal assistance device by a user. Thereafter, at step 506, establishing the communication between the personal assistance device and a second computing device through a network. Thereafter, at step 508, responding to the user’s request by the personal assistance service providers to perform intended actions.
  • FIG. 6 is a flowchart 600 depicting an exemplary method for providing personal assistance to the users by tapping the first switch or second switch, in accordance with one or more exemplary embodiments.
  • the method 600 is carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, and FIG. 5.
  • the method 600 is carried out in any desired environment. Further, the aforementioned definitions are equally applied to the description below.
  • the exemplary method 600 commences at step 602, determining whether the user tapped the first switch or second switch. If the user selected switch is the first switch, activating the personal assistance device for the personal assistance of the users at step 604. Thereafter, at step 606, requesting the intended personal assistance service using the microphone. Thereafter, at step 608, allowing the personal assistance device to connect with the second computing device. Thereafter, at step 610, responding to the service requests and performing the intended actions by the personal assistance service providers. If the user selected switch is the second switch, sending notifications to the emergency contacts at step 612. Thereafter, at step 614, alerting the nearby people by triggering the audible sound.
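The two-way branch of steps 602–614 can be written as a small dispatch function. The step labels returned here are descriptive stand-ins keyed to the figure's step numbers; the switch constants are assumptions for illustration.

```python
# Sketch of FIG. 6's branch: first switch -> assistance path (604-610),
# second switch -> emergency path (612-614).

FIRST_SWITCH, SECOND_SWITCH = "first", "second"


def on_switch_tap(switch: str) -> list:
    """Return the ordered steps that the tapped switch triggers."""
    if switch == FIRST_SWITCH:
        return [
            "activate_device",            # step 604
            "capture_request",            # step 606
            "connect_second_device",      # step 608
            "provider_responds",          # step 610
        ]
    if switch == SECOND_SWITCH:
        return [
            "notify_emergency_contacts",  # step 612
            "sound_audible_alert",        # step 614
        ]
    raise ValueError(f"unknown switch: {switch}")
```

Note that the emergency path bypasses the service-provider connection entirely: the device notifies contacts and sounds the alert without waiting on the second computing device.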
  • FIG. 7 is a flowchart 700 depicting an exemplary method for displaying the media content, in accordance with one or more exemplary embodiments.
  • the method 700 is carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, and FIG. 6.
  • the method 700 is carried out in any desired environment. Further, the aforementioned definitions are equally applied to the description below.
  • the exemplary method 700 commences at step 702, activating the personal assistance device with the wake word or tapping the first switch by the user. Thereafter, at step 704, requesting to book the movie tickets (For example) by using the personal assistance device. Thereafter, at step 706, responding to the request by the personal assistance service providers by using the second computing device through the network. Thereafter, at step 708, displaying the media content on the surface of the wall / table / in the air using the projector positioned on the personal assistance device. Thereafter, at step 710, selecting the predetermined movie theater and reserving the seat by viewing the displayed media content.
  • FIG. 8 is a flowchart 800 depicting an exemplary method for responding to the user’s requests by the personal assistance device, in accordance with one or more exemplary embodiments.
  • the method 800 is carried out in the context of the details of FIG. 1, FIG. 2, FIG. 3, FIG. 4, FIG. 5, FIG. 6, and FIG. 7.
  • the method 800 is carried out in any desired environment. Further, the aforementioned definitions are equally applied to the description below.
  • the method commences at step 802, tapping the first switch to activate the personal assistance device by the user. Thereafter, at step 804, requesting the intended personal assistance service using the microphone. Thereafter, at step 806, sending the audio signals to a natural language processor from the microphone. Thereafter, at step 808, recognizing the natural language context by the natural language processor. Thereafter, at step 810, connecting the personal assistance device to the second computing device through the network. Thereafter, at step 812, responding to the requests by retrieving the content from the cloud server using the artificial intelligence engine.
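Steps 802–812 above can be collapsed into one end-to-end function. The dictionary standing in for cloud server 110 / artificial intelligence engine 116, the keyword-based intent matching, and the response strings are all illustrative assumptions, not the patent's retrieval mechanism.

```python
# Sketch of FIG. 8's flow: utterance -> NLP intent -> cloud content lookup.

# Fake cloud store standing in for cloud server 110 and AI engine 116.
CLOUD_CONTENT = {
    "movie_ticket_booking": "showtimes for nearby theaters",
    "cab_booking": "available cabs near you",
}


def serve_request(utterance: str) -> str:
    # Steps 802-806: device is active; microphone delivers audio to the NLP.
    text = utterance.lower()
    # Step 808: recognize the natural-language context (toy keyword matching).
    if "movie" in text:
        intent = "movie_ticket_booking"
    elif "cab" in text:
        intent = "cab_booking"
    else:
        return "sorry, I did not understand"
    # Steps 810-812: connect to the second computing device and respond by
    # retrieving content from the cloud store.
    return CLOUD_CONTENT[intent]
```

In the FIG. 7 scenario, a movie-ticket request would come back with displayable content (here, a placeholder string) that the projector could then render.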
  • FIG. 9 is a block diagram 900 illustrating the details of a digital processing system 900 in which various aspects of the present disclosure are operative by execution of appropriate software instructions.
  • the Digital processing system 900 may correspond to the computing devices 104, 108 (or any other system in which the various features disclosed above can be implemented).
  • Digital processing system 900 may contain one or more processors such as a central processing unit (CPU) 910, random access memory (RAM) 920, secondary memory 930, graphics controller 960, display unit 970, network interface 980, and input interface 990. All the components except display unit 970 may communicate with each other over communication path 950, which may contain several buses as is well known in the relevant arts. The components of Figure 9 are described below in further detail.
  • CPU 910 may execute instructions stored in RAM 920 to provide several features of the present disclosure.
  • CPU 910 may contain multiple processing units, with each processing unit potentially being designed for a specific task. Alternatively, CPU 910 may contain only a single general-purpose processing unit.
  • RAM 920 may receive instructions from secondary memory 930 using communication path 950.
  • RAM 920 is shown currently containing software instructions, such as those used in threads and stacks, constituting shared environment 925 and/or user programs 926.
  • Shared environment 925 includes operating systems, device drivers, virtual machines, etc., which provide a (common) run time environment for execution of user programs 926.
  • Graphics controller 960 generates display signals (e.g., in RGB format) to display unit 970 based on data/instructions received from CPU 910.
  • Display unit 970 contains a display screen to display the images defined by the display signals.
  • Input interface 990 may correspond to a keyboard and a pointing device (e.g., touch-pad, mouse) and may be used to provide inputs.
  • Network interface 980 provides connectivity to a network (e.g., using Internet Protocol), and may be used to communicate with other systems (such as those shown in Figure 1) connected to the network 106.
  • Secondary memory 930 may contain hard drive 935, flash memory 936, and removable storage drive 937. Secondary memory 930 may store the data and software instructions (e.g., for performing the actions noted above with respect to the Figures), which enable digital processing system 900 to provide several features in accordance with the present disclosure.
  • Some or all of the data and instructions may be provided on removable storage unit 940, and the data and instructions may be read and provided by removable storage drive 937 to CPU 910.
  • A floppy drive, magnetic tape drive, CD-ROM drive, DVD drive, flash memory, and removable memory chip (PCMCIA card, EEPROM) are examples of such a removable storage drive 937.
  • Removable storage unit 940 may be implemented using medium and storage format compatible with removable storage drive 937 such that removable storage drive 937 can read the data and instructions.
  • removable storage unit 940 includes a computer readable (storage) medium having stored therein computer software and/or data.
  • the computer (or machine, in general) readable medium can be in other forms (e.g., non-removable, random access, etc.).
  • The term computer program product is used to generally refer to removable storage unit 940 or a hard disk installed in hard drive 935. These computer program products are means for providing software to digital processing system 900.
  • CPU 910 may retrieve the software instructions, and execute the instructions to provide various features of the present disclosure described above.
  • Non-volatile media includes, for example, optical disks, magnetic disks, or solid-state drives, such as secondary memory 930.
  • Volatile media includes dynamic memory, such as RAM 920.
  • Storage media includes, for example, a floppy disk, a flexible disk, a hard disk, a solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
  • Storage media is distinct from but may be used in conjunction with transmission media.
  • Transmission media participates in transferring information between storage media.
  • transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus (communication path) 950.
  • Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Exemplary embodiments of the present invention relate to a system for providing personal assistance to users, comprising: a personal assistance device connected to a first computing device and a second computing device via a network; the first and second computing devices comprise a service management module configured to accomplish personal assistance services for the users; the personal assistance device comprises a first switch configured to enable the personal assistance device to establish a connection with the second computing device via the network, and a second switch configured to generate emergency notifications to users; and a cloud server associated with the personal assistance services, which displays the requested personal assistance services to a plurality of users on the first computing device; the cloud server comprises an artificial intelligence engine that has a smart conversation capability to understand user queries and service requests and is also capable of responding appropriately to the users.
PCT/IB2020/051176 2019-02-13 2020-02-13 System and methods for providing personal assistance to users WO2020165816A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201941005740 2019-02-13
IN201941005740A IN201941005740A (fr) 2019-02-13 2019-02-13

Publications (1)

Publication Number Publication Date
WO2020165816A1 true WO2020165816A1 (fr) 2020-08-20

Family

ID=72045359

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2020/051176 WO2020165816A1 (fr) 2019-02-13 2020-02-13 System and methods for providing personal assistance to users

Country Status (2)

Country Link
IN (1) IN201941005740A (fr)
WO (1) WO2020165816A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020225648A1 (fr) * 2019-05-07 2020-11-12 Plusneed Solutions Pvt Ltd Cognitive, decentralized and commission-free system and method for meeting mutual needs between users in real time
WO2021009772A1 (fr) * 2019-07-13 2021-01-21 Twenty 4 Ventures Group Limited System and methods for assistance and safety services

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040218732A1 (en) * 2001-01-22 2004-11-04 Royal Thoughts, L.L.C. Assisted personal communication system and method
US20110025262A1 (en) * 2001-03-01 2011-02-03 Research In Motion Limited Multifunctional Charger System and Method
US20140222436A1 (en) * 2013-02-07 2014-08-07 Apple Inc. Voice trigger for a digital assistant
US20140365885A1 (en) * 2013-06-09 2014-12-11 Apple Inc. Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant
US20150199401A1 (en) * 2014-01-10 2015-07-16 Cellco Partnership D/B/A Verizon Wireless Personal assistant application
US20150278679A1 (en) * 2014-01-30 2015-10-01 Vishal Sharma Mobile personal assistant device to remotely control external services and selectively share control of the same


Also Published As

Publication number Publication date
IN201941005740A (fr) 2019-02-22

Similar Documents

Publication Publication Date Title
US10650816B2 (en) Performing tasks and returning audio and visual feedbacks based on voice command
EP3485489B1 (fr) Contextual hotwords
WO2019223390A1 (fr) System, device, apparatus and method for processing authorization guidance data
US10031721B2 (en) System and method for processing control commands in a voice interactive system
JP2023002502A (ja) ホームオートメーションのためのインテリジェントアシスタント
US20240184517A1 (en) Associating of computing devices
US11204681B2 (en) Program orchestration method and electronic device
KR20190100512A (ko) 챗봇과 대화하기 위한 전자 장치 및 그의 동작 방법
US11494376B2 (en) Data query method supporting natural language, open platform, and user terminal
US10249296B1 (en) Application discovery and selection in language-based systems
US11487491B2 (en) Screen projection method and apparatus, and storage medium
US11195122B2 (en) Intelligent user notification during an event in an internet of things (IoT) computing environment
US20220068272A1 (en) Context-based dynamic tolerance of virtual assistant
US20180367669A1 (en) Input during conversational session
WO2020165816A1 (fr) System and methods for providing personal assistance to users
US11226835B2 (en) Determination and initiation of a computing interface for computer-initiated task response
KR20200013774A (ko) 보이스 가능 디바이스를 디스플레이 디바이스와 페어링
US10592832B2 (en) Effective utilization of idle cycles of users
US20180137859A1 (en) Method and apparatus for information search using voice recognition
US11977815B2 (en) Dialogue processing method and device
US11676599B2 (en) Operational command boundaries
US10783886B2 (en) Cognitive agent disambiguation
CN106899487B (zh) Communication processing method, apparatus, server and device
US20230125273A1 (en) Urgency-based queue management systems and methods
US11722572B2 (en) Communication platform shifting for voice-enabled device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20756484

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20756484

Country of ref document: EP

Kind code of ref document: A1