WO2023069622A1 - Operating system level permission management for multi-ecosystem smart home devices - Google Patents

Operating system level permission management for multi-ecosystem smart home devices

Info

Publication number
WO2023069622A1
Authority
WO
WIPO (PCT)
Prior art keywords
smart home
application
devices
control device
home device
Prior art date
Application number
PCT/US2022/047285
Other languages
English (en)
Inventor
Alexander Crettenand
Gilles Drieu
Nathan SANDLAND
Kevin Po
Alexei Sakhartchouk
Julius Löwe
Anna Maria Phan
Mehdi Kash Khaleghi
Kevin COPPOCK
Original Assignee
Google LLC
Priority claimed from US17/839,463 (published as US12057961B2)
Application filed by Google LLC
Priority to EP22809264.9A (published as EP4420310A1)
Publication of WO2023069622A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00: Data switching networks
    • H04L 12/28: Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L 12/2803: Home automation networks
    • H04L 12/2816: Controlling appliance services of a home automation network by calling their functionalities
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Definitions

  • This patent specification relates to systems, methods, and related computer program products for managing permissions for smart home devices. More particularly, this specification relates to techniques for sharing a device that is registered with one application to a second application using a customized share sheet for an operating system.
  • A smart home refers to a home that includes smart home devices that are networked together to form an ecosystem for automation, providing information and entertainment, and controlling electronic devices.
  • A smart home ecosystem includes smart home devices that work together to accomplish tasks or share information using wired or wireless communication protocols.
  • The advantages of using a smart home ecosystem are numerous, including a seamless user experience and compatibility between devices without adapters or other bridge protocols/devices to enable communication and information sharing.
  • However, not all ecosystems of connected devices are open to devices from manufacturers outside of those ecosystems. Integrating smart home devices from one ecosystem into another can pose technical challenges for control devices and network communications. Additionally, commissioning and registering these devices with an existing ecosystem can be a frustrating and problematic process for users. Difficulty can arise even when sharing a device registered with one application with another application on the same control device. Therefore, improvements are needed in the art.
  • A method of sharing smart home devices between applications may include receiving a request from a first application operating on a control device to share a smart home device that is registered with the first application; generating a user interface on the control device that displays one or more applications with which the smart home device can be shared; receiving a selection of a second application with which to share the smart home device; and sending information to the second application to register the smart home device with the second application.
  • A non-transitory computer-readable medium may include instructions that, when executed by one or more processors, cause the one or more processors to perform operations including receiving a request from a first application operating on a control device to share a smart home device that is registered with the first application; generating a user interface on the control device that displays one or more applications with which the smart home device can be shared; receiving a selection of a second application with which to share the smart home device; and sending information to the second application to register the smart home device with the second application.
  • A control device may include one or more processors and one or more memory devices.
  • The one or more memory devices may include instructions that, when executed by the one or more processors, cause the one or more processors to perform operations including receiving a request from a first application operating on a control device to share a smart home device that is registered with the first application; generating a user interface on the control device that displays one or more applications with which the smart home device can be shared; receiving a selection of a second application with which to share the smart home device; and sending information to the second application to register the smart home device with the second application.
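The claimed flow above (request, share-sheet UI, selection, registration hand-off) can be sketched as a minimal Python model. All names, dict structures, and the `"<type>:<label>"` device-id convention below are illustrative assumptions; the patent does not specify an implementation:

```python
# Minimal sketch of the claimed sharing flow between a first and second
# application on a control device. Data structures are assumptions.

def share_device(os_layer, first_app, device_id, choose=lambda apps: apps[0]):
    # Step 1: the first application requests sharing of a registered device.
    if device_id not in first_app["registered_devices"]:
        raise ValueError("device not registered with the requesting app")

    # Step 2: the OS builds the share-sheet contents: other installed apps
    # whose declared support covers this device type.
    device_type = device_id.split(":")[0]
    candidates = [app for app in os_layer["installed_apps"]
                  if app["name"] != first_app["name"]
                  and device_type in app["supported_device_types"]]

    # Step 3: the user selects a second application from the share sheet
    # (modeled here as a callback; default picks the first candidate).
    second_app = choose(candidates)

    # Step 4: the OS forwards registration info to the second application.
    pairing_code = first_app["registered_devices"][device_id]
    second_app["registered_devices"][device_id] = pairing_code
    return second_app["name"]
```

In this toy model, sharing a thermostat registered with one ecosystem app copies its pairing information into the chosen second app's registry and returns that app's name.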
  • The control device may include a smartphone.
  • The smart home device may include a thermostat, a hazard detector, a component of a security system, a smart doorbell, a video camera, a child monitor, a smart appliance, an audio speaker, and/or any other smart home device.
  • The second application may be operating on the control device with the first application.
  • The second application may be operating on a second control device.
  • Sending the information to the second application may include generating a QR code in the user interface to be scanned by the second control device, where the QR code may include information for registering the smart home device with the second application.
  • The user interface may include an augmented share sheet generated by an operating system of the control device.
  • Receiving the request from the first application may include receiving a command to share any of a plurality of smart home devices registered with the first application.
  • The method/operations may also include selecting the smart home device from the plurality of smart home devices registered with the first application using the user interface.
  • The method/operations may also include sending a request to an operating system of the control device to return a list of the one or more applications with which the smart home device can be shared.
  • The method/operations may also include accessing, by the operating system, metadata of the applications that indicates that the one or more applications are compatible with the smart home device.
  • Sending the information to the second application may include the operating system sending the information via a secure communication utility of the operating system.
  • The second application may use a universal setup utility of the operating system to register the smart home device with the second application using a multi-ecosystem protocol.
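One plausible way the operating system could produce the list of compatible applications described above is by inspecting metadata that each app declares at install time (analogous to manifest entries). The metadata schema and the `"matter"` protocol value in this sketch are assumptions for illustration, not the patent's defined format:

```python
# Hypothetical metadata check: the OS filters installed apps by declared
# support for the device's type and for a multi-ecosystem protocol.

def compatible_apps(installed_apps, device_type, protocol):
    """Return the names of apps whose declared metadata covers the device."""
    matches = []
    for app in installed_apps:
        meta = app.get("metadata", {})
        if (device_type in meta.get("supported_device_types", ())
                and protocol in meta.get("supported_protocols", ())):
            matches.append(app["name"])
    return matches
```

Apps that declare no metadata at all are simply never offered in the share sheet, which mirrors the opt-in behavior the bullets describe.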
  • The first application may place the smart home device in a multi-admin mode such that both the first application and the second application can administer the smart home device.
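Multi-admin mode, as described in the bullet above, implies that the device tracks a set of administering applications rather than a single owner. A toy model of that behavior (an assumption for illustration, not the patent's design) might look like:

```python
# Toy model of a device's multi-admin mode: a second administrator can only
# be added after an existing admin has enabled multi-admin. Illustrative only.

class SmartHomeDevice:
    def __init__(self, name):
        self.name = name
        self.admins = []          # applications allowed to administer
        self.multi_admin = False

    def add_admin(self, app_name):
        # A second admin is rejected until multi-admin mode is enabled.
        if self.admins and not self.multi_admin:
            raise PermissionError("multi-admin mode is not enabled")
        if app_name not in self.admins:
            self.admins.append(app_name)

    def enable_multi_admin(self, requesting_app):
        # Only an existing admin (e.g., the first application) may enable it.
        if requesting_app not in self.admins:
            raise PermissionError("only an existing admin may enable multi-admin")
        self.multi_admin = True
```

The gating in `enable_multi_admin` reflects the claim that it is the first (already-registered) application that places the device into multi-admin mode.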
  • Sending the information to the second application may include generating a pairing code and passing the pairing code to the second application.
  • The pairing code may be embedded in a QR code generated in the user interface.
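The pairing-code hand-off above can be illustrated by building a small text payload that would be rendered as a QR code and then scanned and parsed by the second control device. The URI-style format below is invented for illustration; real multi-ecosystem protocols (e.g., Matter) define their own setup-payload formats, and a library such as `qrcode` would handle the actual image rendering:

```python
# Illustrative QR payload carrying a pairing code plus routing information.
# Only the payload string is built and parsed here; rendering it as a QR
# image is left to a dedicated library.
from urllib.parse import urlsplit, parse_qs

def build_payload(device_id, pairing_code, ecosystem):
    # Invented URI scheme; not a real setup-payload format.
    return (f"smarthome://share?device={device_id}"
            f"&code={pairing_code}&eco={ecosystem}")

def parse_payload(payload):
    # The scanning device recovers the fields needed for registration.
    query = parse_qs(urlsplit(payload).query)
    return {key: values[0] for key, values in query.items()}
```

Round-tripping a payload through `build_payload` and `parse_payload` recovers the device identifier, pairing code, and originating ecosystem on the scanning side.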
  • The first application may be from a first smart home ecosystem, and the second application may be from a second smart home ecosystem.
  • The smart home device may be from a third smart home ecosystem.
  • FIG. 1 illustrates an example smart home environment in accordance with some embodiments.
  • FIG. 2 is a block diagram illustrating an example display assistant device that is applied as a voice interface to collect user voice commands in a smart home environment in accordance with some embodiments.
  • FIG. 3 illustrates an example of smart home devices operating within a structure, according to some embodiments.
  • FIG. 4 illustrates a sequence of steps that may be executed by a control device to share a device between apps, according to some embodiments.
  • FIG. 5A illustrates a user interface for sharing registered smart home devices between apps, according to some embodiments.
  • FIG. 5B illustrates an example of the user interface with a selected smart home device for sharing between apps, according to some embodiments.
  • FIG. 6 illustrates a flow diagram of a process for sharing a device between two apps, according to some embodiments.
  • FIG. 7 illustrates a system diagram, according to some embodiments.
  • FIG. 8 illustrates a flowchart 800 of a method for sharing smart home devices between applications, according to some embodiments.
  • FIG. 1 illustrates an example smart home environment 100 in accordance with some embodiments.
  • The smart home environment 100 includes a structure 150 (e.g., a house, office building, garage, or mobile home) with various integrated devices (also referred to herein as “connected” or “smart” devices). It will be appreciated that smart devices may also be integrated into a smart home environment 100 that does not include an entire structure 150, such as an apartment, condominium, or office space.
  • The smart devices include one or more of: personal client devices 104 (e.g., tablets, laptops or mobile phones), display devices 106, media casting or streaming devices 108, thermostats 122, home protection devices 124 (e.g., smoke, fire and carbon dioxide detectors), home security devices (e.g., motion detectors, window and door sensors and alarms), including connected doorbell/cameras 126, connected locksets 128, alarm systems 130 and cameras 132, connected wall switches transponders 136, connected appliances 138, WiFi communication devices 160 (e.g., hubs, routers, extenders), connected home cleaning devices 168 (e.g., vacuum or floor cleaner), smart home communication and control hubs 180, voice assistant devices and display assistant devices 190, and/or other smart home devices.
  • Smart home environments may refer to smart environments for homes such as a single-family house, but the scope of the present teachings is not so limited.
  • The present teachings are also applicable, without limitation, to duplexes, townhomes, multi-unit apartment buildings, hotels, retail stores, office buildings, industrial buildings, yards, parks, and more generally any living space or work space.
  • The terms user, customer, purchaser, installer, subscriber, and homeowner may often refer to the same person in the case of a single-family residential dwelling, who makes the purchasing decision, buys the unit, installs and configures the unit, and is also one of the users of the unit.
  • In other cases, such as a landlord-tenant environment, the customer may be the landlord with respect to purchasing the unit, the installer may be a local apartment supervisor, a first user may be the tenant, and a second user may again be the landlord with respect to remote control functionality.
  • The depicted structure 150 includes a plurality of rooms 152, separated at least partly from each other via walls 154.
  • The walls 154 may include interior walls or exterior walls.
  • Each room may further include a floor 156 and a ceiling 158.
  • One or more media devices are disposed in the smart home environment 100 to provide users with access to media content that is stored locally or streamed from a remote content source (e.g., content host(s) 114).
  • The media devices include media output devices 106, which directly output/display/play media content to an audience, and/or cast devices 108, which stream media content received over one or more networks to the media output devices 106.
  • The media output devices 106 include, but are not limited to, television (TV) display devices, music players, and computer monitors.
  • The cast devices 108 include, but are not limited to, media streaming boxes, casting devices (e.g., GOOGLE CHROMECAST devices), set-top boxes (STBs), DVD players, TV boxes, and so forth.
  • Media output devices 106 are disposed in more than one location, and each media output device 106 is coupled to a respective cast device 108 or includes an embedded casting unit.
  • The media output device 106-1 includes a TV display that is hard wired to a DVD player or a set top box 108-1.
  • The media output device 106-3 includes a smart TV device that integrates an embedded casting unit to stream media content for display to its audience.
  • The media output device 106-2 includes a regular TV display that is coupled to a TV box 108-2 (e.g., Google TV or Apple TV products), and such a TV box 108-2 streams media content received from a media content host server 114 and provides access to the Internet for displaying Internet-based content on the media output device 106-2.
  • Electronic devices 190 are display assistant devices and/or voice assistant devices.
  • The display assistant device 190 is also a voice assistant device.
  • The electronic devices 190 collect audio inputs for initiating various media play functions of the devices 190 and/or media devices 106 and 108.
  • The devices 190 are configured to provide media content that is stored locally or streamed from a remote content source.
  • The electronic devices 190 are voice-activated and are disposed in proximity to a media device, for example, in the same room with the cast devices 108 and the media output devices 106.
  • A voice-activated display assistant device 190-1 is disposed in a room having one or more smart home devices but not any media device.
  • A voice-activated electronic device 190 is disposed in a location having no networked electronic device. This allows the devices 190 to communicate with the media devices and share content that is being displayed on one device with another device (e.g., from device 190-1 to device 190-2 and/or media devices 108).
  • The voice-activated electronic device 190 includes at least one microphone, a speaker, a processor, and memory storing at least one program for execution by the processor.
  • The speaker is configured to allow the electronic device 190 to deliver voice messages to a location where the electronic device 190 is located in the smart home environment 100, thereby broadcasting information related to currently displayed media content, reporting a state of audio input processing, and having a conversation with or giving instructions to a user of the electronic device 190. For instance, in some embodiments, in response to a user query the device provides audible information to the user through the speaker.
  • Visual signals could also be used to provide feedback to the user of the electronic device 190 concerning the state of audio input processing, such as a notification displayed on the device.
  • An electronic device 190 is a voice interface device that is network-connected to provide voice recognition functions with the aid of a server system 140.
  • The server system 140 includes a cloud cast service server 116 and/or a voice/display assistance server 112.
  • An electronic device 190 includes a smart speaker that provides music (e.g., audio for video content being displayed on the device 190 or on a display device 106) to a user and allows eyes-free and hands-free access to a voice assistant service (e.g., Google Assistant).
  • The electronic device 190 is a simple and low-cost voice interface device, e.g., a speaker device and a display assistant device (including a display screen having no touch detection capability).
  • The voice-activated electronic devices 190 integrate a display screen in addition to the microphones, speaker, processor, and memory (e.g., 190-2 and 190-4), and are referred to as “display assistant devices.”
  • The display screen is configured to provide additional visual information (e.g., media content, information pertaining to media content, etc.) in addition to audio information that can be broadcast via the speaker of the voice-activated electronic device 190.
  • The user may review the additional visual information directly on the display screen of the display assistant device.
  • The additional visual information provides feedback to the user of the electronic device 190 concerning the state of audio input processing.
  • The additional visual information is provided in response to the user's previous voice inputs (e.g., user queries), and may be related to the audio information broadcast by the speaker.
  • The display screen of the voice-activated electronic devices 190 includes a touch display screen configured to detect touch inputs on its surface (e.g., instructions provided through the touch display screen).
  • The display screen of the voice-activated electronic devices 190 is not a touch display screen, which is relatively expensive and can compromise the goal of offering the display assistant device 190 as a low-cost user interface solution.
  • When voice inputs from the electronic device 190 are used to control the electronic device 190 and/or media output devices 106 via the cast devices 108, the electronic device 190 effectively enables a new level of control of cast-enabled media devices, independently of whether the electronic device 190 has its own display.
  • The electronic device 190 includes a casual enjoyment speaker with far-field voice access and functions as a voice interface device for Google Assistant.
  • The electronic device 190 could be disposed in any room in the smart home environment 100. When multiple electronic devices 190 are distributed in multiple rooms, they become audio receivers that are synchronized to provide voice inputs from all these rooms. For instance, a first electronic device 190 may receive a user instruction that is directed towards a second electronic device 190-2 (e.g., a user instruction of “OK Google, show this photo album on the kitchen device.”).
  • An electronic device 190 includes a WiFi speaker with a microphone that is connected to a voice-activated personal assistant service (e.g., Google Assistant).
  • A user could issue a media play request via the microphone of electronic device 190, and ask the personal assistant service to play media content on the electronic device 190 itself and/or on another connected media output device 106.
  • The user could issue a media play request by saying to the Wi-Fi speaker “OK Google, Play cat videos on my Living room TV.”
  • The personal assistant service then fulfils the media play request by playing the requested media content on the requested device using a default or designated media application.
  • The display assistant device includes a display screen and one or more built-in cameras (e.g., 190-4).
  • The cameras are configured to capture images and/or videos, which are then transmitted (e.g., streamed) to a server system 140 for display on client device(s).
  • In addition to the voice-activated electronic devices 190, smart home devices could also be mounted on, integrated with, and/or supported by a wall 154, floor 156, or ceiling 158 of the smart home environment 100 (which is also broadly called a smart home environment in view of the existence of the smart home devices).
  • The integrated smart home devices include intelligent, multi-sensing, network-connected devices that integrate seamlessly with each other in a smart home network and/or with a central server or a cloud-computing system to provide a variety of useful smart home functions.
  • A smart home device is disposed at the same location of the smart home environment 100 as a cast device 108 and/or an output device 106, and is therefore located in proximity to, or at a known distance from, the cast device 108 and the output device 106.
  • The smart home devices in the smart home environment 100 include, but are not limited to, one or more intelligent, multi-sensing, network-connected camera systems 132.
  • Content that is captured by the camera systems 132 is displayed on the electronic devices 190 at a request of a user (e.g., a user instruction of “OK Google, Show the baby room monitor.”) and/or according to settings of the home environment 100 (e.g., a setting to display content captured by the camera systems during the evening or in response to detecting an intruder).
  • The smart home devices in the smart home environment 100 may include, but are not limited to, one or more intelligent, multi-sensing, network-connected thermostats 122, one or more intelligent, network-connected, multi-sensing hazard detectors 124, one or more intelligent, multi-sensing, network-connected entryway interface devices 126 and 128 (hereinafter referred to as “smart doorbells 126” and “smart door locks 128”), one or more intelligent, multi-sensing, network-connected alarm systems 130, one or more intelligent, multi-sensing, network-connected camera systems 132, and one or more intelligent, multi-sensing, network-connected wall switches 136.
  • The smart home devices in the smart home environment 100 of FIG. 1 include a plurality of intelligent, multi-sensing, network-connected appliances 138 (hereinafter referred to as “smart appliances 138”), such as refrigerators, stoves, ovens, televisions, washers, dryers, lights, stereos, intercom systems, garage-door openers, floor fans, ceiling fans, wall air conditioners, pool heaters, irrigation systems, security systems, space heaters, window AC units, motorized duct vents, and so forth.
  • The smart home devices in the smart home environment 100 may additionally or alternatively include one or more other occupancy sensors (e.g., touch screens, IR sensors, ambient light sensors, and motion detectors).
  • The smart home devices in the smart home environment 100 include radio-frequency identification (RFID) readers (e.g., in each room 152 or a portion thereof) that determine occupancy based on RFID tags located on or embedded in occupants.
  • RFID readers may be integrated into the smart hazard detectors.
  • Devices 122, 124, 126, 128, 130, 132, 136, 138, and 190 are capable of data communications and information sharing with other smart home devices, a central server or cloud-computing system, and/or other devices (e.g., the client device 104, the cast devices 108, and the voice-activated electronic devices 190) that are network-connected.
  • Each of the cast devices 108 and the voice-activated electronic devices 190 is also capable of data communications and information sharing with other cast devices 108, voice-activated electronic devices 190, smart home devices, a central server or cloud-computing system 140, and/or other devices (e.g., the client device 104) that are network-connected.
  • Data communications may be carried out using any of a variety of custom or standard wireless protocols (e.g., IEEE 802.15.4, Wi-Fi, ZigBee, 6LoWPAN, Thread, Z-Wave, Bluetooth Smart, ISA100.11a, WirelessHART, MiWi, etc.) and/or any of a variety of custom or standard wired protocols (e.g., Ethernet, HomePlug, etc.), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
  • The cast devices 108, the electronic devices 190, and the smart home devices serve as wireless or wired repeaters.
  • A first one of the cast devices 108 communicates with a second one of the cast devices 108 and the smart home devices 120 via a wireless router.
  • The cast devices 108, the electronic devices 190, and the smart home devices 120 may further communicate with each other via a connection (e.g., network interface 160) to a network, such as the Internet 110.
  • The cast devices 108, the electronic devices 190, and the smart home devices 120 may communicate with a server system 140 (also called a central server system and/or a cloud-computing system herein).
  • The server system 140 may be associated with a manufacturer, support entity, or service provider associated with the cast devices 108 and the media content displayed to the user.
  • Any of the connected electronic devices described herein can be configured with a range of capabilities for interacting with users in the environment.
  • An electronic device can be configured with one or more microphones, one or more speakers, and voice-interaction capabilities in which a user interacts with the display assistant device via voice inputs received by the microphone and audible outputs played back by the speakers to present information to users.
  • An electronic device can be configured with buttons, switches, and/or other touch-responsive sensors (such as a touch screen, touch panel, or capacitive or resistive touch sensors) to receive user inputs, and with haptic or other tactile feedback capabilities to provide tactile outputs to users.
  • An electronic device can also be configured with visual output capabilities, such as a display panel and/or one or more indicator lights to output information to users visually.
  • An electronic device can be configured with movement sensors that can detect movement of objects and people in proximity to the electronic device, such as a radar transceiver(s) or PIR detector(s).
  • Inputs received by any of these sensors can be processed by the electronic device and/or by a server communicatively coupled with the electronic device (e.g., the server system 140 of FIG. 1).
  • The electronic device and/or the server processes and/or prepares a response to the user's input(s), which response is output by the electronic device via one or more of the electronic device's output capabilities.
  • The electronic device outputs, via one or more of the electronic device's output capabilities, information that is not directly responsive to a user input, but which is transmitted to the electronic device by a second electronic device in the environment, or by a server communicatively coupled with the electronic device. This transmitted information can be of virtually any type that is displayable/playable by the output capabilities of the electronic device.
  • The server system 140 provides data processing for monitoring and facilitating review of events (e.g., motion, audio, security, etc.) from data captured by the smart devices 120, such as video cameras 132, smart doorbells 126, and display assistant device 190-4.
  • The server system 140 may include a voice/display assistance server 112 that processes audio inputs collected by voice-activated electronic devices 190, one or more content hosts 114 that provide the displayed media content, and a cloud cast service server 116 creating a virtual user domain based on distributed device terminals.
  • The server system 140 also includes a device registry for keeping a record of the distributed device terminals in the virtual user environment.
  • Examples of the distributed device terminals include, but are not limited to, the voice-activated electronic devices 190, cast devices 108, media output devices 106, and smart home devices 122-138. In some implementations, these distributed device terminals are linked to a user account (e.g., a Google user account) in the virtual user domain. In some implementations, each of these functionalities and content hosts is a distinct server within the server system 140. In some implementations, a subset of these functionalities is integrated within the server system 140.
  • The network interface 160 includes a conventional network device (e.g., a router).
  • The smart home environment 100 of FIG. 1 further includes a hub device 180 that is communicatively coupled to the network(s) 110 directly or via the network interface 160.
  • The hub device 180 is further communicatively coupled to one or more of the above intelligent, multi-sensing, network-connected devices (e.g., the cast devices 108, the electronic devices 190, the smart home devices, and the client device 104).
  • Each of these network-connected devices optionally communicates with the hub device 180 using one or more radio communication networks available at least in the smart home environment 100 (e.g., ZigBee, Z-Wave, Insteon, Bluetooth, Wi-Fi, and other radio communication networks).
  • The hub device 180 and devices coupled with/to the hub device can be controlled and/or interacted with via an application running on a smart phone, household controller, laptop, tablet computer, game console, or similar electronic device.
  • A user of such a controller application can view the status of the hub device or coupled network-connected devices, configure the hub device to interoperate with devices newly introduced to the home network, commission new devices, and adjust or view settings of connected devices, etc.
  • FIG. 2 is a block diagram illustrating an example display assistant device 200 that is applied as a voice interface to collect user voice commands in a smart home environment 100 in accordance with some embodiments.
  • the display assistant device 200 typically includes one or more processing units (CPUs) 202, one or more network interfaces 204, memory 206, and one or more communication buses 208 for interconnecting these components (sometimes called a chipset).
  • the display assistant device 200 includes one or more output devices 212, including one or more speakers 252, a display 254 and one or more indicators 256.
  • the display assistant device 200 also includes one or more input devices 210 that facilitate user input, including one or more microphones 242, a volume control 244 and a privacy control 246.
  • the volume control 244 is configured to receive a user action (e.g., a press on a volume up button or a volume down button, a press on both the volume up and down buttons for an extended length of time) that controls a volume level of the speakers 252 or resets the display assistant device 200.
  • the privacy control 246 is configured to receive a user action that controls privacy settings of the display assistant device (e.g., whether to deactivate the microphones 242).
  • the one or more indicators 256 are configured to indicate at least whether the microphones 242 are deactivated (e.g., muted).
  • the input devices 210 of the display assistant device 200 include a touch detection module 248 that is integrated on the display panel 254 and configured to detect touch inputs on its surface.
  • the input devices 210 of the display assistant device 200 include a camera module 250 configured to capture a video stream of a field of view.
  • the input devices 210 of the display assistant device 200 do not include any camera or touch detection module, because these components are relatively expensive and can compromise the goal of offering the display assistant device 200 as a low-cost user interface solution.
  • the display assistant device 200 further includes a presence sensor 260 configured to detect a presence of a user in a predetermined area surrounding the display assistant device 200.
  • the display assistant device 200 operates in a sleep or hibernation mode that deactivates detection and processing of audio inputs, and does not wake up from the sleep or hibernation mode or listen to the ambient environment (i.e., process audio signals collected from the ambient environment) until the presence sensor 260 detects a presence of a user in the predetermined area.
  • An example of the presence sensor 260 is an ultrasonic sensor configured to detect a presence of a user.
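  • The presence-gated wake behavior above can be sketched as a small state model. The class and method names below are hypothetical and only illustrate the logic described in this disclosure, not an actual device API:

```python
class DisplayAssistant:
    """Toy model of the sleep/wake behavior; names are illustrative, not an actual API."""

    def __init__(self):
        self.mode = "sleep"        # audio detection and processing deactivated
        self.processed_audio = []

    def on_presence(self, detected):
        # Wake only when the presence sensor (e.g., ultrasonic) reports a user
        # in the predetermined area surrounding the device.
        if detected and self.mode == "sleep":
            self.mode = "awake"

    def on_audio(self, clip):
        # Ambient audio is ignored entirely while in sleep/hibernation mode.
        if self.mode == "awake":
            self.processed_audio.append(clip)

device = DisplayAssistant()
device.on_audio("hey assistant")    # dropped: device is still asleep
device.on_presence(True)            # presence detected, device wakes
device.on_audio("turn on lights")   # now processed
```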
  • the display assistant device 200 further includes an ambient light sensor 270 (e.g., a white ambient light sensor, an RGB color sensor).
  • the ambient light sensor 270 is configured to detect a light condition in the smart home environment 100 where the display assistant device 200 sits.
  • the display assistant device 200 is configured to adjust a brightness level and/or a color tone of its screen according to the light condition.
  • the ambient light sensor 270 is disposed behind a bezel area of the screen of the display assistant device 200, and exposed to light via a transparent part of the bezel area.
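  • As a sketch, the brightness and color-tone adjustment might map ambient readings to screen settings as follows. The lux thresholds and return values are hypothetical, not taken from any actual device:

```python
def adjust_screen(lux):
    """Map an ambient light sensor reading (lux) to screen settings.
    The thresholds and values here are illustrative assumptions."""
    if lux < 10:                                    # dark room
        return {"brightness": 0.2, "tone": "warm"}
    if lux < 500:                                   # typical indoor lighting
        return {"brightness": 0.6, "tone": "neutral"}
    return {"brightness": 1.0, "tone": "cool"}      # bright daylight
```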
  • Memory 206 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and, optionally, includes non-volatile memory, such as one or more magnetic disk storage devices, one or more optical disk storage devices, one or more flash memory devices, or one or more other non-volatile solid state storage devices. Memory 206, optionally, includes one or more storage devices remotely located from one or more processing units 202. Memory 206, or alternatively the non-volatile memory within memory 206, includes a non-transitory computer readable storage medium.
  • memory 206 stores the following programs, modules, and data structures, or a subset or superset thereof: (1) Operating system 216 including procedures for handling various basic system services and for performing hardware dependent tasks; (2) Network communication module 218 for connecting the display assistant device 200 to other devices (e.g., the server system 140, cast device 108, client device 104, smart home devices 120 and other voice-activated electronic device(s) 190) via one or more network interfaces 204 (wired or wireless) and one or more networks 110, such as the Internet, other wide area networks, local area networks, metropolitan area networks, and so on; (3) Input/output control module 220 for receiving inputs via one or more input devices 210 and enabling presentation of information at the display assistant device 200 via one or more output devices 212.
  • the memory 206 may also include: (1) Voice processing module 222 for processing audio inputs or voice messages collected in an environment surrounding the display assistant device 200, or preparing the collected audio inputs or voice messages for processing at a voice/display assistance server 112 or a cloud cast service server; (2) Display assistant module 224 for displaying additional visual information including but not limited to a media content item (e.g., a YouTube video clip), news post, social media message, weather information, personal picture, a state of audio input processing, and readings of smart home devices; and/or (3) Touch sense module 226 for sensing touch events associated with the touch detection module 248 on a top surface of the display assistant device 200.
  • the memory 206 may also include one or more receiver applications 228 for responding to user commands extracted from audio inputs or voice messages collected in an environment surrounding the display assistant device 200, including but not limited to, a media play application, an Internet search application, a social network application and a smart device application.
  • the memory 206 may also include Display assistant device data 230 storing at least data associated with the display assistant device 200.
  • the data may include display assistant settings 232 for storing information associated with the display assistant device 200 itself, including common device settings (e.g., service tier, device model, storage capacity, processing capabilities, communication capabilities, etc.) and information of a user account 234 in a virtual user domain to which the display assistant device 200 is linked.
  • the data may also include voice control data 236 for storing audio signals, voice messages, response messages and other data related to voice interface functions of the display assistant device 200.
  • the input/output control module 220 further includes an image processing module (not shown) configured to process image data captured by the camera module 250.
  • the image processing module is configured to analyze the image data captured by the camera module 250 and associate biometric features (e.g., face, voice and gesture) recognized from the image data with known or unknown users.
  • User profiles can be selected based on the biometric features to control the display assistant device 200 itself, cast devices 106 or smart home devices adaptively.
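  • A minimal sketch of biometric-based profile selection follows; the feature keys and profile fields are hypothetical placeholders for whatever the image processing module actually recognizes:

```python
# Hypothetical profile store keyed by recognized biometric features.
PROFILES = {
    "face:alice": {"user": "alice", "volume": 0.4},
    "voice:bob":  {"user": "bob",   "volume": 0.8},
}

def select_profile(recognized_features):
    """Return the profile of the first known user; unknown users get a guest profile."""
    for feature in recognized_features:
        if feature in PROFILES:
            return PROFILES[feature]
    return {"user": "guest", "volume": 0.5}
```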
  • Each of the above identified elements may be stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above.
  • the above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various embodiments.
  • memory 206 optionally, stores a subset of the modules and data structures identified above.
  • memory 206 optionally, stores additional modules and data structures not described above.
  • historically, each smart home device manufacturer provided its own “ecosystem” for controlling and monitoring smart home devices. Adding a new device from another manufacturer often required installing and maintaining multiple smart home ecosystems in parallel.
  • the term “ecosystem” refers to a software system and/or operating environment provided by a single manufacturer.
  • the most popular smart home ecosystems at the time of this disclosure include Google Home®, Amazon Alexa®, Apple Homekit®, Samsung SmartThings®, and so forth.
  • Each ecosystem may define its own communication protocols and may be configured to work with devices from the same manufacturer.
  • Each ecosystem may also be controlled by a smart home hub, such as the Nest Hub® and/or by a server.
  • Each ecosystem may also be controlled by an application, such as the Google Home® app operating on a smart phone or other electronic device.
  • each ecosystem may be primarily configured to use devices from the same manufacturer, however ecosystems may also be configured to accept registration and use of devices from other ecosystems.
  • the Google Home® app may be used to monitor, control, and interact with devices from Google® and from other manufacturers together as part of the same smart home environment.
  • FIG. 3 illustrates an example of smart home devices operating within a structure, according to some embodiments.
  • a first device 302 (e.g., a thermostat) may be compatible with a first ecosystem and provided by a first manufacturer, while a second device 304 (e.g., a security system) may be compatible with a second ecosystem and provided by a second manufacturer.
  • Each ecosystem may be controlled and monitored by a control device 308, such as a smart phone or digital home assistant.
  • each ecosystem would be represented by a different application (“app”) on a smart phone, or by a different home assistant device (e.g., a display assistant device and/or a voice assistant device).
  • the first device 302 would be controlled by a first home assistant associated with the first ecosystem
  • the second device 304 would be controlled by a second home assistant associated with the second ecosystem.
  • a user’s smart phone could include “home” applications for each ecosystem.
  • a user would need to toggle back and forth between multiple apps and/or multiple control devices.
  • two ecosystems may operate independently in parallel with each other to accommodate the first device 302 and the second device 304 from different manufacturers.
  • the MATTER® connectivity standard is built upon the Internet Protocol (IP) and runs on, for example, Wi-Fi and Thread network layers, using Bluetooth Low Energy (BLE) for commissioning devices.
  • This unifying protocol enables communication across smart home devices, mobile apps, cloud services, and other devices.
  • the MATTER® standard is an open source, unified connectivity protocol that allows communications between devices from multiple manufacturers (e.g., Google®, Amazon®, Apple®, and so forth).
  • a Google Home® or a Google Hub® may be used to control a Google Nest Thermostat® alongside speakers from another manufacturer, a security system from another manufacturer, etc., in a unified interface.
  • the Google Home® smart phone app may be used to control a Google Nest® doorbell, along with a smart appliance from another manufacturer within the same app.
  • although the MATTER® standard is used here as an example, this disclosure uses the term “multi-ecosystem protocol” to generically refer to any application layer protocol that allows different smart home ecosystems and devices to interoperate, of which the MATTER® standard represents one example.
  • FIG. 3 illustrates how the first device 302 and the second device 304 can be controlled on the same control device 308 operating a single control application.
  • the application may include a unified interface that is provided from a manufacturer that is different from the manufacturers providing the first device 302 and the second device 304.
  • the ecosystem for the application on the control device 308 may display icons 312, 314 for the first device 302 and the second device 304 together such that the first device 302 and the second device 304 can be controlled together in the same smart home control app.
  • the application may allow the user to control, monitor, install, and provision smart home devices 302, 304 and any devices from any ecosystem and/or manufacturer in a single application.
  • the same interface may be provided on a digital home assistant (e.g., display assistant or voice assistant) as described above.
  • the first device 302 and/or the second device 304 may both operate according to the standards defined by the multi-ecosystem protocol.
  • the multi-ecosystem protocol operating, for example, on Wi-Fi and Thread network layers using BLE for device setup may allow the first device 302 and/or the second device 304 to be commissioned individually and controlled within the separate ecosystem of the application operating on the control device 308.
  • compatibility with the multi-ecosystem protocol may be implemented using a firmware update.
  • some newer devices may come with compatibility already built-in with the multi-ecosystem protocol.
  • the multi-ecosystem protocol alleviates some of the communication problems that have plagued the adoption of multi-ecosystem smart home devices.
  • users were often confused about how to discover, set up, and/or control their devices in and across multiple smart home ecosystems.
  • the proliferation of a variety of smart home protocols such as Zigbee®, HomeKit®, Z-Wave®, and others caused incompatibility that may be solved by the multi-ecosystem protocol.
  • This protocol clearly defines network layers to ensure compatibility of the communication stack.
  • a multi-ecosystem protocol only attempts to provide compatibility between devices. On its own, the multi-ecosystem protocol does not solve problems of device commissioning, device sharing, and other common tasks that accompany interoperability.
  • the embodiments described herein provide additional improvements on top of the multi-ecosystem protocol.
  • One of the key user journeys and pain points involves adding devices from one app or ecosystem experience to another.
  • sharing a smart light from a third-party manufacturer with the Google Home® app involves adding a device from one manufacturer into an ecosystem of another.
  • the user experience is still unique and cumbersome for each different device and/or manufacturer.
  • This typically requires creating multiple accounts in multiple apps, and navigating complex linking flows to complete the integration of the new device. These linking flows are often bespoke between different apps and ecosystems, making it difficult to develop a consistent, familiar pattern for users.
  • some devices may have proprietary protocols that are tightly coupled with their associated mobile apps. These devices may require users to navigate individual apps for each device manufacturer in order to set up the device and register it with their chosen ecosystem or platform. This adds additional steps and unnecessary apps to an already frustrating user experience.
  • Some embodiments described herein optimize this process of integrating devices from different ecosystems by sharing permissions between apps.
  • some embodiments include methods and systems for users to share one or more devices paired to or registered with a first mobile application with a second mobile application by guiding the user through a familiar set of steps.
  • These embodiments may conceptually (but not functionally) extend the concept of a “share sheet” where one app can share text with another app.
  • these embodiments allow applications to share devices.
  • an application can generate an extended or customized version of a share sheet indicating that they are sharing a device, such as a smart lightbulb.
  • the user may be allowed to select an application with which the device should be shared from the originating application.
  • the device may then be placed in a multi-admin commissioning mode, and a passcode may be generated.
  • the passcode may be sent to a third-party app to complete the sharing process.
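  • The commissioning-and-passcode hand-off described above can be sketched as follows. The class, method names, and 8-digit passcode format are assumptions for illustration only, not the actual MATTER® commissioning mechanism:

```python
import secrets

class SmartDevice:
    """Toy model of a device that supports a multi-admin commissioning mode."""

    def __init__(self, name):
        self.name = name
        self.admins = set()          # apps/ecosystems that may administer the device
        self.commissioning = False
        self._passcode = None

    def open_commissioning_window(self):
        """Enter multi-admin commissioning mode and return a one-time passcode."""
        self.commissioning = True
        self._passcode = f"{secrets.randbelow(10**8):08d}"  # hypothetical format
        return self._passcode

    def commission(self, app, passcode):
        """A new app presents the passcode to become an additional admin."""
        if self.commissioning and passcode == self._passcode:
            self.admins.add(app)
            self.commissioning = False   # close the window after a successful join
            return True
        return False

bulb = SmartDevice("smart lightbulb")
bulb.admins.add("first_app")                 # already registered with the first app
code = bulb.open_commissioning_window()      # user initiates sharing
bulb.commission("second_app", code)          # passcode forwarded to the third-party app
```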
  • FIG. 4 illustrates a sequence of steps that may be executed by a control device to share a device between apps, according to some embodiments.
  • the process may begin with a first app 401 that is executed by a control device 400, such as a smart phone, a tablet computer, a smart watch, and/or any other computing device.
  • the first app 401 may be configured to control smart home devices in the smart home environment of the structure.
  • the first app 401 may be a control app for a smart home ecosystem (e.g., the Google Home® app).
  • the first app 401 may be a proprietary or customized app that is configured to specifically interface with a smart home device.
  • the first app 401 may be configured to control a smart lightbulb, and may be provided from a manufacturer of the smart lightbulb. In these cases, the first app 401 may be configured to specifically control the smart lightbulb without necessarily being configured to control other smart home devices in the smart home environment.
  • the smart home device 405 may include any smart home device that is already registered with the first app 401.
  • the smart home device 405 may include any smart home device, including the smart home devices described above in FIG. 1, such as thermostats, hazard detectors, security systems, video cameras, streaming consoles, smart appliances, and so forth.
  • the smart home device 405 may be commissioned on the local network and registered with the first app 401.
  • the first app 401 may receive status data from the smart home device 405, may send commands to the smart home device 405, may generate automation sequences involving the smart home device 405, and so forth.
  • the smart home device may also interact with other smart home devices in the ecosystem controlled by the first app 401.
  • the first app 401 may be any application with administrative privileges for the device 405. For example, permissions or credentials may be required in order to register the smart home device 405 with a new app or control device. This prevents malicious actors from commandeering smart home devices when they gain access to a network connection or when they are within wireless transmission range. However, it also prevents the easy sharing of smart home devices between applications on different devices, as well as between applications on the same device.
  • the embodiments described herein allow a user to share registration information of a smart home device 405 between applications on the same control device 400. When initiated by a user, and when shared between apps on the same control device 400, the operating system of the control device 400 can facilitate this sharing of permissions and registration information between apps, as this process is unlikely to be used by a malicious actor.
  • the first app 401 may call a core API from the operating system, which may generate an augmented share sheet that has been customized to share devices.
  • the term “share sheet” refers to a user interface that allows the user to share text or other simple data from one application to another application. For example, a user may select a text string in one application and summon the share sheet for that application. The share sheet may identify other applications on the control device 400 that use the core API from the operating system to receive text. The user may then select one of the applications that have populated the share sheet, and the text or simple data may be sent to that application.
  • the core API from the operating system may be modified to generate an augmented share sheet.
  • This augmented share sheet may be different from the traditional share sheet provided by the platform. Specifically, instead of sharing text, images, files, etc. between applications, this augmented share sheet may provide a unique user experience that clearly indicates to the user that they are sharing a device between apps. As described in detail below, the augmented share sheet allows the user to view smart home devices registered with the first app 401, select a smart home device from the list of registered devices, choose a compatible second app 403, and initiate the sharing of registration and permissions between the two apps 401, 403. Note that some embodiments of the operating system may provide a custom share sheet that is specifically designed to handle the sharing of smart home devices between apps. However, some embodiments may also leverage the existing share sheet utility provided by the platform by augmenting existing functionality with the ability to share a device.
  • FIG. 5A illustrates a user interface 502 for sharing registered smart home devices between apps, according to some embodiments.
  • the user may tap a control on the first app 401 for sharing a smart home device.
  • the first app 401 may provide a menu item or contextual menu that includes a share command when selecting individual smart home devices that are managed by the first app 401.
  • the user interface 502 may query the operating system for a list of apps that can receive a new device shared from the first app 401.
  • the operating system may maintain a list of apps that are registered with the operating system as being compatible with the control and interaction with smart home devices.
  • the second app 403 may register with the operating system as being compatible with smart home device registrations.
  • the first app 401 may query the operating system to retrieve the list of registered apps that are compatible with smart home devices.
  • different device types may be compatible with different application types.
  • the first app 401 may provide the device type to the operating system, which may in turn provide a list of installed apps that are compatible with that specific device type.
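  • The device-type compatibility query might look like the following sketch, where the registry structure and app names are hypothetical stand-ins for the operating system's actual bookkeeping:

```python
# Hypothetical OS-maintained registry: each installed app declares, via metadata,
# which smart home device types it can control.
APP_REGISTRY = [
    {"app": "HomeAppA",  "device_types": {"light", "thermostat"}},
    {"app": "HomeAppB",  "device_types": {"light"}},
    {"app": "CameraApp", "device_types": {"camera"}},
]

def compatible_apps(device_type, exclude=None):
    """Return apps registered as compatible with a device type, excluding the sharer."""
    return [entry["app"] for entry in APP_REGISTRY
            if device_type in entry["device_types"] and entry["app"] != exclude]
```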
  • This list of available apps 510 may be listed at the bottom of the user interface 502. For example, icons may be displayed for each of the available apps 510.
  • some share sheets may also include controls 512, 514 that can be used to generate a QR code or other passcode for sharing the device with other ecosystems.
  • the QR code or passcode can be used to share a device between control devices (e.g., sharing a device between apps on two separate phones rather than between apps on a single phone).
  • the QR code can be displayed on the screen of a first phone and the image of the QR code can be captured by the second phone to share the device with an app on the second phone. Therefore, some embodiments may allow the control device 400 to share the registration of the smart home device with another app that is operating on a second control device that is separate and distinct from the control device 400.
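  • Cross-device sharing could encode the registration information as text rendered in a QR code. The payload format below is a hypothetical sketch, not a format defined by any ecosystem or standard:

```python
import base64
import json

def make_share_payload(device_id, pairing_code):
    """Pack registration info into a text payload suitable for rendering as a QR code."""
    data = json.dumps({"device": device_id, "code": pairing_code})
    return base64.urlsafe_b64encode(data.encode()).decode()

def read_share_payload(payload):
    """Decode a scanned payload back into the pairing information."""
    return json.loads(base64.urlsafe_b64decode(payload.encode()).decode())
```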
  • the user interface 502 may begin to search for devices that are registered with the first app 401 (indicated by the spinning “wait” icon 516).
  • the search may be executed when a network call is performed to retrieve a list of devices.
  • some embodiments may already have the device to be shared selected by the app when the user interface 502 is generated.
  • the first app 401 may already provide a list of devices that can be shared from the current app, and the user may select the device from the list such that the user interface 502 is already populated with the device.
  • FIG. 5B illustrates an example of the user interface 502 with a selected smart home device for sharing between apps, according to some embodiments.
  • the user interface 502 illustrates how a customized share sheet can generate an image or other graphical indicator, such as an icon 518, of the device to be shared.
  • a lightbulb is to be shared from the first app 401 to the second app 403.
  • the icon 518 may have been selected from a list of icons representing smart home devices registered with the first app 401.
  • the icon 518 may have populated the user interface 502 after being selected before the user interface 502 is initiated.
  • the user interface 502 may continue to show a list of apps 510 that can receive the device to be shared.
  • the user interface 502 may indicate that by selecting an app, that app will be able to manage and control the device. Note that the first app 401, the second app 403 selected to share the device, and the device itself may all be from different ecosystems.
  • the user interface 502 may provide the user a list of destination apps 510 that the chosen device can also be registered with, as well as the ability to copy a setup code or QR code if the user wants to link the device with a platform or app that is not on the control device 400 (e.g., installed on their mobile phone).
  • apps may join the list of destination apps 510 by indicating in their metadata that they can receive access to smart home devices through this utility.
  • the user interface 502 illustrates how the device can be shared between the two apps 401, 403.
  • the device may be placed in a multi-admin mode that allows both the first app 401 and the second app 403 to administer the device.
  • the device may be placed in a commissioning mode for the multi-ecosystem protocol so that metadata may be passed to the user interface 502 to populate the devices in the pairable state.
  • a pairing code may then be generated representing secret information. This pairing code may then be passed over to the second app 403 using a secure operating system communication utility (e.g., using an Android® Intent).
  • the second app 403 can go through the setup process with the device.
  • a universal set up utility may be provided by the operating system that handles the setup process for the device to be registered with the second app 403 using the multi-ecosystem protocol. This process is described in detail in the commonly assigned U.S. Patent Application No. 17/838,736.
  • the second app 403 may use its own setup flow to set up the device if necessary. After setup, the device may now be registered and active in both apps, which may represent two separate ecosystems or may be part of the same ecosystem.
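  • The secure hand-off of the pairing code can be modeled as a dispatcher that delivers a message only to the app the user selected. This is an illustrative stand-in for a mechanism like an Android® Intent, not the actual API:

```python
class IntentDispatcher:
    """Toy stand-in for a secure OS messaging utility (e.g., an Android® Intent)."""

    def __init__(self):
        self.inboxes = {}            # one inbox per installed app

    def register(self, app):
        self.inboxes[app] = []

    def send(self, target_app, action, extras):
        # Deliver only to the explicitly named target app; other apps never see it.
        if target_app not in self.inboxes:
            raise KeyError("target app not installed")
        self.inboxes[target_app].append({"action": action, "extras": extras})

dispatcher = IntentDispatcher()
dispatcher.register("second_app")
dispatcher.send("second_app", "SHARE_DEVICE", {"pairing_code": "01427351"})
```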
  • FIG. 6 illustrates a flow diagram 600 of a process for sharing a device between two apps, according to some embodiments.
  • the user 606 may click on a control that indicates they would like to “share a device” registered with the current application 604.
  • the current application may act as a current administrator for the device.
  • the first application 604 may then open a commissioning window with the device 602, such as the user interface 502 described above.
  • this method may be implemented as a service on a mobile control device, such as a universal setup utility service (e.g., implemented in Google Play® services) on the Android® operating system.
  • the first application 604 may send a request to share the device to the service 608.
  • the service 608 may then generate the user interface 502 as described above for display to the user 606.
  • the user 606 may then select a target second application 610 with which the device 602 should be shared.
  • the service may then send a message (e.g., an intent) with information such as a pairing code to the second application 610.
  • the second application 610 may then commission the device 602, which may also use the service 608 to perform the setup of the device 602.
  • FIG. 7 illustrates a system diagram, according to some embodiments.
  • the first application described above may be a third-party application 702 (e.g., Amazon Alexa®), which may be linked to an SDK 704 that is linked in at compile time.
  • the SDK 704 may include a core, app clusters, and data models for use with the multi-ecosystem protocol that communicates with each of the smart home devices.
  • a universal setup utility 706 may handle pairing operations for multi-admin APIs and may utilize built-in proprietary set up flows that are retrieved from external sources as described in the commonly assigned U.S. Patent Application 17/838,736.
  • the universal set up utility 706 may handle the setup, commissioning, registration, and/or provisioning of smart home devices as they are added to an ecosystem and paired with a specific application or user account.
  • the pairing API may provide functions that can be called by the third-party application 702 to perform the setup activities.
  • the multi-admin API may provide functions that allow devices to be shared between applications and/or control devices as described above.
  • the universal set up utility 706 may also include the SDK 704 to perform the communication, setup, and pairing activities with the devices. However, these functions need not be exposed to the third-party application 702, as the universal setup utility 706 may handle this communication entirely.
  • FIG. 8 illustrates a flowchart 800 of a method for sharing smart home devices between applications, according to some embodiments.
  • This method may be executed by an operating system of the control device, such as the Google Android® operating system.
  • this method may be executed by one or more processors on the control device. Instructions to execute this method may be stored on a non-transitory computer-readable medium that, when executed by the one or more processors, cause the one or more processors to perform the operations described below as an application or operating system utility.
  • the method may include receiving a request from a first application operating on a control device to share a smart home device that is registered with the first application (802).
  • the control device may include any computing device (e.g., a smart phone), and the smart home device may include any smart home device, including any of the example devices described above in FIG. 1, such as a thermostat, a hazard detector, a component of a security system, a smart doorbell, a video camera, a child monitor, a smart appliance, an audio speaker, and so forth.
  • This request may include a request to share a specific smart home device, or may include a request to share any of a plurality of smart devices registered with the first application.
  • the method may also include generating a user interface on the control device that displays one or more applications with which the smart home device can be shared (804).
  • the user interface may include a modified or augmented share sheet as described above.
  • the user interface may display icons representing the smart home device, a plurality of smart home devices registered with the first application, and the one or more applications with which the smart home device can be shared.
  • the one or more applications with which the smart home device can be shared may have been previously registered with the operating system.
  • the operating system may identify the one or more applications by virtue of metadata or other tags that indicate that those applications are compatible with the smart home device or smart home devices of a particular type.
  • the method may additionally include receiving a selection of a second application with which to share the smart home device (806).
  • the second application may be selected from the one or more applications with which the device is compatible.
  • the selection may be made by tapping the user interface or otherwise selecting the application from the one or more applications in the user interface.
  • the method may further include sending information to the second application to register the smart home device with the second application (808).
  • the user interface may generate a QR code to be scanned by the second application when the second application operates on a second control device that is separate and distinct from the first control device.
  • the QR code may include information for registering the smart home device with the second application, such as a pairing code.
  • a pairing code may be passed from the first application to the second application using a secure communication utility of the operating system of the control device.
  • the first application may place the smart home device in a multi-admin mode such that the first application and the second application can both administer the smart home device.
  • the second application may set up the smart home device using a multi-ecosystem protocol and a universal setup utility that is part of the operating system of the control device, as described above.
  • the first application may be part of a first smart home ecosystem.
  • the second application may be part of a second smart home ecosystem.
  • the smart home device may be part of a third smart home ecosystem.
  • individual embodiments may have been described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may have described the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed, but could have additional steps not included in a figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination can correspond to a return of the function to the calling function or the main function.
  • computer-readable medium includes, but is not limited to, portable or fixed storage devices, optical storage devices, wireless channels, and various other mediums capable of storing, containing, or carrying instruction(s) and/or data.
  • a code segment or machine-executable instructions may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class, or any combination of instructions, data structures, or program statements.
  • a code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, or memory contents. Information, arguments, parameters, data, etc., may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
  • embodiments may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof.
  • the program code or code segments to perform the necessary tasks may be stored in a machine readable medium.
  • one or more processors may perform the necessary tasks.
  • machine-executable instructions may be stored on one or more machine-readable mediums, such as CD-ROMs or other types of optical disks, floppy diskettes, ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, flash memory, or other types of machine-readable mediums suitable for storing electronic instructions.
  • the methods may be performed by a combination of hardware and software.
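The sharing flow of steps 802-808 above can be sketched in code. This is a minimal illustrative sketch, not the claimed implementation: every class, function, and field name below (App, SmartHomeDevice, compatible_apps, share_device, and the pairing-code format) is a hypothetical stand-in for the operating-system facilities the description refers to.

```python
from dataclasses import dataclass, field

@dataclass
class App:
    """An application registered with the operating system (hypothetical model)."""
    name: str
    # Metadata tags declaring which smart-home device types the app supports,
    # analogous to the tags the OS uses to filter the share sheet (step 804).
    supported_types: set = field(default_factory=set)

@dataclass
class SmartHomeDevice:
    name: str
    device_type: str
    # Applications currently able to administer the device (multi-admin mode).
    admins: list = field(default_factory=list)

def compatible_apps(device: SmartHomeDevice, registry: list[App]) -> list[App]:
    """Step 804: the OS lists only apps whose metadata marks them compatible."""
    return [a for a in registry if device.device_type in a.supported_types]

def share_device(device: SmartHomeDevice, first_app: App, second_app: App) -> dict:
    """Steps 806-808: after the user selects a second app, the first app places
    the device in multi-admin mode and hands over registration information."""
    if first_app not in device.admins:
        raise PermissionError("first app is not an admin of this device")
    device.admins.append(second_app)  # both apps can now administer the device
    # Registration payload to pass via a secure OS channel or encode as a QR code.
    return {"device": device.name, "type": device.device_type,
            "pairing_code": "XXXX-XXXX"}
```

In this sketch the operating system, rather than either ecosystem's cloud, mediates both the compatibility filtering and the handover, mirroring the OS-level permission management the description emphasizes.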
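Similarly, the QR-code handoff described above (a pairing code rendered by the first application and scanned by the second application on a separate control device) amounts to serializing a small registration payload. The field names and the base64-over-JSON encoding below are illustrative assumptions only; real multi-ecosystem protocols such as Matter define their own setup-payload formats.

```python
import base64
import json

def encode_pairing_payload(device_id: str, pairing_code: str, ecosystem: str) -> str:
    """Serialize the registration info that the first application could render
    as a QR code (illustrative fields, not a standardized setup payload)."""
    payload = {"device_id": device_id,
               "pairing_code": pairing_code,
               "ecosystem": ecosystem}
    return base64.urlsafe_b64encode(json.dumps(payload).encode()).decode()

def decode_pairing_payload(text: str) -> dict:
    """What the second application would do after scanning the QR code."""
    return json.loads(base64.urlsafe_b64decode(text.encode()).decode())
```

When both applications run on the same control device, the same payload could instead be passed through a secure inter-process communication utility of the operating system, skipping the QR code entirely.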

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method of sharing smart home devices between applications may include receiving a request from a first application operating on a control device to share a smart home device that is registered with the first application; generating an interface on the control device that displays one or more applications with which the smart home device can be shared; receiving a selection of a second application with which to share the smart home device; and sending information to the second application to register the smart home device.
PCT/US2022/047285 2021-10-20 2022-10-20 Operating-system-level permission management for multi-ecosystem smart-home devices WO2023069622A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22809264.9A EP4420310A1 (fr) Operating-system-level permission management for multi-ecosystem smart-home devices

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202163270042P 2021-10-20 2021-10-20
US63/270,042 2021-10-20
US17/839,463 US12057961B2 (en) 2021-10-20 2022-06-13 Operating-system-level permission management for multi-ecosystem smart-home devices
US17/839,463 2022-06-13

Publications (1)

Publication Number Publication Date
WO2023069622A1 true WO2023069622A1 (fr) 2023-04-27

Family

ID=84360627

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/047285 WO2023069622A1 (fr) Operating-system-level permission management for multi-ecosystem smart-home devices

Country Status (2)

Country Link
EP (1) EP4420310A1 (fr)
WO (1) WO2023069622A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190238358A1 (en) * 2018-02-01 2019-08-01 Bby Solutions, Inc. Automatic device orchestration and configuration
WO2020117302A1 (fr) Efficient controlling and/or linking of smart devices

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190238358A1 (en) * 2018-02-01 2019-08-01 Bby Solutions, Inc. Automatic device orchestration and configuration
WO2020117302A1 (fr) Efficient controlling and/or linking of smart devices

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SIKDER AMIT KUMAR ET AL: "A Survey on Sensor-Based Threats and Attacks to Smart Devices and Applications", IEEE COMMUNICATIONS SURVEYS & TUTORIALS, IEEE, USA, vol. 23, no. 2, 8 March 2021 (2021-03-08), pages 1125 - 1159, XP011856169, DOI: 10.1109/COMST.2021.3064507 *

Also Published As

Publication number Publication date
EP4420310A1 (fr) 2024-08-28

Similar Documents

Publication Publication Date Title
US11990126B2 (en) Voice-controlled media play in smart media environment
JP7293180B2 (ja) Smart-controlled media playback in a smart media environment
US11671662B2 (en) Methods and systems for controlling media display in a smart media display environment
EP2751955B1 (fr) Gestionnaire de ressources et procédé pour transmettre des informations de gestion de ressources relatives à des ressources d'énergie et multimédia intelligentes
US12057961B2 (en) Operating-system-level permission management for multi-ecosystem smart-home devices
WO2017197186A1 (fr) Voice-controlled closed captioning display
JP7393526B2 (ja) Methods, electronic devices, server systems, and programs for providing event clips
EP3406081A1 Methods and systems for automatic media output based on user proximity
US20170238192A1 (en) Electronic apparatus and sensor arrangement method thereof
WO2023069622A1 (fr) Operating-system-level permission management for multi-ecosystem smart-home devices
US20230119058A1 (en) Operating-system-level setup for multi-ecosystem smart-home devices
WO2023069621A1 (fr) Operating-system-level setup for multi-ecosystem smart-home devices

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22809264

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2022809264

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2022809264

Country of ref document: EP

Effective date: 20240521