US20170364828A1 - Multifunction mobile units - Google Patents

Multifunction mobile units

Info

Publication number
US20170364828A1
US20170364828A1
Authority
US
United States
Prior art keywords
user
mobile electronic
day
module
functionalities
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/622,707
Inventor
James Duane Bennett
Bindu Rama Rao
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US15/622,707
Publication of US20170364828A1
Current legal status: Abandoned


Classifications

    • G06F 21/10: Protecting distributed programs or content, e.g. vending or licensing of copyrighted material; Digital rights management [DRM]
    • G06N 99/005
    • A01D 34/008: Control or measuring arrangements for automated or remotely controlled operation of mowers
    • A01D 34/64: Mowers having cutters rotating about a vertical axis, mounted on a vehicle, e.g. a tractor, or drawn by an animal or a vehicle
    • A01D 43/16: Mowers combined with apparatus performing additional operations while mowing, with lawn edgers
    • A47L 7/00: Suction cleaners adapted for additional purposes
    • A47L 9/2805: Controlling suction cleaners by electric means; parameters or conditions being sensed
    • A47L 9/2852: Elements for displacement of the vacuum cleaner or the accessories therefor, e.g. wheels, casters or nozzles
    • A47L 9/2857: User input or output elements for control, e.g. buttons, switches or displays
    • A47L 9/2894: Details related to signal transmission in suction cleaners
    • E04H 4/06: Swimming or splash baths or pools; safety devices; coverings for baths
    • E04H 4/1654: Self-propelled pool cleaners
    • G05D 1/0088: Control of position, course or altitude of land, water, air or space vehicles, characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • G05D 1/0291: Fleet control of a plurality of land vehicles, e.g. fleet or convoy travelling
    • G06F 21/602: Protecting data; providing cryptographic facilities or services
    • G06N 20/00: Machine learning
    • G06Q 30/016: Customer relationship services; after-sales
    • G06Q 30/0601: Electronic shopping [e-shopping]
    • G08B 17/10: Fire alarms actuated by presence of smoke or gases, e.g. automatic alarm devices for analysing flowing fluid materials by the use of optical means
    • G08B 21/08: Alarms for ensuring the safety of persons responsive to the presence of persons in a body of water, e.g. a swimming pool, or to an abnormal condition of a body of water
    • H04L 63/0428: Network security wherein the data content is protected, e.g. by encrypting or encapsulating the payload
    • H04L 63/20: Managing network security; network security policies in general
    • H04L 67/104: Peer-to-peer [P2P] networks
    • H04L 67/12: Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04N 21/414: Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • A01D 2101/00: Lawn-mowers
    • A01D 43/00: Mowers combined with apparatus performing additional operations while mowing
    • A47L 2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L 2201/022: Docking stations; recharging of batteries
    • A47L 2201/04: Automatic control of the travelling movement; automatic obstacle detection
    • G06N 5/04: Inference or reasoning models
    • G06Q 10/20: Administration of product repair or maintenance

Definitions

  • The present invention relates generally to artificial intelligence based assistant devices and, more specifically, to devices that perform household chores and function as mobile general purpose autonomous intelligent machines.
  • vacuum cleaners that function only as vacuum cleaners;
  • lawnmowers that only accomplish lawn mowing (even though they might be robotic, in some sense); and
  • drones that only photograph and video record aerial views of a house or business premises (though some drones may have alternative and/or additional functionalities), and so forth.
  • All the above-mentioned devices are mobile, possessing wheels or rotor blades accordingly. Many of these devices (vacuum cleaners, lawn mowers, drones and such) are autonomous to a certain degree, but require constant attention (for refueling or recharging, for example). These devices exhibit limited human interaction capabilities (that is, they possess some form of user interface) and limited personalization or customizability.
  • Vacuum cleaners, lawnmowers and drones as available today do possess a certain Internet based updating capability. This makes it possible for them to update themselves with the latest firmware available from their manufacturer's server, though with human assistance in most cases.
  • FIG. 1 is a perspective block diagram of a mobile unit infrastructure that performs a plurality of additional household chores and functions as a mobile general purpose autonomous intelligent machine;
  • FIG. 2 is a perspective block diagram detailing the artificial intelligence analyst (decision making) of FIG. 1 ;
  • FIG. 3 is a perspective block diagram detailing the user behavior pattern module of FIG. 1 ;
  • FIG. 4 is a perspective block diagram detailing the Day-In-Life recorder module of FIG. 1 ;
  • FIG. 5 is a perspective block diagram detailing the tier 0-2 user's device of FIG. 1 ;
  • FIG. 6 is a perspective block diagram illustrating a few additional functionalities of the tier 3 user and manufacturer cloud based systems and services, and the artificial intelligence analyst module, of FIG. 1 ;
  • FIG. 7 is a perspective block diagram illustrating functionalities of an exemplary wardrobe management role of the mobile units of FIG. 1 ;
  • FIG. 8 is a flowchart illustrating the processes involved in the autonomous functioning and decision making of the mobile units.
  • FIG. 9 is a flowchart illustrating the processes involved in the Day-In-Life functionality of the mobile units.
  • FIG. 1 is a perspective block diagram of a mobile unit infrastructure 105 that performs a plurality of additional household chores and functions as a mobile general purpose autonomous intelligent machine.
  • Primary functions may include one or more of vacuum cleaning, lawn mowing and typical drone functionalities.
  • The additional autonomous functions include wardrobe management, premises safety, the sentry job, rendering personalized music, self-recharging from a typical electric outlet, self-learning to predict the behaviors of the user, the rest of the family members and guests, and planning and scheduling the functionalities (decision making) accordingly, providing AI (Artificial Intelligence) assistance and support, Day-In-Life recording and support, and so forth.
  • AI Artificial Intelligence
  • The mobile units 183 , 185 , 187 and 189 support wardrobe management functionalities. These functionalities essentially involve clothes sizing, inner garment sizing, socks and shoes sizing, weight measurements and predicting weight gain or loss, height, body volume and BMI (Body Mass Index) measurements, taking stock of clothes in the wardrobe, suggesting clothes for different types of occasions and interfacing with online retailers for new clothes or shoes purchasing (the mobile unit 183 , 185 , 187 or 189 provides the clothes' or shoes' interior dimensions and the user filters the selections), among many other wardrobe related functionalities.
  • Database records of all the measurements taken are stored in the tier 3 user and manufacturer cloud based systems and services 111 .
  • the tier 3 user and manufacturer cloud based systems and services 111 contains wardrobe management module 145 .
  • the mobile units 183 , 185 , 187 and 189 contain user interface module (audio/visual and keyboard interfaces) 193 , artificial intelligence module 195 and user personalization module 197 .
  • The user may come out of the shower and say “Bot, measure me.” Then, the mobile unit 183 , 185 , 187 or 189 circles around the user and says “Your body volume is up by 22% since last month, mostly at your waist line . . . would you like your clothes sizing information updated?” Alternatively, the user may have instructed the mobile unit 183 , 185 , 187 or 189 to measure height, weight and BMI (Body Mass Index) periodically, once a week or a month, for example. The mobile unit 183 , 185 , 187 or 189 may say “Your periodic measurements are being taken, could you please stand in the center of the carpet?” and then start measuring all the above-mentioned sizes, weights and other dimensions.
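The periodic measurements above lend themselves to a small worked example. The sketch below is illustrative only (the function names and sample readings are assumptions, not from the disclosure); it computes BMI from height and weight and the kind of percent body-volume change quoted in the dialogue.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index = weight (kg) divided by height (m) squared."""
    return weight_kg / (height_m ** 2)

def volume_change_pct(current_l: float, previous_l: float) -> float:
    """Percent change in measured body volume since the last reading."""
    return 100.0 * (current_l - previous_l) / previous_l

# Illustrative readings: 80 kg at 1.75 m; volume up from 62.0 L to 75.64 L.
print(round(bmi(80, 1.75), 1))                 # 26.1
print(round(volume_change_pct(75.64, 62.0)))   # 22
```

The 22% figure here simply reproduces the dialogue's example; in practice the readings would come from the unit's measurement sensors.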
  • BMI Body Mass Index
  • The mobile unit 183 , 185 , 187 or 189 provides accurate sizes, weights and other dimensions to the online retailer's systems 160 , which makes returns far less likely. Orders via the online retailer's systems 160 also take into consideration fluctuations in sizes, weight and other dimensions, anticipating the user putting on weight, losing weight or yo-yoing between gaining and losing weight, based on fitness records, for example.
  • The mobile unit 183 , 185 , 187 or 189 also observes the contents of the wardrobe based on what the user wears, figures out the wear and tear on the items (for example, out-of-style materials, worn holes in socks or undergarments, worn knees, rips or missing buttons) and suggests updates or offers items in new coordinating colors.
  • the mobile unit 183 , 185 , 187 or 189 in conjunction with the online retailer's systems 160 for example, also recommends wardrobe makeover or update.
  • the user may put on a shirt from the wardrobe and the mobile unit 183 , 185 , 187 or 189 may say “Do you like that dirty shirt?”
  • the user may answer, “It is my favorite, Bot.”
  • the mobile unit 183 , 185 , 187 or 189 then responds by saying “Ok then, can I offer you some sweater options to hide it from the public?”
  • The mobile units 183 , 185 , 187 and 189 also support premises safety functionalities. These functionalities include smoke detection, fire inspection, alarm triggering, making sure that the smoke and fire have been brought to everyone's attention, extinguishing the fire, alerting the user about unusual sounds (that may indicate destruction of property) or odors, and so forth. Besides smoke detection, the mobile unit 183 , 185 , 187 or 189 also identifies the source of the fire and figures out the extent of the damage. Furthermore, the mobile unit 183 , 185 , 187 or 189 also informs the user, family members and guests about the smoke, fire or odor, its source and the extent of damage.
  • The mobile unit 183 , 185 , 187 or 189 informs the user (or a predetermined responsible person) about the smoke, fire or odor even when there is nobody at home, via a communication network 191 that involves mobile (cell) phones (via SMS, for example) or computers (via emails, for example).
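A minimal sketch of the alerting behavior described above, assuming a simple channel-selection rule (the function name, channel strings and contact model are illustrative assumptions, not from the disclosure):

```python
def dispatch_alert(event, occupants_home, contacts):
    """Pick delivery channels for a safety alert.

    If someone is home, announce locally over the unit's speaker;
    otherwise notify the predetermined responsible persons over both
    SMS and email, as the description suggests.
    """
    if occupants_home:
        return [f"speaker: {event}"]
    return ([f"sms:{c} {event}" for c in contacts] +
            [f"email:{c} {event}" for c in contacts])

# Nobody home: the alert fans out to SMS and email for each contact.
alerts = dispatch_alert("smoke in kitchen", False, ["owner"])
```

A real unit would hand these channel entries to actual SMS/email gateways; here they are plain strings for illustration.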
  • the tier 3 user and manufacturer cloud based systems and services 111 contains safety management module 129 .
  • Another additional autonomous function that the mobile unit 183 , 185 , 187 or 189 performs is rendering personalized music to the user, family member(s) or guest(s).
  • This functionality involves observing and analyzing the behavior of the user, family member(s) or guest(s) as it pertains to their likes and dislikes (studied over a prolonged period), identifying their present moods, and the current contexts and circumstances. For example, in the past the user might have said “I love this song” while coming back from work. Alternatively, during a stressful circumstance, he or she might have said “Bot, please play another song, I don't like this kind of song when stressed out . . .”
  • The mobile unit 183 , 185 , 187 or 189 identifies the current mood and contexts and renders personalized music accordingly. Over a period, the mobile unit 183 , 185 , 187 or 189 begins to make fewer and fewer mistakes.
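The mood-and-context learning described above could be sketched as a tiny feedback tally; the class, the (mood, genre) key and the scoring scheme are illustrative assumptions, not the patent's actual method:

```python
from collections import defaultdict

class MusicPersonalizer:
    """Toy preference learner: tallies like/dislike feedback per
    (mood, genre) pair, then picks the best-scoring genre later."""

    def __init__(self):
        self.scores = defaultdict(int)  # (mood, genre) -> running score

    def feedback(self, mood, genre, liked):
        self.scores[(mood, genre)] += 1 if liked else -1

    def pick(self, mood, genres):
        # Highest accumulated score for the current mood wins.
        return max(genres, key=lambda g: self.scores[(mood, g)])

p = MusicPersonalizer()
p.feedback("stressed", "upbeat", liked=False)  # "I don't like this kind..."
p.feedback("stressed", "calm", liked=True)
print(p.pick("stressed", ["upbeat", "calm"]))  # calm
```

Fewer mistakes over time falls out naturally: each correction shifts the tally toward the user's actual preference for that mood.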
  • the tier 3 user and manufacturer cloud based systems and services 111 contains user behavior pattern module 139 and behavior observation module 147 .
  • The mobile unit 183 , 185 , 187 or 189 also solves many everyday problems, acting as a wandering WiFi hub, a light turner-offer, and a mobile place to integrate preexisting artificial intelligence engines, working with RoboApps via a Robo SDK (Software Development Kit), integrating with presently available cloud artificial intelligence engine tools, and so forth.
  • The mobile unit 183 , 185 , 187 or 189 first verifies the WiFi signal strength near the user (who is working on a device that needs an Internet connection to function, for example) and, if the signal strength is too low, amplifies the signal and rebroadcasts it within the vicinity of the user.
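The signal-strength check described above might look like the following sketch; the RSSI threshold is an illustrative assumption (the disclosure does not specify one):

```python
RSSI_FLOOR_DBM = -70  # assumed floor below which the unit repeats the signal

def should_rebroadcast(rssi_dbm):
    """Decide whether to amplify and rebroadcast the Wi-Fi signal
    near the user, based on measured received signal strength (dBm)."""
    return rssi_dbm < RSSI_FLOOR_DBM

print(should_rebroadcast(-82))  # True  (weak signal: act as a repeater)
print(should_rebroadcast(-55))  # False (signal is already usable)
```

In practice the unit would sample RSSI at its current position near the user and re-check after moving or rebroadcasting.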
  • The mobile unit 183 , 185 , 187 or 189 offers to switch off the light, thereby saving electricity.
  • The mobile unit 183 , 185 , 187 or 189 may follow the user while he or she tries to make out a noise coming from outside the home, switching on a built-in flashlight or the lights along that path.
  • The mobile unit 183 , 185 , 187 or 189 can position itself precisely to have a look at things, assuming exact locations and angles/configurations.
  • One more application (controlled via a cell phone app) that the mobile unit 183 , 185 , 187 or 189 performs is that of a sentry job.
  • Some of the mobile units 183 , 185 , 187 and 189 are designed to function indoors, some others outdoors, and many others perform both indoors and outdoors. When functioning within a household or business premises, if there is more than one mobile unit 183 , 185 , 187 or 189 , they network together to share responsibilities.
  • the sentry job involves both indoor and outdoor functionalities and these devices network together to perform a plurality of functionalities.
  • A vacuum cleaner 185 does the sentry job inside the house during nights, keeping a vigil, while a lawn mower 189 investigates sounds or flashing lights outside the house and responds appropriately.
  • the lawn mower 189 patrols around the house periodically, in an unpredictable manner (to catch the intruders off-guard), and communicates with the vacuum cleaner 185 about the intruders.
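The unpredictable patrol timing described above could be sketched as a uniform random draw over a night window; the window bounds, number of rounds and the uniform distribution are illustrative assumptions, not from the disclosure:

```python
import random

def patrol_times(n, night_start_h=22.0, night_end_h=6.0, seed=None):
    """Draw n patrol start times (hours on a 24 h clock) uniformly over
    the night window, so rounds stay unpredictable to an intruder."""
    rng = random.Random(seed)
    span = (24.0 - night_start_h) + night_end_h  # 22:00 -> 06:00 is 8 h
    return sorted((night_start_h + rng.uniform(0, span)) % 24.0
                  for _ in range(n))

rounds = patrol_times(4, seed=1)  # e.g. four random rounds tonight
```

Seeding is only for reproducibility in this sketch; a deployed unit would leave the generator unseeded precisely so the schedule cannot be anticipated.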
  • the network of vacuum cleaner 185 , lawn mower 189 and drone 187 together identify the source of the intrusion and inform the user, family members and guests, during nights, for example, by waking them up, about the intrusion and damage done if any.
  • The mobile units 183 , 185 , 187 and 189 also assemble topography details and share them among themselves, to determine the path to follow within the house or building area, garden and backyard, to perform a specific functionality such as the sentry job.
  • The artificial intelligence module 195 is adaptable to follow worn paths, roadways, sidewalks, furrows and ground cover transition edges, and can be fully automated, semi-automated or fully driven remotely (via Internet interaction).
  • The mobile units 183 , 185 , 187 and 189 can select the nearest charger to recharge themselves.
  • Using vision systems and artificial intelligence instead of charging stations, some of the mobile units 183 , 185 , 187 and 189 simply identify the nearest electric outlet and plug their charging cables into it. This is done by a flexible telescopic system that rises from the unit toward the electric outlet and plugs in.
  • Alternatively, the same goal can be accomplished via a vertical rail system, along which the plug rises toward the electric socket and plugs itself in.
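Selecting the nearest charger or outlet, as described above, could be sketched as a simple nearest-neighbor choice over locations known from the unit's vision system; the flat 2-D coordinate model is an illustrative assumption:

```python
import math

def nearest_outlet(unit_xy, outlets):
    """Pick the closest known electric outlet by straight-line
    (Euclidean) distance from the unit's current position."""
    return min(outlets, key=lambda o: math.dist(unit_xy, o))

# Unit at the origin; three outlets mapped by the vision system.
print(nearest_outlet((0, 0), [(5, 5), (1, 2), (-3, 0)]))  # (1, 2)
```

A real unit would substitute path length through the mapped floor plan for straight-line distance, since walls and furniture intervene.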
  • The mobile units 183 , 185 , 187 and 189 can predict and forecast the results of many of the actions to be taken. This allows them to compare the present reality with a possible reality (if certain actions are taken) via their VR (Virtual Reality) capabilities. This can be done for various prospective actions, for example, before-and-after views for repairs, furniture layout, lawn repairs, roof repairs, laying out carpets and so forth. This allows the user to correctly and confidently experiment with new possibilities for his or her home, garden and premises.
  • The mobile unit 183 , 185 , 187 or 189 , via a VR headset, demonstrates the look and feel of the lawn and garden with many new layouts, showing a variety of possibilities to try. This saves the user money and allows him or her to move ahead with new arrangements for the garden confidently, with less likelihood of disappointment over the money spent.
  • The mobile units 183 , 185 , 187 and 189 collect data by observing patterns in the behaviors of the user, family members and guests, following them closely during the initial stages. For example, the vacuum cleaner 185 may begin to follow the user at all times after purchase and collect data about what he or she does. It observes that as soon as the user wakes up, he or she heads straight to the bathroom to brush, then takes a shower, has breakfast and leaves home for work.
  • This type of data collected over a prolonged period allows the vacuum cleaner 185 to assist the user in a variety of ways.
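The routine learning sketched in the preceding bullets can be illustrated with a toy frequency model; the activity labels and the position-wise majority rule are illustrative assumptions, not the disclosed algorithm:

```python
from collections import Counter

def learn_routine(observations):
    """Tiny routine extractor: for each step position in the observed
    daily sequences, keep the most frequently seen activity."""
    length = min(len(day) for day in observations)
    return [Counter(day[i] for day in observations).most_common(1)[0][0]
            for i in range(length)]

# Three observed mornings; the third day deviates slightly.
days = [["wake", "brush", "shower", "breakfast", "leave"],
        ["wake", "brush", "shower", "breakfast", "leave"],
        ["wake", "brush", "breakfast", "shower", "leave"]]
print(learn_routine(days))
# ['wake', 'brush', 'shower', 'breakfast', 'leave']
```

With a learned routine like this, the unit can anticipate the next activity and offer assistance before being asked.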
  • Collected data, in aggregation from many users, also allows the tier 3 user and manufacturer cloud based system and services 111 to predict the behaviors of users in general.
  • the mobile units 183 , 185 , 187 and 189 communicate with each other to determine the salient features (as well as mundane features) of the day's events of the user, family members and guests and offer to store the recorded audio/video clips of them.
  • the vacuum cleaner 185 and drone 187 for example, record events of a party held at home, by the user, family members and guests, and store them in the tier 3 user and manufacturer cloud based system and services 111 , with the user's permission.
  • the mobile units 183 , 185 , 187 and 189 also offer to store weather conditions, many other (news) happenings and so forth.
  • the tier 0-2 user's device 181 contains user interface module 193 , artificial intelligence module 195 , user personalization module 197 and premises safety module 199 .
  • The tier 3 user and manufacturer cloud based system and services 111 contains operational control support 113 , user/app command interface 115 and voice recognition and synthesis 117 , WiFi/Bluetooth control module, repair support module 121 , Day-In-Life recorder module 143 , artificial intelligence engines 123 , sensor processing support 125 and voice, security, recognition 127 , safety management module 129 , prediction/forecasting module 131 , wardrobe management module 145 , sentry support module 133 , user personalization module 135 and music rendering module 137 , user behavior pattern module 139 , artificial intelligence analyst (decision making) 141 and behavior observation module 147 .
  • Communication networks 191 include the Internet, intranets and household wired or wireless (via Wi-Fi, Bluetooth®, optical or infrared) communication networks.
  • FIG. 2 is a perspective block diagram detailing the artificial intelligence analyst (decision making) of FIG. 1 .
  • The tier 0-2 user's devices 281 are smart machines; they can learn from their environment and make decisions autonomously. Over a period after activation (that is, from the day they are put to work), they begin to improve and perfect their responses to environmental inputs, which include specifically the user's inputs but also those of family members and guests.
  • the mobile units 283, 285, 287 and 289 learn, with functional support from the artificial intelligence analyst 211, by collecting data on the conditions in which they operate and then identifying patterns in these environmental inputs. Further, they plan and schedule the repetition of such operations in the future based on the patterns detected.
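The learn-then-schedule loop described above can be sketched, in greatly simplified form, as follows; the observation format, activity names and recurrence threshold are assumptions invented for the sketch, not part of the disclosure.

```python
from collections import Counter

def detect_routine(observations, min_occurrences=3):
    """Find (hour, activity) pairs that recur often enough to be
    treated as a routine (a hypothetical recurrence rule)."""
    counts = Counter((hour, activity) for hour, activity in observations)
    return {pair for pair, n in counts.items() if n >= min_occurrences}

def build_schedule(routines):
    """Turn detected routines into a repeatable daily schedule,
    ordered by hour of the day."""
    return sorted(routines)

# Example: a week of observed (hour, activity) events
observed = [(7, "tv_on"), (7, "tv_on"), (7, "tv_on"),
            (8, "vacuum"), (8, "vacuum"), (8, "vacuum"), (12, "vacuum")]
schedule = build_schedule(detect_routine(observed))
print(schedule)  # [(7, 'tv_on'), (8, 'vacuum')]
```

In practice the inputs would be derived from sensor and audiovisual data rather than pre-labeled activities, but the detect-then-schedule shape is the same.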
  • the artificial intelligence based operations of the mobile units 283, 285, 287 and 289 in a user's premises stretch from automating mundane tasks (vacuuming or lawn mowing, for example) to predicting user, family member or guest behaviors and suggesting different possible actions to them.
  • the vacuum cleaner 285 follows the user as soon as he or she returns from work (at the beginning of its life as a mobile unit) and begins to ask questions, such as "What kind of music would you like as soon as you return from work?" and "What kind of drinks would you like prepared now?", and, in conjunction with the artificial intelligence analyst 211, analyzes the answers to determine the best possible actions to take after the user returns from work.
  • the vacuum cleaner 285 also follows the user to learn the house layout and where things are placed. Then, it begins to offer many possibilities and lets the user choose what they want right then.
  • Artificial intelligence based operations of the mobile units 283, 285, 287 and 289 supplement the user's own regular scheduling and planning.
  • the user receives images and graphics showing weather patterns, lawn mowing and repairs to be conducted to mitigate weather impact (including what-if scenarios for repairs), and rescheduling of outdoor sprinkler and lawn mowing operations. For example, the system may lower the grass height to be trimmed while simultaneously increasing sprinkler watering time upon determining that the next week is anticipated to be hotter and drier than usual.
  • the artificial intelligence analyst 211 employs proprietary algorithms that weigh hundreds of variables to produce probabilistic forecasts of weather impact on a given user's or residence's lawn mowing schedule, repair schedule, home package delivery schedule, vacuum cleaning schedule and so forth. To perform this functionality, the artificial intelligence analyst 211 includes a probabilistic forecast module 219. The artificial intelligence analyst 211 then recommends ways to lessen disruptions, or to save money and time if rescheduling is inevitable. It presents different possible outcomes of various decisions and ranks possible scenarios associated with anticipated weather patterns. If the user's inputs are not available at that time, the artificial intelligence analyst 211 makes decisions autonomously and responds to the calamity by mitigating its impact. Part of the artificial intelligence analyst 211 is stand-alone software, such as decision support software in the tier 0-2 user's devices 281; the other part is embedded within larger software systems in the artificial intelligence analyst (decision making) 211.
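A probabilistic forecast of this kind can be illustrated, at its simplest, as a normalized weighted combination of risk variables followed by a ranking of candidate scenarios; the variable names, weights and the two candidate schedules below are hypothetical, and a real forecast module would weigh hundreds of variables rather than three.

```python
def weather_impact_probability(variables, weights):
    """Combine weighted risk variables (each scaled to 0..1) into a
    probability-like disruption score via a normalized weighted mean."""
    total_weight = sum(weights[name] for name in variables)
    score = sum(weights[name] * value for name, value in variables.items())
    return score / total_weight

# Hypothetical variables and weights for a lawn mowing decision
weights = {"rain_chance": 0.5, "wind_speed": 0.3, "temperature_anomaly": 0.2}
variables = {"rain_chance": 0.8, "wind_speed": 0.4, "temperature_anomaly": 0.1}
p = weather_impact_probability(variables, weights)

# Rank candidate scenarios by disruption risk, least risky first
scenarios = {"mow_today": p, "mow_tomorrow": 0.2}
ranking = sorted(scenarios, key=scenarios.get)
print(ranking)  # ['mow_tomorrow', 'mow_today']
```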
  • the autonomous decision support module 213 provides support to the tier 0-2 user's devices 281 (such as telescopic image bot 283, vacuum cleaner 285, indoor/outdoor drone 287 and lawnmower 289) as and when needed; when offline, the tier 0-2 user's devices 281 perform these functions by themselves. It interacts with most of the modules of the artificial intelligence analyst 211 as well as the tier 0-2 user's devices 281 to provide the tier 0-2 user's devices 281 with autonomous response capabilities. For example, the autonomous decision support module 213 assists the tier 0-2 user's devices 281 in constructing a schedule of autonomous operation for themselves.
  • the tier 0-2 user's device 281 such as telescopic image bot 283 , vacuum cleaner 285 , indoor/outdoor drone 287 and lawnmower 289 .
  • the artificial intelligence analyst 211 conducts data collection and management for behavioral tracking, prediction and recommendation. To perform this, the artificial intelligence analyst 211 includes a user behavior prediction module 215 and a guest user behavior prediction module 217.
  • the robotic movement and decision making module 221 assists the household or business premises mobile units 283, 285, 287 and 289 in networking together by sharing necessary information, in making decisions (for all cooperating smart tier 0-2 user's devices 281), and in robotic movements as well.
  • the situational and threat analysis module 223 provides the mobile units 283, 285, 287 and 289 with the capability to analyze contexts (situation based) and respond accordingly, and to analyze threats to the people living in the premises (that is, the user, family members and guests) and to pets.
  • the natural language processing module 225 provides the capability of processing incoming spoken language and responding (user interaction capabilities) accordingly.
  • the medical diagnostics (and symptoms) module 233 supports the tier 0-2 user's devices' 281 capability of diagnosing physical and mental illnesses based on symptoms (though not as a substitute for a licensed practitioner's diagnosis).
  • heart disease, Coronary Artery Disease (CAD) or Ischemic Heart Disease, for example, is among the most common diseases in the world
  • the most common heart disease symptoms include chest pain (angina, generally triggered by physical or emotional stress), shortness of breath (occurring when the heart cannot pump enough blood to meet the body's needs), extreme fatigue with exertion, and heart attack.
  • a medical patient treatment module 235 manages the tier 0-2 user's devices 281 capability of providing patients (who are already diagnosed with a disease by a medical practitioner) living in the premises with treatment and therapy.
  • the tier 0-2 user's devices 281 have the capability of providing therapeutic assistance for the user's or a family member's diabetes by attempting to restore carbohydrate metabolism to a normal state. (Patients with diabetes tend to have an absolute deficiency of insulin.) This goal is achieved by the tier 0-2 user's devices 281 by providing insulin replacement therapy (as prescribed by the therapist), which is given through injections or an insulin pump.
  • the interactive computer avatar module 231 comprises chat bots; it provides the tier 0-2 user's devices 281 with the capability of having different avatars talk with different residents and guests of the premises.
  • Pattern recognition module 237 provides image, signal or sequence based pattern recognition (in a complex stream of data) capabilities.
  • the next sequence/games prediction module 239 enables the predictions used in games and in behavior prediction.
  • Identity systems module 241 is utilized in identifying living residents, handymen and others authorized to work at the premises, as well as intruders within the premises.
  • Advanced decision simulation module 243 provides the tier 0-2 user's devices 281 capability to perform advanced decision simulation for entertainment purposes, such as anticipating the movies residents would want to watch, music that residents might prefer to listen to at a given time and so forth.
  • anticipatory operations module 227 provides the tier 0-2 user's devices 281 capability to perform anticipatory operations for the premises defense.
  • Automated service & management 229 performs automated services of the tier 0-2 user's devices 281 and management of service activities for the tier 0-2 user's devices 281 .
  • Surveillance systems monitoring module 245 provides prediction and analysis capabilities to the tier 0-2 user's devices 281 .
  • Cognitive assistance module 247 reasons, learns, and accepts guidance to provide effective and personalized support to the users, family members and guests. Finally, believable and intelligent non-player characters (not shown) are also utilized to enhance the user's, family member's and guest's gaming experiences.
  • the communication between the systems occurs via communication networks 291, which include the internet, intranet and household wired or wireless (via Wi-Fi, Bluetooth®, optical or infrared) communication networks.
  • FIG. 3 is a perspective block diagram detailing the user behavior pattern module of the FIG. 1 .
  • the user behavior pattern module 311 consists of user behavior observation support module 313 , which provides support (higher level behavioral observation processing support) to the basic level observations done by behavior observation module 393 of the tier 0-2 user's devices 381 .
  • the user behavior observation starts with the audiovisual recording of face and facial muscle movements, and determining that the face belongs to the user (owner) of the mobile unit 383 , 385 , 387 or 389 .
  • the user behavior observation also involves bodily movements' video and audio recording.
  • the environmental context in which the user exists is also recorded.
  • This information is stored, for further processing, in the server (tier 3 user and manufacturer cloud based system and services 111 of FIG. 1).
  • the family member/guest behavior observation support module 319 in conjunction with behavior observation module 393 of the tier 0-2 user's devices 381 , performs similar recording and processing with regards to each of the family members and guests (though it is done with somewhat lesser storage and power consumption than that of the user or owner).
  • the user behavior pattern module 311 also consists of user behavior pattern identification support module 315 and family member/guest behavior identification support module 321 .
  • the user behavior pattern identification support module 315 works in conjunction with behavior identity module 395 of the tier 0-2 user's devices 381 .
  • the most important function of the user behavior pattern identification support module 315 is to find patterns in the recorded audiovisual streams. For example, the incoming stream of audiovisual information (coming in live, from the sensors) is compared with the stored information to find patterns of behavior. The contexts (coming in live, from the sensors) are also compared with the stored information to find patterns, in an analogous manner.
  • the family member/guest behavior identification support module 321 processes in a similar manner for the family members and guests (and works in conjunction with the behavior identity module 395 of the tier 0-2 user's devices 381 as well). These patterns are stored in the tier 3 user and manufacturer cloud based system and services 111 of FIG. 1 for future use (assisting in the autonomous functioning of the mobile units 383, 385, 387 and 389).
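The comparison of live feature streams against stored patterns can be sketched as a nearest-pattern match over feature vectors; the vectors, labels and similarity threshold below are invented for illustration, and a real system would extract features from audiovisual streams rather than use hand-written vectors.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def match_behavior(live_features, stored_patterns, threshold=0.9):
    """Return the label of the stored pattern most similar to the
    live feature vector, or None if nothing passes the threshold."""
    best_label, best_score = None, threshold
    for label, pattern in stored_patterns.items():
        score = cosine_similarity(live_features, pattern)
        if score >= best_score:
            best_label, best_score = label, score
    return best_label

# Hypothetical stored behavior patterns
stored = {"arriving_home": [1.0, 0.2, 0.0], "cooking": [0.1, 0.9, 0.4]}
print(match_behavior([0.95, 0.25, 0.05], stored))  # arriving_home
```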
  • the user behavior pattern analysis support module 317 works together with the behavior analysis module 397 of the tier 0-2 user's devices 381 to analyze the identified behavior patterns.
  • the family member/guest behavior analysis support module 323 also performs similar processing for the family members and guests (and works in conjunction with the behavior identity module 395 of the tier 0-2 user's devices 381). These analyses are performed by the user behavior pattern analysis support module 317 or the family member/guest behavior analysis support module 323, based mostly upon statistical methodologies. These processes of analysis assist in the generation of autonomous operations of the mobile units 383, 385, 387 and 389.
  • the communication between the systems occurs via communication networks 391, which include the internet, intranet and household wired or wireless (via Wi-Fi, Bluetooth®, optical or infrared) communication networks.
  • FIG. 4 is a perspective block diagram detailing the Day-In-Life recorder module of FIG. 1 .
  • Day-In-Life recording, as a concept, essentially involves keeping a chronicle of the salient features of each day of the user (and the user's family members). This includes maintaining an audiovisual record of the user and family members, the news happenings of the day and the user's and family members' reactions to them, the weather conditions of the day, and the user's and family members' emotions throughout the day. All these records are linked to each other and are kept with the user's (and family members') permission. The user can edit the Day-In-Life records if they so wish.
  • the days for which records are kept need not be every day; records may be kept only for days on which special events are going to take place, only for prescheduled days (once a week, once a month or once a year, for example), only for prescheduled times during every day (only after returning from work, for a couple of hours, for example) and so forth.
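Such a recording policy reduces to a simple predicate over the current time; the parameter names and the example policy (special-event days, a weekly day, and a two-hour evening window) are assumptions made for this sketch.

```python
from datetime import datetime

def should_record(now, special_event_days, weekly_day=None, daily_window=None):
    """Decide whether Day-In-Life recording applies right now:
    on special-event days, on a prescheduled weekday, or within a
    prescheduled daily time window."""
    if now.date() in special_event_days:
        return True
    if weekly_day is not None and now.weekday() == weekly_day:
        return True
    if daily_window is not None:
        start_hour, end_hour = daily_window
        return start_hour <= now.hour < end_hour
    return False

# Record every Saturday (weekday 5) and for two hours after work (18:00-20:00)
now = datetime(2017, 6, 14, 18, 30)  # a Wednesday evening
print(should_record(now, set(), weekly_day=5, daily_window=(18, 20)))  # True
```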
  • the Day-In-Life recorder module 411 consists of a user behavior observation module 413, which assists in recording the user in their environment and storing the recordings.
  • the user behavior observation involves audiovisual recording of the user's face and body, after determining to whom the face belongs (user, family member, guest or intruder), and recording the family members and guests and the environmental context in which the user, family members and guests exist. This information is stored, as a record of events, in the server (tier 3 user and manufacturer cloud based system and services 111 of FIG. 1).
  • the purpose of the user behavior observation module 413 is just to identify the user, family members, guests and intruders (if any) and identify the events which might be important (to be able to record the salient features of the day), but not necessarily to take actual pictures or audiovisual clips of high quality.
  • the Day-In-Life recorder module 411 also consists of camera management support module 415 and video clip management support module 419 .
  • the user behavior observation module 413 identifies user determined or self-determined moments (either periods at which the images/audiovisual clips are to be taken, or periods determined based upon information collected from various sources), and the camera management support module 415 and/or video clip management support module 419 instruct the mobile unit to take high quality images/audiovisual clips and store them in photo storage database 417 and/or video clip storage database 421.
  • the special events identification module 431 identifies special moments in the user's and family members' lives and stores them for use, as mentioned above, by the user behavior observation module 413.
  • the Calendar/Diary management support module 433 maintains written records (based on inputs from the user, family members and guests) of their opinions, suggestions and simple explanations of events, for example.
  • the weather conditions record module 439 stores a record of weather events occurring outside the premises.
  • the Day-In-Life search module 437 makes searching through the photo storage database 417 and/or video clip storage database 421 possible.
  • the tier 0-2 user's devices 481 (such as mobile units 483, 485, 487 and 489) contain a camera photo/video clip record module 493, event record scheduling module 495, calendar/diary record module 497 and weather conditions identification module, which contain the same processing features as the Day-In-Life recorder module 411 at very basic levels, so that the mobile unit 483, 485, 487 or 489 can continue to function in case of emergencies (Internet unavailability, for example).
  • the mobile unit 483, 485, 487 or 489 recognizes the command, and the Day-In-Life recorder module 411 directs the mobile unit 483, 485, 487 or 489 to capture the video/image/audio.
  • communication networks 491, which include the internet, intranet and household wired or wireless (via Wi-Fi, Bluetooth®, optical or infrared) communication networks.
  • FIG. 5 is a perspective block diagram detailing the tier 0-2 user's device of FIG. 1 .
  • the functionalities available with the tier 0-2 user's device 581 are not much different from those available at the tier 3 user and manufacturer cloud based systems and services 511 (as described with reference to FIG. 1, FIG. 2, FIG. 3 and FIG. 4), but they are available at basic levels, so that they are resource-efficient and can function autonomously (even when an Internet connection is not available, for example).
  • Typical tier 0-2 user's devices 581 include telescopic image bot 583 , vacuum cleaner 585 , indoor/outdoor drone 587 , lawnmower 589 among many other specific designs that serve a plurality of purposes. They have basic level autonomous capabilities as well as advanced capabilities, when they are connected to the tier 3 user and manufacturer cloud based systems and services 511 or other custom built servers.
  • the artificial intelligence module 525 provides some basic AI functionalities to the mobile units 583, 585, 587 and 589, and also interacts with the tier 3 user and manufacturer cloud based systems and services 511, working cooperatively to provide advanced AI functionalities.
  • the user or a family member might say, “Bot, play me some music . . . ”
  • the vacuum cleaner 585 checks the mood of the user or family member, by taking a video clip of the face and sending it to the tier 3 user and manufacturer cloud based systems and services 511, and plays music that they like for such a mood. This decision accounts for the experience gained by the vacuum cleaner 585 through interacting with them over time.
  • User interface module 521 manages gesture, voice, visual, keyboard and remote control communications based interactions with the user, family members or anyone else.
  • the wireless communication (WiFi/BT) module 525 manages wireless communications by identifying possibilities for wireless connections (via WiFi and Bluetooth, for example) and logging on to the Internet autonomously.
  • Lighting control module 527 manages the lighting of the house and premises while music rendering module 531 handles the music delivery aspects of the tier 0-2 user's devices 581 .
  • Live camera support module 533 handles built-in camera for live communications or live broadcast.
  • the user personalization module 541 manages the user's, family members' and guests' personalization information, gathered either via a voice/video/screen-keyboard based questionnaire or via everyday interactions with them. This information is utilized for future interactions with the user, family members and guests.
  • premises safety modules 543 are also built into the vacuum cleaners 585, lawn mowers 589 and drones 587, which network together to perform safety related functionalities. They identify the source of an intrusion, for example, and inform the user, family members and guests, especially during the night. The tier 0-2 user's devices 581 wake them up and inform them about the intrusion and any damage done. They also provide emergency medical support to the users, family members and guests.
  • the GPS based device positioning module 545 assists in positioning the mobile unit 583, 585, 587 or 589 at a specific position and angle. This functionality is essential, for example, in taking pictures or audiovisual clips from the clearest and best possible angle in a party environment (with many guests), to get that perfect picture or video clip.
  • the self-recharging support module 547 manages aspects related to the recharging of the mobile units 583, 585, 587 and 589, all by themselves. When the power level drops below a preset level, the mobile unit 583, 585, 587 or 589 identifies a charging station or electric outlet and plugs itself in to recharge, without human assistance.
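At its simplest, the self-recharging behavior is a threshold check followed by selection of the nearest known charging point; the station names, distances and threshold value in this sketch are illustrative assumptions.

```python
def recharge_step(battery_level, known_stations, threshold=0.2):
    """One decision step of the self-recharging behavior: below the
    preset threshold, head for the nearest known charging station;
    with no known station, search for an electric outlet."""
    if battery_level >= threshold:
        return "continue_task"
    if not known_stations:
        return "search_for_outlet"
    # known_stations maps station name -> distance from the unit
    nearest = min(known_stations, key=known_stations.get)
    return f"dock_at:{nearest}"

stations = {"kitchen_outlet": 4.0, "hallway_dock": 1.5}
print(recharge_step(0.12, stations))  # dock_at:hallway_dock
```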
  • the AR house and garden mapping module 549 manages the mapping of the entire house or business premises and keeps the maps stored for future use.
  • the AR (Augmented Reality) functionality requires that this map be available, for example, for the user to view remotely, or to visualize how the house would appear after repairs are done, carpet is changed or walls are painted.
  • Lab-on-a-chip diagnostics module 551 handles a plurality of diagnosis related functions of the mobile units 583 , 585 , 587 and 589 .
  • the clothing and wardrobe support module 553 handles the networking of some of the mobile units (such as vacuum cleaner 585 and drone 587) and the performance of the wardrobe related functionalities (as described with reference to FIG. 1).
  • the communication between the tier 3 user and manufacturer cloud based systems and services 511 and the mobile units 583, 585, 587 and 589 occurs via communication networks 591, which include the internet, intranet and household wired or wireless (via Wi-Fi, Bluetooth®, optical or infrared) communication networks.
  • FIG. 6 is a perspective block diagram illustrating few additional functionalities of the tier 3 user and manufacturer cloud based systems and services, and artificial intelligence analyst module, of FIG. 1 .
  • the tier 3 user and manufacturer cloud based systems and services 611 and artificial intelligence analyst module 613 support a plurality of user devices, gain experience from interactions with them (by collecting data about their preferences, choices and reactions under a variety of contexts) and apply that experience to new and emerging situations. That is, when the contexts are similar, the tier 3 user and manufacturer cloud based systems and services 611, and in particular the artificial intelligence analyst module 613, attempt to figure out the response to a given input and then apply the best possible response.
  • the tier 3 user and manufacturer cloud based systems and services 611 and artificial intelligence analyst module 613 learn that when friends gather in the evening, there is a likelihood of a party happening. Because this situation deserves a Day-In-Life recording, the user device 661, 663 or 665, assisted by the tier 3 user and manufacturer cloud based systems and services 611 and artificial intelligence analyst module 613, takes the decision autonomously (with the support of a decision support server 671 in cases of emergency).
  • the most important function of the decision support server 671 is to assist the user devices 661, 663 and 665 in functioning autonomously by taking swift decisions. Its functionality is the same as that of the autonomous decision support module 213 (of FIG. 2); in this embodiment, however, it is implemented as a separate server.
  • the decision support server 671 supports autonomous decisions of the user device 661, 663 or 665 (supported by the tier 3 user and manufacturer cloud based systems and services 611 and artificial intelligence analyst module 613) in emergencies.
  • the user device 661 , 663 or 665 responds rapidly, by switching on all the lights and alarms (for example), thus waking up the users and family members.
  • communication networks 691, which include the internet, intranet and household wired or wireless (via Wi-Fi, Bluetooth®, optical or infrared) communication networks.
  • FIG. 7 is a perspective block diagram illustrating functionalities of an exemplary wardrobe management role of the mobile units of FIG. 1 .
  • the mobile unit (a vacuum cleaner or a custom-built unit) 727 or 715 has a built-in drone 723 or 721 mounted on it.
  • the drone 723 or 721 parks on the top of the vacuum cleaner 727 or 715 and charges itself.
  • the drone 723 or 721 has a plurality of functionalities, such as taking stock of clothes in the wardrobe 719 , suggesting clothes for different types of occasions and interfacing with online retailers for new clothes or shoes purchasing.
  • the vacuum cleaner 727 or 715 too has a plurality of functionalities, that include clothes sizing, inner garments sizing, socks and shoes sizing, weight measurements and predicting weight gain or loss, height and body volume and BMI (Body Mass Index) measurements, among many other wardrobe 719 related functionalities.
  • the user (or a family member or guest) 717 may come out of the shower and say "Bot, measure me." Then, the vacuum cleaner 715 circles around the user 717 and, via infrared light reflections, for example, measures the body volume. Then, the vacuum cleaner 715 may reply "Your body volume is up by 10% since last month, mostly at your waist line and thighs. Your waist line volume is up by 26% . . . . Would you like your clothes sizing information updated?" Then, the vacuum cleaner updates the clothes and undergarments sizing and, if the user 717 requests, purchases clothes and undergarments for the user 717. Simultaneously, with the updated information sent to the drone 721, the drone 721 checks for any clothes that fit the user 717 and suggests them to the user 717.
  • the vacuum cleaner 715 responds by saying “Your periodic measurements are being taken, could you please stand still?” and then measures the height, weight and BMI.
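The BMI and body volume bookkeeping mentioned above involves only simple arithmetic, sketched here; the sample weight, height and volume figures are made up for illustration.

```python
def bmi(weight_kg, height_m):
    """Body Mass Index = weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def volume_change_pct(previous, current):
    """Percentage change in measured body volume since the last
    measurement (e.g. last month)."""
    return (current - previous) / previous * 100

print(round(bmi(81.0, 1.80), 1))             # 25.0
print(round(volume_change_pct(70.0, 77.0)))  # 10
```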
  • FIG. 8 is a flowchart illustrating the processes involved in the autonomous functioning and decision making of the mobile units.
  • the processes begin at a block 807, when the user first puts a brand new mobile unit to work in residential or business premises.
  • the mobile unit begins to follow the user and observe the user's behavior. That is, the mobile unit records the user throughout the day, while he or she is present in the residential or business premises, and stores the recordings in the cloud.
  • the mobile unit analyzes the user's behavioral pattern by identifying routines in the user's everyday behavior. For the mobile unit, these patterns in behavior are essential in sketching out a plan for its autonomous functioning and decision making.
  • the mobile unit analyzes the scheduled activity. This process involves figuring out preexisting (or even manufacturer set) scheduled activity of the mobile unit, either stored locally or existing in the cloud.
  • weather forecasting is factored into the process of scheduling the activity. This is important because the execution of some of the scheduled activity is entirely dependent on weather conditions.
  • the mobile unit develops a schedule of autonomous operation for itself. Developing such a fully finalized autonomous activity schedule may take up to a few weeks. For example, in a simplified logic, the user may wake up in the morning, finish his or her routine bathroom tasks and switch on the television at about 7 AM. Then, he or she might have breakfast and, in the meanwhile, vacuum the house. So, the morning scheduled activity for the mobile unit would be to switch on the television at 7 AM, and vacuum the house after that.
  • the mobile unit begins to follow its fully developed scheduled activities and makes decisions based on this logic where needed (for example, to switch on the television only when the user is at the premises).
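The condition-gated execution of the developed schedule (for example, switching on the television only when the user is at the premises) can be sketched as follows; the task names and condition predicates are illustrative assumptions, not the disclosed implementation.

```python
def run_scheduled_activities(schedule, hour, context):
    """Execute only the scheduled activities whose conditions hold
    right now, given the current hour and premises context."""
    performed = []
    for task_hour, activity, condition in schedule:
        if task_hour == hour and condition(context):
            performed.append(activity)
    return performed

# Hypothetical morning schedule learned from the user's routine
schedule = [
    (7, "switch_on_tv", lambda ctx: ctx["user_present"]),
    (8, "vacuum_house", lambda ctx: not ctx["guests_present"]),
]
print(run_scheduled_activities(schedule, 7, {"user_present": True,
                                             "guests_present": False}))
# ['switch_on_tv']
```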
  • the process ends at the block 821 .
  • FIG. 9 is a flowchart illustrating the processes involved in the Day-In-Life functionality of the mobile units.
  • the processes begin at a start block 907 .
  • the mobile unit fetches behavioral analysis and activity schedule data from the processes of autonomous functioning and decision making (described with reference to FIG. 8).
  • the mobile unit develops a schedule for predictable Day-In-Life events.
  • the mobile unit predicts the behavioral patterns of the user.
  • the mobile unit attempts to identify present behaviors and their contexts, at a next block 915 .
  • the mobile unit decides that the event taking place is important for the user and offers to record video clips/audio clips and/or take pictures, at a next block 917 .
  • the mobile unit also records salient weather features of the day as well, at a next block 919 .
  • the mobile unit predicts such events from the behavioral analysis and activity schedule data and, occasionally or on days of significant news (detected based on social media activity, and keyword and context based searches, for example), offers to record an audio/video clip or take pictures (along with the news clip).
  • when the mobile unit identifies that the user and family members are excited on a Saturday evening, and identifies friends and relatives arriving, it decides that the ongoing event is important for the user and starts recording audio/video and taking pictures (along with weather conditions).
  • the mobile unit allows the user to edit the recordings before uploading the audio/video, image, news clip and weather condition clip contents to the cloud.
  • The terms "operably coupled" and "communicatively coupled," as may be used herein, include direct coupling and indirect coupling via another component, element, circuit, or module where, for indirect coupling, the intervening component, element, circuit, or module does not modify the information of a signal but may adjust its current level, voltage level, and/or power level.
  • The term "inferred coupling" (i.e., where one element is coupled to another element by inference) includes direct and indirect coupling between two elements in the same manner as "operably coupled" and "communicatively coupled."
  • While the present invention has been described in terms of GPS coordinate and navigational information communication involving mobile phones and computers, it should be clear that the present invention also applies to other types of devices, including mobile devices, laptops with a browser, hand held devices such as PDAs, televisions, set-top-boxes, home media centers, robots, robotic devices, vehicles capable of navigation, and computers communicatively coupled to the network.

Abstract

A network of multitier user's devices (mobile units) that perform household chores and function as mobile general purpose autonomous intelligent machines, each having some primary functions and a plurality of additional secondary functions. The primary functions include one or more of vacuum cleaning, lawn mowing and typical drone functionalities; the additional autonomous functions include wardrobe management, premises safety, sentry duty, rendering personalized music, self-recharging from a typical electric outlet, self-learning to predict the users' behaviors and planning and scheduling the functionalities (decision making) accordingly, providing AI (Artificial Intelligence) assistance and support, Day-In-Life recording and support, and so forth. The Day-In-Life recording functionalities involve predicting users' behaviors, identifying routines, salient features and important events ahead of time, and electronically recording and storing the routine events, salient features and important events, with the support of a cloud based central support and services server system.

Description

    CROSS REFERENCES TO RELATED APPLICATIONS
  • The present U.S. Utility Patent Application claims priority pursuant to 35 U.S.C. §119(e) to U.S. Provisional Application No. 62/350,187, entitled “Modular Mobile Units,” filed Jun. 15, 2016, which is hereby incorporated herein by reference in its entirety and made part of the present U.S. Utility Patent Application for all purposes.
  • BACKGROUND 1. Technical Field
  • The present invention relates generally to artificial intelligence based assistant devices and, more specifically, to devices that perform household chores and function as mobile general purpose autonomous intelligent machines.
  • 2. Related Art
  • Many household and business-related everyday mobile, functional devices that typically perform a single household chore are widely in use today. For example, vacuum cleaners that just function as a vacuum cleaner, lawnmowers that just accomplish lawn mowing (even though they might be robotic, in some sense), and drones that just photograph and video record aerial views of the house or business premises (some of the drones may have some alternative functionalities and/or additional functionalities though) and so forth.
  • All the above-mentioned devices are mobile and possess wheels or flying rotor blades, accordingly. Many of these devices (vacuum cleaners, lawn mowers, drones and such) are autonomous to a certain degree, but require constant attention (for fueling or recharging them, for example). These devices exhibit limited human interaction capabilities (that is, they possess some form of user interface) and limited personalization and customization ability.
  • The vacuum cleaners, lawnmowers and drones available today do possess a certain Internet based updating capability. This makes it possible for them to update themselves with the latest available firmware from their manufacturer's server, though with human assistance in most cases.
  • Further limitations and disadvantages of conventional and traditional approaches will become apparent to one of ordinary skill in the art through comparison of such systems with the present invention.
  • BRIEF SUMMARY OF THE INVENTION
  • The present invention is directed to apparatus and methods of operation that are further described in the following Brief Description of the Drawings, the Detailed Description of the Invention, and the claims. Other features and advantages of the present invention will become apparent from the following detailed description of the invention made with reference to the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a perspective block diagram of a mobile unit infrastructure that performs a plurality of additional household chores and functions as a mobile general purpose autonomous intelligent machine;
  • FIG. 2 is a perspective block diagram detailing the artificial intelligence analyst (decision making) of FIG. 1;
  • FIG. 3 is a perspective block diagram detailing the user behavior pattern module of FIG. 1;
  • FIG. 4 is a perspective block diagram detailing the Day-In-Life recorder module of FIG. 1;
  • FIG. 5 is a perspective block diagram detailing the tier 0-2 user's device of FIG. 1;
  • FIG. 6 is a perspective block diagram illustrating few additional functionalities of the tier 3 user and manufacturer cloud based systems and services, and artificial intelligence analyst module, of FIG. 1;
  • FIG. 7 is a perspective block diagram illustrating functionalities of an exemplary wardrobe management role of the mobile units of FIG. 1;
  • FIG. 8 is a flowchart illustrating the processes involved in the autonomous functioning and decision making of the mobile units; and
  • FIG. 9 is a flowchart illustrating the processes involved in the Day-In-Life functionality of the mobile units.
  • DETAILED DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a perspective block diagram of a mobile unit infrastructure 105 that performs a plurality of additional household chores and functions as a mobile general purpose autonomous intelligent machine. Whereas the primary functions may include one or more of vacuum cleaning, lawn mowing and typical drone functionalities, the additional autonomous functions include wardrobe management, premises safety, sentry duty, rendering personalized music, self-recharging from a typical electric outlet, self-learning to predict the behaviors of the user, the other family members and guests, and planning and scheduling the functionalities (decision making) accordingly, providing AI (Artificial Intelligence) assistance and support, Day-In-Life recording and support, and so forth.
  • As mentioned above, the mobile units 183, 185, 187 and 189 support wardrobe management functionalities. These functionalities essentially involve clothes sizing, inner garments sizing, socks and shoes sizing, weight measurements and predicting weight gain or loss, height, body volume and BMI (Body Mass Index) measurements, taking stock of clothes in the wardrobe, suggesting clothes for different types of occasions and interfacing with online retailers for new clothes or shoes purchasing (the mobile unit 183, 185, 187 or 189 provides the interior dimensions of the clothes or shoes and the user filters the selections), among many other wardrobe related functionalities. In addition, database records of all the measurements taken are stored in a tier 3 user and manufacturer cloud based systems and services 111. To perform these functions, the tier 3 user and manufacturer cloud based systems and services 111 contains wardrobe management module 145. Similarly, for the wardrobe management functionalities, the mobile units 183, 185, 187 and 189 contain user interface module (audio/visual and keyboard interfaces) 193, artificial intelligence module 195 and user personalization module 197.
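By way of illustration only, the BMI measurement step above can be sketched as a minimal computation; the function names and WHO-style category bands here are illustrative assumptions, not elements of the disclosed system.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight (kg) divided by height (m) squared."""
    return weight_kg / (height_m ** 2)

def bmi_category(value: float) -> str:
    """Standard BMI bands, used here to annotate the wardrobe record."""
    if value < 18.5:
        return "underweight"
    if value < 25.0:
        return "normal"
    if value < 30.0:
        return "overweight"
    return "obese"

# Example: a 1.75 m user weighing 70 kg.
print(round(bmi(70, 1.75), 1))        # 22.9
print(bmi_category(bmi(70, 1.75)))    # normal
```

A unit could log such values periodically alongside the sizing record stored in the cloud services.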
  • For example, the user (or a family member or guest) may come out of the shower and say "Bot, measure me." Then, the mobile unit 183, 185, 187 or 189 circles around the user and says "Your body volume is up by 22% since last month, mostly at your waist line . . . . Would you like clothes sizing information updated?" Alternatively, the user may have instructed the mobile unit 183, 185, 187 or 189 to measure height, weight and BMI (Body Mass Index) periodically, once a week or a month, for example. The mobile unit 183, 185, 187 or 189 may say "Your periodic measurements are being taken, could you please stand in the center of the carpet?" and then start measuring all the above-mentioned sizes, weight and other dimensions.
  • With a strong interface to the online retailer's systems 160, the mobile unit 183, 185, 187 or 189 provides accurate sizes, weights and other dimensions to the online retailer's systems 160, which makes returns far less likely. Orders via the online retailer's systems 160 also take into consideration fluctuations in sizes, weight and other dimensions, and anticipate the user putting on weight, losing weight or yo-yoing between gaining and losing weight, based on the fitness records, for example.
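The anticipation of weight gain or loss described above can be sketched, under the assumption of a simple linear trend over monthly records, as a least-squares extrapolation; the function name and the one-month-ahead horizon are illustrative assumptions.

```python
def predict_next_weight(weights):
    """Fit a least-squares line to equally spaced weight records (kg)
    and extrapolate one period ahead, so orders can anticipate the
    user gaining or losing weight."""
    n = len(weights)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(weights) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, weights))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var if var else 0.0
    return mean_y + slope * (n - mean_x)  # value at x = n (next period)

# A steady gain of 0.5 kg/month extrapolates to 72.0 kg next month.
print(predict_next_weight([70.0, 70.5, 71.0, 71.5]))  # 72.0
```

A retailer interface could then pick the size matching the predicted, rather than the current, measurement.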
  • In addition, the mobile unit 183, 185, 187 or 189 also observes the contents of the wardrobe based on what the user wears, figures out the wear and tear on the items (for example, out-of-style materials, worn holes in socks or undergarments, worn knees, rips or missing buttons) and suggests updates or offers items with new colors that coordinate. The mobile unit 183, 185, 187 or 189, in conjunction with the online retailer's systems 160 for example, also recommends a wardrobe makeover or update. For example, the user may put on a shirt from the wardrobe and the mobile unit 183, 185, 187 or 189 may say "Do you like that dirty shirt?" The user may answer, "It is my favorite, Bot." The mobile unit 183, 185, 187 or 189 then responds by saying "Ok then, can I offer you some sweater options to hide it from the public?"
  • The mobile units 183, 185, 187 and 189 also support premises safety functionalities. These functionalities include smoke detection, fire inspection, alarm triggering, making sure that the smoke and fire have been brought to everyone's attention, extinguishing the fire, alerting the user about unusual sounds (that may involve destruction of property) or odors, and so forth. Besides smoke detection, the mobile unit 183, 185, 187 or 189 also identifies the source of the fire and figures out the extent of the damage. Furthermore, the mobile unit 183, 185, 187 or 189 also informs the user, family members and guests about the smoke, fire or odor, its source and the extent of damage. The mobile unit 183, 185, 187 or 189 informs the user (or a predetermined responsible person) about the smoke, fire or odor even when there is nobody at home, via a communication network 191, through mobile (cell) phones (via SMS, for example) or computers (via email, for example). To support these functionalities, the tier 3 user and manufacturer cloud based systems and services 111 contains safety management module 129.
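The alert routing described above (local alarm when occupants are present, SMS and email otherwise) can be sketched as follows; the function name and message formats are illustrative assumptions, and real delivery would go through the communication network 191.

```python
def dispatch_alert(event, occupants_home, contact):
    """Route a safety alert: sound the local alarm when someone is home,
    otherwise fall back to SMS and email to a predetermined contact."""
    actions = []
    if occupants_home:
        actions.append(f"sound alarm: {event}")
    else:
        actions.append(f"SMS to {contact}: {event}")
        actions.append(f"email to {contact}: {event}")
    return actions

print(dispatch_alert("smoke detected in kitchen", False, "owner"))
```

The returned action list stands in for the actual siren, SMS gateway and mail calls a deployed unit would make.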
  • Another additional autonomous function that the mobile unit 183, 185, 187 or 189 performs is rendering personalized music to the user, family member(s) or guest(s). This functionality involves observing and analyzing the behavior of the user, family member(s) or guest(s) as pertaining to their likes and dislikes (studied over a prolonged period), identifying their present moods, and the current contexts and circumstances. For example, in the past, the user might have said "I love this song," while coming back from work. Alternatively, during a stressful circumstance, he or she might have said "Bot, please play another song, I don't like this kind of songs when stressed out . . . " The mobile unit 183, 185, 187 or 189 identifies the current mood and contexts and renders personalized music accordingly. Over a period, the mobile unit 183, 185, 187 or 189 makes fewer and fewer mistakes. To support the personalized music rendering functionalities, the tier 3 user and manufacturer cloud based systems and services 111 contains user behavior pattern module 139 and behavior observation module 147.
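One minimal sketch of how feedback could make such music selection "less and less mistaken" is a per-(mood, context) score per song; the class and method names are illustrative assumptions, not part of the disclosed modules.

```python
class MusicSelector:
    """Keeps a score per (mood, context, song); positive feedback raises
    a song's score and negative feedback lowers it, so the unit's picks
    improve with observation over time."""
    def __init__(self):
        self.scores = {}  # (mood, context, song) -> score

    def pick(self, mood, context, songs):
        return max(songs, key=lambda s: self.scores.get((mood, context, s), 0))

    def feedback(self, mood, context, song, liked):
        key = (mood, context, song)
        self.scores[key] = self.scores.get(key, 0) + (1 if liked else -1)

sel = MusicSelector()
sel.feedback("stressed", "after work", "upbeat pop", False)
sel.feedback("stressed", "after work", "calm jazz", True)
print(sel.pick("stressed", "after work", ["upbeat pop", "calm jazz"]))  # calm jazz
```

A deployed system would infer mood and context from the behavior observation modules rather than take them as strings.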
  • Beyond the primary functions for which they were originally meant, the mobile unit 183, 185, 187 or 189 also solves many everyday problems, serving for example as a wandering WiFi hub, a light-switching assistant, a mobile place to integrate preexisting artificial intelligence engines, working with RoboApps via a Robo SDK (Software Development Kit), integrating with presently available cloud artificial intelligence tools, and so forth. To perform as a wandering WiFi hub, the mobile unit 183, 185, 187 or 189 first verifies the WiFi signal strength near the user (who is working on a device that needs an Internet connection to function, for example) and, if the signal strength is too low, it amplifies the signal and rebroadcasts it within the vicinity of the user. Similarly, when the light is low, the mobile unit 183, 185, 187 or 189 offers to switch on the light, thereby saving electricity. For example, the mobile unit 183, 185, 187 or 189 may follow the user, while the user is trying to make out a noise coming from outside the home, and switch on a built-in flashlight or switch on the lights along that path. Moreover, the mobile unit 183, 185, 187 or 189 can position itself exactly to have a look at things, assuming exact locations and angles/configurations.
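The wandering WiFi hub decision above can be sketched as a simple threshold test on received signal strength; the function name and the -70 dBm threshold are illustrative assumptions (a real unit would read RSSI from its radio hardware).

```python
def wifi_action(rssi_dbm, threshold_dbm=-70):
    """Decide whether the unit should reposition and rebroadcast:
    an RSSI below the threshold means the user's device has weak
    coverage; values closer to 0 dBm are stronger."""
    if rssi_dbm < threshold_dbm:
        return "move near user, amplify and rebroadcast"
    return "no action needed"

print(wifi_action(-82))  # move near user, amplify and rebroadcast
print(wifi_action(-55))  # no action needed
```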
  • One more application (controlled via a cell phone app) that the mobile unit 183, 185, 187 or 189 performs is that of a sentry job. Some of the mobile units 183, 185, 187 and 189 are designed to function indoors, some others outdoors, and many others both indoors and outdoors. When they are functioning within a household or business premises, if there is more than one mobile unit 183, 185, 187 or 189, they network together to share responsibilities. Thus, the sentry job involves both indoor and outdoor functionalities and these devices network together to perform a plurality of functionalities. For example, a vacuum cleaner 185 does the sentry job inside a house during nights, to keep a vigil on the house, while a lawn mower 189 investigates sounds or flashes of light outside the house and responds appropriately. The lawn mower 189 patrols around the house periodically, in an unpredictable manner (to catch intruders off-guard), and communicates with the vacuum cleaner 185 about the intruders. The network of vacuum cleaner 185, lawn mower 189 and drone 187 together identify the source of the intrusion and inform the user, family members and guests about the intrusion and any damage done, during nights, for example, by waking them up. The mobile units 183, 185, 187 and 189 also assemble topography details and share them between themselves, to determine the path to follow within the house or building area, garden area and backyard, to perform a specific functionality such as the sentry job.
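The unpredictable patrol timing above can be sketched by drawing patrol start times uniformly at random over the night window; the function name, the window encoding (hours past midnight continuing beyond 24 to keep the window contiguous) and the seed parameter are illustrative assumptions.

```python
import random

def patrol_times(start_hour, end_hour, rounds, seed=None):
    """Draw patrol start times uniformly at random over a night window,
    so the schedule is unpredictable to intruders. Hours beyond 24
    wrap around past midnight."""
    rng = random.Random(seed)
    times = sorted(rng.uniform(start_hour, end_hour) for _ in range(rounds))
    return [f"{int(t) % 24:02d}:{int((t % 1) * 60):02d}" for t in times]

# Four unpredictable rounds between 22:00 and 06:00 (encoded as 22..30).
print(patrol_times(22, 30, 4, seed=1))
```

Each unit could draw a fresh schedule nightly and share it with its networked peers so coverage gaps are avoided.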
  • As examples of still further functionalities of the mobile units 183, 185, 187 and 189, the artificial intelligence module 195 is adaptable to follow worn paths, roadways, sidewalks, furrows and ground cover transition edges, and can be fully automated, semi-automated or fully driven remotely (Internet interaction). Similarly, while in motion, the mobile units 183, 185, 187 and 189 can select the nearest charger to recharge themselves. Alternatively, via vision systems and artificial intelligence, instead of using charging stations, some of the mobile units 183, 185, 187 and 189 simply identify the nearest electric outlet and plug their charging cables into it. This is done by a flexible telescopic system that raises itself from the ground toward the electric outlet and plugs in. Alternatively, the same goal can be accomplished via a vertical rail system, along which the plug rises toward the electric socket and plugs itself in.
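The nearest-charger selection above reduces, in sketch form, to picking the known charging point with the smallest straight-line distance from the unit's current position; the function name and the flat 2-D coordinate model are illustrative assumptions (a real unit would then hand the chosen goal to its path planner).

```python
import math

def nearest_outlet(position, outlets):
    """Pick the closest known charging point by Euclidean distance."""
    return min(outlets, key=lambda o: math.dist(position, o))

outlets = [(0.0, 5.0), (3.0, 1.0), (8.0, 2.0)]
print(nearest_outlet((2.0, 2.0), outlets))  # (3.0, 1.0)
```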
  • The mobile units 183, 185, 187 and 189 can predict and forecast the results of many of the actions to be taken. This allows them to compare the present reality with a possible reality (if certain actions are taken), via their VR (Virtual Reality) capabilities. This can be done for various actions to be taken, for example, before and after views for any repairs, furniture layout, lawn repairs, roof repairs, laying out carpets and so forth. This allows the user to experiment correctly and confidently with new possibilities for his or her home, garden and premises. For example, the mobile unit 183, 185, 187 or 189 (the lawn mower 189, for example), via a VR headset, demonstrates the look and feel of the lawn and garden with many new layouts, showing a variety of possibilities to try. This saves the user money and allows him or her to move ahead confidently with new arrangements for the garden, with less likelihood of disappointment with the money spent on it.
  • Another important additional autonomous functionality of the mobile units 183, 185, 187 and 189 involves their capability to self-learn, by observing, identifying and analyzing the behaviors of the user, family members and guests. The mobile unit 183, 185, 187 or 189 collects data by observing the patterns in the behaviors of the user, family members and guests, by following them closely during the initial stages. For example, the vacuum cleaner 185 may begin to follow the user at all times after purchase and collect data about what he or she does. It observes that as soon as the user wakes up, he or she heads straight to the bathroom to brush, then takes a shower, has breakfast and leaves home for work. This also involves behavior analysis, by sending this information to the tier 3 user and manufacturer cloud based system and services 111 and comparing it with the stored database information to take appropriate and relevant actions. This type of data, collected over a prolonged period, allows the vacuum cleaner 185 to assist the user in a variety of ways. Moreover, such collected data, in aggregation from many users, also allows the tier 3 user and manufacturer cloud based system and services 111 to predict the behaviors of users in general.
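The pattern learning described above (wake up, bathroom, shower, breakfast, leave for work) can be sketched as a first-order transition model over observed daily sequences; the class and method names are illustrative assumptions, not part of the disclosed modules.

```python
from collections import Counter, defaultdict

class BehaviorModel:
    """Learns which activity usually follows which, from observed daily
    sequences, and predicts the most likely next activity."""
    def __init__(self):
        self.transitions = defaultdict(Counter)

    def observe(self, sequence):
        for a, b in zip(sequence, sequence[1:]):
            self.transitions[a][b] += 1

    def predict_next(self, activity):
        counts = self.transitions[activity]
        return counts.most_common(1)[0][0] if counts else None

model = BehaviorModel()
model.observe(["wake up", "bathroom", "shower", "breakfast", "leave for work"])
model.observe(["wake up", "bathroom", "shower", "breakfast", "leave for work"])
model.observe(["wake up", "coffee", "shower", "breakfast", "leave for work"])
print(model.predict_next("wake up"))  # bathroom
```

Aggregating such transition counts across many users, as the passage notes, would let the cloud services predict behaviors of users in general.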
  • Yet another important additional autonomous functionality of the mobile units 183, 185, 187 and 189 involves Day-In-Life recording. The mobile units 183, 185, 187 and 189 communicate with each other to determine the salient features (as well as mundane features) of the day's events of the user, family members and guests and offer to store the recorded audio/video clips of them. The vacuum cleaner 185 and drone 187, for example, record events of a party held at home by the user, family members and guests, and store them in the tier 3 user and manufacturer cloud based system and services 111, with the user's permission. The mobile units 183, 185, 187 and 189 also offer to store weather conditions, many other (news) happenings and so forth.
  • To perform all the abovementioned functionalities, which include both the primary and secondary autonomous functionalities, the tier 0-2 user's device 181 contains user interface module 193, artificial intelligence module 195, user personalization module 197 and premises safety module 199. Similarly, the tier 3 user and manufacturer cloud based system and services 111 contains operational control support 113, user/app command interface 115, voice recognition and synthesis 117, WiFi/Bluetooth control module, repair support module 121, Day-In-Life recorder module 143, artificial intelligence engines 123, sensor processing support 125, voice, security, recognition 127, safety management module 129, prediction/forecasting module 131, wardrobe management module 145, sentry support module 133, user personalization module 135, music rendering module 137, user behavior pattern module 139, artificial intelligence analyst (decision making) 141 and behavior observation module 147. Finally, the communication between the tier 0-2 user's device 181, tier 3 user and manufacturer cloud based systems and services 111 and online retailer's systems 160 occurs via communication networks 191, which include internet, intranet and household wired or wireless (via Wi-Fi, Bluetooth®, optical or infrared) communication networks.
  • FIG. 2 is a perspective block diagram detailing the artificial intelligence analyst (decision making) of FIG. 1. The tier 0-2 user's devices 281 (mobile units) are smart machines; they can learn from their environment and make decisions autonomously. Over a period after activation (that is, from the day they are put to work), they begin to improve and perfect their responses to environmental inputs, which include specifically the user's inputs, but also those of the family members and guests. The mobile units 283, 285, 287 and 289 learn, with functional support from artificial intelligence analyst 211, by collecting data on the conditions in which they operate, and then identifying patterns in these environmental inputs. Further, they plan and schedule the repetition of such operations in the future, based on the patterns detected.
  • In addition, the artificial intelligence based operations of the mobile units 283, 285, 287 and 289 (with support from the artificial intelligence analyst 211) in a user's premises stretch from automating mundane tasks (vacuuming or lawn mowing, for example) to predicting user, family member or guest behaviors and suggesting different possible actions to them. For example, the vacuum cleaner 285 follows the user as soon as he or she returns from work (in the beginning of its life as a mobile unit) and begins to ask questions, such as "What kind of music do you like as soon as you return from work?" and "What kind of drinks would you like to be prepared now?" and, in conjunction with the artificial intelligence analyst 211, analyzes the answers to figure out the best possible actions to take after the user returns from work. In addition to questioning, the vacuum cleaner 285 also follows the user to figure out the house layout and where all things are placed. Then it begins to offer many possibilities and lets the user choose what they want right then.
  • Artificial intelligence based operations of the mobile units 283, 285, 287 and 289 (with support from the artificial intelligence analyst 211) supplement the user's own regular scheduling and planning. The user receives images and graphics showing weather patterns, lawn mowing and repairs to be conducted to mitigate weather impact (including what-if scenarios for repairs), and rescheduling of outdoor sprinkler and lawnmowing operations. For example, lowering the grass height to be trimmed by lawnmowing while simultaneously increasing sprinkler watering time, upon determining that the next week is anticipated to be hotter and drier than usual.
  • The artificial intelligence analyst 211 employs proprietary algorithms that weigh hundreds of variables to produce probabilistic forecasts of weather impact on a given user's or residence's lawnmowing schedule, repair schedule, home package delivery schedule, vacuum cleaning schedule and so forth. To perform this functionality, the artificial intelligence analyst 211 contains probabilistic forecast module 219. The artificial intelligence analyst 211 then recommends ways to lessen disruptions or save money and time if rescheduling is inevitable. The artificial intelligence analyst 211 presents different possible outcomes of various decisions, and it ranks possible scenarios associated with anticipated weather patterns. If the user's inputs are not available at that time, the artificial intelligence analyst 211 autonomously makes a decision and responds to the calamity by mitigating its impact. Part of the artificial intelligence analyst 211 is stand-alone software, such as the decision support software in the tier 0-2 user's devices 281. The other part is embedded within larger software systems in the artificial intelligence analyst (decision making) 211.
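A minimal sketch of acting on such a probabilistic forecast, using the hotter-and-drier-week example above: trim less grass and water longer when the forecast probability crosses a threshold. The function name, the 0.6 threshold and the adjustment amounts are illustrative assumptions.

```python
def adjust_schedule(p_hot_dry, grass_mm=50, water_min=20):
    """When the probabilistic forecast says the coming week is likely
    hotter and drier than usual, raise the mowing height (trim less,
    leaving the grass taller) and extend sprinkler run time."""
    if p_hot_dry > 0.6:
        grass_mm += 15      # leave the grass taller to shade the soil
        water_min += 10     # extend sprinkler watering time
    return grass_mm, water_min

print(adjust_schedule(0.8))   # (65, 30)
print(adjust_schedule(0.3))   # (50, 20)
```

A production analyst would weigh many more variables, as the passage states; this shows only the shape of the decision.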
  • The autonomous decision support module 213 provides support to the tier 0-2 user's devices 281 (such as telescopic image bot 283, vacuum cleaner 285, indoor/outdoor drone 287 and lawnmower 289) as and when needed; when offline, the tier 0-2 user's devices 281 perform these functions all by themselves. It interacts with most of the modules of the artificial intelligence analyst 211 as well as the tier 0-2 user's devices 281 to provide autonomous response capabilities to the tier 0-2 user's devices 281. For example, the autonomous decision support module 213 assists the tier 0-2 user's devices 281 in constructing a schedule of autonomous operation for themselves.
  • The artificial intelligence analyst 211 conducts data collection and management for behavioral tracking, prediction and recommendation. To perform this, the artificial intelligence analyst 211 contains user behavior prediction module 215 and guest user behavior prediction module 217. The robotic movement and decision making module 221 assists the household or business premises mobile units 283, 285, 287 and 289 in networking together by sharing necessary information, assists in making decisions (for all cooperating smart tier 0-2 user's devices 281), and assists in robotic movements as well. The situational and threat analysis module 223 provides the mobile units 283, 285, 287 and 289 with the capability to analyze contexts (situation based) and respond accordingly, and to analyze threats to the people living in the premises (that is, the user, family members and guests) and pets. The natural language processing module 225 provides the capability of processing incoming spoken language and responding accordingly (user interaction capabilities).
  • The medical diagnostics (and symptoms) module 233 supports the capability of the tier 0-2 user's devices 281 to diagnose physical and mental illnesses based on symptoms (though not as a substitute for a licensed practitioner's diagnosis). For example, symptoms of heart disease (Coronary Artery Disease (CAD) or Ischemic Heart Disease, for example, the most common disease in the world) can be identified and diagnosed by the tier 0-2 user's devices 281, with the assistance of the medical diagnostics (and symptoms) module 233. The most common heart disease symptoms include chest pain (angina, generally triggered by physical or emotional stress), shortness of breath (which occurs when the heart cannot pump enough blood to meet the body's needs), extreme fatigue with exertion, and heart attack. These symptoms are observed by the tier 0-2 user's devices 281 and sent to the medical diagnostics (and symptoms) module 233 for further analysis. In addition, a medical patient treatment module 235 manages the capability of the tier 0-2 user's devices 281 to provide patients living in the premises (who have already been diagnosed with a disease by a medical practitioner) with treatment and therapy. For example, the tier 0-2 user's devices 281 have the capability of providing therapeutic assistance for the user's or a family member's diabetes by attempting to restore carbohydrate metabolism to a normal state. (Patients with diabetes tend to have an absolute deficiency of insulin.) This goal is achieved by the tier 0-2 user's devices 281 by providing insulin replacement therapy (as prescribed by the therapist), which is given through injections or an insulin pump.
  • Interactive computer avatar module 231 provides chat bots, giving the tier 0-2 user's devices 281 the capability of different avatars talking with different residents and guests of the premises. Pattern recognition module 237 provides image, signal or sequence based pattern recognition capabilities (in a complex stream of data). Similarly, next sequence/games prediction module 239 makes predictions used in games and behavior prediction possible. Identity systems module 241 is utilized in identifying residents, handymen and others authorized to work at the premises, as well as intruders within the premises. Advanced decision simulation module 243 provides the tier 0-2 user's devices 281 with the capability to perform advanced decision simulation for entertainment purposes, such as anticipating the movies residents would want to watch, the music residents might prefer to listen to at a given time, and so forth.
  • Furthermore, anticipatory operations module 227 provides the tier 0-2 user's devices 281 with the capability to perform anticipatory operations for premises defense. Automated service & management 229 performs automated services of the tier 0-2 user's devices 281 and management of their service activities. Surveillance systems monitoring module 245 provides prediction and analysis capabilities to the tier 0-2 user's devices 281. Cognitive assistance module 247 reasons, learns, and accepts guidance to provide effective and personalized support to the users, family members and guests. Finally, believable and intelligent non-player characters (not shown) are also utilized to enhance the user's, family members' and guests' gaming experiences.
  • The communication between the systems occurs via communication networks 291, which include internet, intranet and household wired or wireless (via Wi-Fi, Bluetooth®, optical or infrared) communication networks.
  • FIG. 3 is a perspective block diagram detailing the user behavior pattern module of FIG. 1. The user behavior pattern module 311 consists of user behavior observation support module 313, which provides higher level behavioral observation processing support to the basic level observations made by behavior observation module 393 of the tier 0-2 user's devices 381. User behavior observation starts with audiovisual recording of the face and facial muscle movements, and determining that the face belongs to the user (owner) of the mobile unit 383, 385, 387 or 389. Furthermore, user behavior observation also involves video and audio recording of bodily movements. Similarly, the environmental context in which the user currently exists is also recorded. This information is kept stored, for further processing, in the server (tier 3 user and manufacturer cloud based system and services 111 of FIG. 1). The family member/guest behavior observation support module 319, in conjunction with behavior observation module 393 of the tier 0-2 user's devices 381, performs similar recording and processing with regard to each of the family members and guests (though with somewhat less storage and power consumption than for the user or owner).
  • The user behavior pattern module 311 also consists of user behavior pattern identification support module 315 and family member/guest behavior identification support module 321. The user behavior pattern identification support module 315 works in conjunction with behavior identity module 395 of the tier 0-2 user's devices 381. The most important function of the user behavior pattern identification support module 315 is to find patterns in the recorded audiovisual streams. For example, the incoming stream of audiovisual information (coming in live, from the sensors) is compared with the stored information to find patterns of behavior. The contexts (coming in live, from the sensors) are also compared with the stored information to find patterns, in an analogous manner. The family member/guest behavior identification support module 321 processes in a similar manner for the family members and guests (and works in conjunction with behavior identity module 395 of the tier 0-2 user's devices 381 as well). These patterns are stored in the tier 3 user and manufacturer cloud based system and services 111 of FIG. 1 for future use (which assists in autonomous functioning of the mobile units 383, 385, 387 and 389).
  • These identified patterns are further processed by the user behavior pattern analysis support module 317, together with the behavior analysis module 397 of the tier 0-2 user's devices 381. The family member/guest behavior analysis support module 323 also performs similar processing for the family members and guests (and works in conjunction with behavior identity module 395 of the tier 0-2 user's devices 381). These analyses are mostly done by the user behavior pattern analysis support module 317 or family member/guest behavior analysis support module 323, based mostly upon statistical methodologies. These processes of analysis assist in the generation of autonomous operations of the mobile units 383, 385, 387 and 389. The communication between the systems occurs via communication networks 391, which include internet, intranet and household wired or wireless (via Wi-Fi, Bluetooth®, optical or infrared) communication networks.
  • FIG. 4 is a perspective block diagram detailing the Day-In-Life recorder module of FIG. 1. Day-In-Life recording, as a concept, essentially involves keeping a chronicle of the salient features of each day of the user (and the user's family members). These include an audiovisual record of the user and family members, news happenings of the day and the user's and family members' reactions to them, weather conditions of the day, and the user's and family members' emotions throughout the day. All these records are kept connected to each other, and with the user's (and family members') permission. The user can edit the Day-In-Life records if they so wish. Furthermore, records need not necessarily be kept for every day; they can also be kept only for days on which special events are going to take place, only for prescheduled days (once a week, once a month or once a year, for example), only for prescheduled times during each day (only after returning from work, for a couple of hours, for example), and so forth.
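The scheduling options above (special-event days, prescheduled days, prescheduled daily windows) can be sketched as a single predicate; the function name and parameters are illustrative assumptions, not part of the disclosed modules.

```python
from datetime import datetime

def should_record(now, special_days=(), weekly_day=None, daily_window=None):
    """Apply the Day-In-Life schedule: record on special-event dates,
    on a prescheduled weekday (0=Monday .. 6=Sunday), or within a
    daily time window of hours, e.g. (18, 20) for after work."""
    if now.date() in special_days:
        return True
    if weekly_day is not None and now.weekday() == weekly_day:
        return True
    if daily_window and daily_window[0] <= now.hour < daily_window[1]:
        return True
    return False

# Record for a couple of hours after returning from work (18:00-20:00).
print(should_record(datetime(2017, 6, 15, 18, 30), daily_window=(18, 20)))  # True
print(should_record(datetime(2017, 6, 15, 9, 0), daily_window=(18, 20)))    # False
```

A unit would evaluate such a predicate before invoking the camera and video clip management modules, with all recording still subject to the user's permission.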
  • The Day-In-Life recorder module 411 consists of user behavior observation module 413, which assists in recording the user in their environment and storing the recordings. Specifically, user behavior observation involves audiovisual recording of the user's face and body, after determining to whom the face belongs (user, family member, guest or intruder), and recording the family members and guests and the environmental context in which the user, family members and guests exist. This information is kept stored, as a record of events, in the server (tier 3 user and manufacturer cloud based system and services 111 of FIG. 1). The purpose of the user behavior observation module 413 is just to identify the user, family members, guests and intruders (if any) and to identify the events that might be important (to be able to record the salient features of the day), not necessarily to take actual pictures or audiovisual clips of high quality.
  • The Day-In-Life recorder module 411 also consists of camera management support module 415 and video clip management support module 419. The user behavior observation module 413 identifies user-determined or self-determined moments (either periods at which the images/audiovisual clips are to be taken, or periods determined based upon information collected from various sources), and the camera management support module 415 and/or video clip management support module 419 instruct the units to take high quality images/audiovisual clips and store them in photo storage database 417 and/or video clip storage database 421. The special events identification module 431 identifies special moments in the user's and family members' lives and stores them for the use of the user behavior observation module 413, as mentioned above.
  • The calendar/diary management support module 433 maintains written records (based on inputs from the user, family members and guests) of their opinions, suggestions and brief explanations of the events, for example. Similarly, the weather conditions record module 439 stores recorded weather events outside the premises. The Day-In-Life search module 437 makes it possible to search through the photo storage database 417 and/or video clip storage database 421. Finally, the tier 0-2 user's devices 481 (such as mobile units 483, 485, 487 and 489) contain a camera photo/video clip record module 493, an event record scheduling module 495, a calendar/diary record module 497 and a weather conditions identification module, which provide the same processing features as the Day-In-Life recorder module 411 at very basic levels, so that the mobile unit 483, 485, 487 or 489 can continue to function in case of emergencies (Internet unavailability, for example).
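A search over the stored records could operate on tags attached to each record, as in the following minimal sketch. The tag vocabulary, record identifiers and matching rule (require every query term) are hypothetical illustrations, not taken from the disclosure:

```python
# Hypothetical sketch of the Day-In-Life search module 437: each stored
# record carries tags (people present, event type, weather), and a query
# returns the ids of records whose tags contain every search term.

def search_records(records, terms):
    terms = {t.lower() for t in terms}
    return [rid for rid, tags in records.items()
            if terms <= {t.lower() for t in tags}]

records = {
    "rec-001": {"user", "birthday", "sunny"},
    "rec-002": {"family", "dinner", "rainy"},
    "rec-003": {"user", "dinner", "sunny"},
}

print(search_records(records, ["user", "sunny"]))  # rec-001 and rec-003
```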
  • For example, upon a command such as “Bot, Cheese” or “Bot, Action,” the mobile unit 483, 485, 487 or 489 recognizes the command and, via the Day-In-Life recorder module 411 service, is directed to capture the video/image/audio.
  • Finally, the communication between the systems occurs via communication networks 491, which include Internet, intranet and household wired or wireless (Wi-Fi, Bluetooth®, optical or infrared) communication networks.
  • FIG. 5 is a perspective block diagram detailing the tier 0-2 user's device of FIG. 1. The functionalities available with the tier 0-2 user's device 581 are not much different from those available at the tier 3 user and manufacturer cloud based systems and services 511 (as described with reference to FIG. 1, FIG. 2, FIG. 3 and FIG. 4), but they are available at basic levels, so that they are resource-efficient and can function autonomously (even when an Internet connection is not available, for example). Typical tier 0-2 user's devices 581 include a telescopic image bot 583, a vacuum cleaner 585, an indoor/outdoor drone 587 and a lawnmower 589, among many other specific designs that serve a plurality of purposes. They have basic-level autonomous capabilities as well as advanced capabilities when they are connected to the tier 3 user and manufacturer cloud based systems and services 511 or other custom-built servers.
  • The artificial intelligence module 525 provides some basic AI functionalities to the mobile units 483, 485, 487 and 489, but also interacts with the tier 3 user and manufacturer cloud based systems and services 511 and works cooperatively with them to provide advanced AI functionalities. For example, the user or a family member might say, “Bot, play me some music . . . ” The vacuum cleaner 585 checks the mood of the user or family member, by taking a video clip of the face and sending it to the tier 3 user and manufacturer cloud based systems and services 511, and plays music that they like for such a mood. This decision accounts for the experience the vacuum cleaner 585 has gained by interacting with them over a period of time.
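The mood-to-music selection in the example above can be sketched as a lookup into the listener's per-mood history. This is a toy illustration under stated assumptions — the mood labels, track names and fallback playlist are hypothetical, and mood recognition itself is out of scope here:

```python
# Hypothetical sketch of mood-based selection: map a recognized mood to
# the track this listener has most often enjoyed in that mood, falling
# back to a neutral playlist when no history exists for the mood.

def pick_track(mood, listener_history, fallback="ambient-mix"):
    liked = listener_history.get(mood, [])
    # prefer the track most frequently enjoyed in this mood
    return max(liked, key=liked.count) if liked else fallback

history = {
    "relaxed": ["jazz-04", "jazz-04", "acoustic-11"],
    "energetic": ["rock-07"],
}

print(pick_track("relaxed", history))     # jazz-04
print(pick_track("melancholy", history))  # ambient-mix (no history)
```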
  • The user interface module 521 manages gesture, voice, visual, keyboard and remote control communications based interactions with the user, family members or anyone else. Similarly, the wireless communication (WiFi/BT) module 525 manages wireless communications, by identifying possibilities for wireless connections (via WiFi and Bluetooth, for example) and logging on to the Internet autonomously. The lighting control module 527 manages the lighting of the house and premises, while the music rendering module 531 handles the music delivery aspects of the tier 0-2 user's devices 581. The live camera support module 533 handles the built-in camera for live communications or live broadcast. The user personalization module 541 manages the user's, family members' and guests' personalization information, gathered either via a voice/video/screen-keyboard based questionnaire or via everyday interactions with them. This information is utilized in future interactions with the user, family members and guests.
  • A premises safety module 543 is also built into the vacuum cleaners 585, lawn mowers 589 and drones 587, which network together to perform safety related functionalities. They identify the source of an intrusion, for example, and inform the user, family members and guests, especially during the night. The tier 0-2 user's devices 581 wake them up and inform them of the intrusion and any damage done. They also provide emergency medical support to the users, family members and guests.
  • The GPS based device positioning module 545 assists in positioning the mobile unit 583, 585, 587 or 589 at a specific position and angle. This functionality is essential, for example, for taking pictures or audiovisual clips from the clearest and best possible angle in a party environment (with many guests), to get that perfect picture or video clip. Furthermore, the self-recharging support module 547 manages aspects related to the mobile units 583, 585, 587 and 589 recharging themselves. When the power level drops below a preset level, the mobile unit 583, 585, 587 or 589 identifies a charging station or electric outlet and plugs itself in to recharge, without human assistance.
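The self-recharging decision can be sketched as a threshold check followed by a nearest-station lookup. This is a minimal illustration assuming 2-D positions and a Euclidean distance; the station names, coordinates and preset level are hypothetical:

```python
# Hypothetical sketch of module 547's recharge decision: when the
# battery drops below the preset level, head to the nearest known
# charging station or electric outlet.

def nearest_station(position, stations):
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    return min(stations, key=lambda s: dist(position, s[1]))[0]

def recharge_target(battery_pct, preset_pct, position, stations):
    if battery_pct >= preset_pct:
        return None  # enough charge: keep working
    return nearest_station(position, stations)

stations = [("dock-kitchen", (1.0, 2.0)), ("outlet-hall", (6.0, 1.0))]
print(recharge_target(15, 20, (2.0, 2.0), stations))  # dock-kitchen
print(recharge_target(80, 20, (2.0, 2.0), stations))  # None
```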
  • The AR house and garden mapping module 549 manages mapping of the entire house or business premises and keeps the maps stored for future use. The AR (Augmented Reality) functionality requires that this map be available, for example, for the user to view remotely, or to visualize how the house would appear after repairs are done, carpet is changed or walls are painted. The lab-on-a-chip diagnostics module 551 handles a plurality of diagnosis related functions of the mobile units 583, 585, 587 and 589. The clothing and wardrobe support module 553 handles the networking of some of the mobile units (such as the vacuum cleaner 585 and drone 587) to perform the wardrobe related functionalities (as described with reference to FIG. 1).
  • The communication between the tier 3 user and manufacturer cloud based systems and services 511 and the mobile units 583, 585, 587 and 589 occurs via communication networks 591, which include Internet, intranet and household wired or wireless (Wi-Fi, Bluetooth®, optical or infrared) communication networks.
  • FIG. 6 is a perspective block diagram illustrating a few additional functionalities of the tier 3 user and manufacturer cloud based systems and services, and artificial intelligence analyst module, of FIG. 1. The tier 3 user and manufacturer cloud based systems and services 611, and the artificial intelligence analyst module 613, support a plurality of user devices, gain experience from interactions with them (by collecting data about their preferences, choices and reactions under a variety of contexts) and apply that experience to new and emerging situations. That is, when the contexts are similar, the tier 3 user and manufacturer cloud based systems and services 611, and in particular the artificial intelligence analyst module 613, attempt to determine the response appropriate to a given input and then apply the best possible response.
  • For example, by interacting with thousands of users, the tier 3 user and manufacturer cloud based systems and services 611, and the artificial intelligence analyst module 613, learn that when friends gather in the evening, a party is likely to happen. Because this situation deserves a Day-In-Life recording, the user device 661, 663 or 665, assisted by the tier 3 user and manufacturer cloud based systems and services 611 and the artificial intelligence analyst module 613, takes the decision autonomously (with the support of the decision support server 671, in cases of emergencies).
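The "similar context, reuse the learned response" behavior described above can be sketched with a set-based similarity measure. This is a hedged illustration only; the disclosure does not name a similarity metric, so the Jaccard measure, the cutoff and the context/response labels are all assumptions:

```python
# Hypothetical sketch of experience reuse by the analyst module 613:
# each past situation is a set of context features, and the response of
# the most similar past context is reused when the overlap is strong.

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def best_response(context, experiences, min_sim=0.5):
    best = max(experiences, key=lambda e: jaccard(context, e["context"]))
    if jaccard(context, best["context"]) < min_sim:
        return "ask-user"  # no sufficiently similar past situation
    return best["response"]

experiences = [
    {"context": {"evening", "friends", "music"}, "response": "record-day-in-life"},
    {"context": {"morning", "alone", "news"},    "response": "play-news-summary"},
]

print(best_response({"evening", "friends", "laughter"}, experiences))
print(best_response({"garage"}, experiences))  # nothing similar: ask-user
```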
  • The most important function of the decision support server 671 is to assist the user devices 661, 663 and 665 in functioning autonomously, by taking swift decisions. Its functionality is the same as that of the autonomous decision support module 213 (of FIG. 2); however, in this embodiment, it is implemented as a separate server.
  • For example, when intruders trespass on the business property, especially at night, the decision support server 671 supports autonomous decisions of the user device 661, 663 or 665 (supported by the tier 3 user and manufacturer cloud based systems and services 611, and the artificial intelligence analyst module 613) in emergencies. As a result, the user device 661, 663 or 665 responds rapidly, by switching on all the lights and alarms (for example), thus waking up the users and family members.
  • Finally, the communication between the tier 3 user and manufacturer cloud based systems and services 611, the user devices 661, 663 and 665 and the decision support server 671 occurs via communication networks 691, which include Internet, intranet and household wired or wireless (Wi-Fi, Bluetooth®, optical or infrared) communication networks.
  • FIG. 7 is a perspective block diagram illustrating functionalities of an exemplary wardrobe management role of the mobile units of FIG. 1. In accordance with the present embodiment, the mobile unit (a vacuum cleaner or a custom-built unit) 727 or 715 has a built-in drone 723 or 721 mounted on it. The drone 723 or 721 parks on top of the vacuum cleaner 727 or 715 and charges itself. The drone 723 or 721 and the vacuum cleaner 727 or 715 form a network and, while the drone 723 or 721 is flying, for example, communicate and share information with each other via a wired connection, Bluetooth® or a WiFi connection.
  • The drone 723 or 721 has a plurality of functionalities, such as taking stock of the clothes in the wardrobe 719, suggesting clothes for different types of occasions and interfacing with online retailers for purchasing new clothes or shoes. Similarly, the vacuum cleaner 727 or 715 also has a plurality of functionalities, which include clothes sizing, inner garments sizing, socks and shoes sizing, weight measurement and prediction of weight gain or loss, and height, body volume and BMI (Body Mass Index) measurements, among many other wardrobe 719 related functionalities.
  • For example, the user (or a family member or guest) 717 may come out of the shower and say “Bot, measure me.” The vacuum cleaner 715 then circles around the user 717 and, via infrared light reflections, for example, measures the body volume. The vacuum cleaner 715 may then reply “Your body volume is up by 10% since last month, mostly at your waistline and thighs. Your waistline volume is up by 26% . . . . Would you like clothes sizing information updated?” The vacuum cleaner then updates the clothing and undergarment sizing and, if the user 717 requests, purchases clothes and undergarments for the user 717. Simultaneously, with that updated information sent to the drone 721, the drone 721 checks for any clothes that fit the user 717 and suggests them to the user 717.
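The comparison behind the spoken report in the example above amounts to computing percentage changes against the last stored measurements and flagging the ones that exceed a re-sizing tolerance. The following sketch reproduces the 10%/26% figures from the example; the measurement names, units and 5% tolerance are illustrative assumptions:

```python
# Hypothetical sketch of the measurement comparison: compute the
# percentage change of each body measurement since the last record and
# flag the measurements whose change warrants a sizing update.

def percent_change(previous, current):
    return round((current - previous) / previous * 100, 1)

def sizing_updates(prev, curr, tolerance_pct=5.0):
    """Return measurements whose change exceeds the re-sizing tolerance."""
    return {k: percent_change(prev[k], curr[k])
            for k in prev
            if abs(percent_change(prev[k], curr[k])) > tolerance_pct}

prev = {"body_volume": 70.0, "waist": 80.0, "height": 175.0}
curr = {"body_volume": 77.0, "waist": 100.8, "height": 175.0}

print(sizing_updates(prev, curr))  # body volume +10%, waistline +26%
```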
  • Alternatively, if the user instructs the vacuum cleaner 715 to measure height, weight and BMI (Body Mass Index) periodically, the vacuum cleaner 715 responds by saying “Your periodic measurements are being taken; could you please stand still?” and then measures the height, weight and BMI.
  • FIG. 8 is a flowchart illustrating the processes involved in the autonomous functioning and decision making of the mobile units. The processes begin at a block 807, when the user first puts a brand-new mobile unit to work in residential or business premises. Then, at a next block 809, the mobile unit begins to follow the user and observe the user's behavior. That is, the mobile unit records the user throughout the day, while he or she is present in the residential or business premises, and stores the recordings in the cloud.
  • At a next block 811, the mobile unit analyzes the user's behavioral pattern, by identifying routines in the user's everyday behavior. For the mobile unit, these behavioral patterns are essential for sketching out a plan, and for its autonomous functioning and decision making. At a next block 813, the mobile unit analyzes the scheduled activity. This process involves determining any preexisting (or even manufacturer-set) scheduled activity of the mobile unit, either stored locally or existing in the cloud.
  • At a next block 815, the weather forecast is factored into the process of scheduling the activity. This is important because the execution of some of the scheduled activity is entirely dependent on the weather conditions.
  • Then, at a next block 817, the mobile unit develops a schedule of autonomous operation for itself. Fully finalizing such an autonomous activity schedule may take up to a few weeks. For example, in a simplified scenario, the user may wake up in the morning, finish his or her routine bathroom tasks and switch on the television at about 7 AM. Then, he or she might have breakfast and, in the meanwhile, vacuum the house. So, the morning scheduled activity for the mobile unit would be to switch on the television at 7 AM, and vacuum the house after that.
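The schedule development at block 817 can be sketched as promoting any (hour, activity) pair seen on most observed days into the unit's routine. This is a toy illustration under stated assumptions; the observation format, the 60% ratio and the activity names are hypothetical:

```python
# Hypothetical sketch of block 817: an activity observed at the same
# hour on most observed days is promoted into the autonomous schedule.

from collections import Counter

def develop_schedule(observations, min_ratio=0.6):
    """observations: one list of (hour, activity) pairs per observed day."""
    days = len(observations)
    counts = Counter(pair for day in observations for pair in set(day))
    routine = [pair for pair, n in counts.items() if n / days >= min_ratio]
    return sorted(routine)  # ordered by hour of day

week = [
    [(7, "switch-on-tv"), (8, "vacuum-house")],
    [(7, "switch-on-tv"), (8, "vacuum-house")],
    [(7, "switch-on-tv")],
]
print(develop_schedule(week))
# switch-on-tv seen 3/3 days and vacuum-house 2/3 days: both routine
```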
  • At a next block 819, the mobile unit begins to follow its fully developed scheduled activities and makes decisions based on this logic where need be (for example, to switch on the television only when the user is at the premises). The process ends at the block 821.
  • FIG. 9 is a flowchart illustrating the processes involved in the Day-In-Life functionality of the mobile units. The processes begin at a start block 907. Then, at a next block 909, the mobile unit fetches behavioral analysis and activity schedule data from the processes of autonomous functioning and decision making (described with reference to FIG. 8). Then, at a next block 911, the mobile unit develops a schedule for predictable Day-In-Life events.
  • Further, at a next block 913, the mobile unit predicts the behavioral patterns of the user. The mobile unit then attempts to identify present behaviors and their contexts, at a next block 915. Once the behavioral patterns are predicted, the present behaviors and their contexts identified, and the two compared, the mobile unit decides whether the event taking place is important for the user and offers to record video/audio clips and/or take pictures, at a next block 917. Along with that, the mobile unit also records the salient weather features of the day, at a next block 919.
  • As an example of the abovementioned processes, consider a user who has a habit of waking up at 7 AM, brushing teeth, switching on the television and watching the news (on most days). The mobile unit predicts this from the behavioral analysis and activity schedule data and, occasionally or on days of significant news (detected based on social media activity, and keyword and context based searches, for example), offers to record an audio/video clip or take pictures (along with the news clip). Similarly, when the mobile unit identifies the user and family members excited on a Saturday evening, and identifies friends and relatives arriving, it decides that the ongoing event is important for the user, starts audio/video recording and takes pictures (along with weather conditions). At the end of the day or the next day, the mobile unit allows the user to edit the material before uploading the audio/video, image, news clip and weather condition clip contents to the cloud.
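The importance decision at block 917 — comparing the present context against contexts known to matter — can be sketched as a feature-overlap test. This is a minimal illustration; the context features and the minimum-overlap rule are assumptions rather than anything specified in the disclosure:

```python
# Hypothetical sketch of block 917: compare the present context with
# the contexts of known-important past events; enough overlap means
# the unit offers to record.

def offer_to_record(present_context, important_contexts, min_overlap=2):
    return any(len(present_context & ctx) >= min_overlap
               for ctx in important_contexts)

important = [
    {"saturday", "evening", "guests", "excited"},
    {"morning", "news", "significant-headline"},
]
print(offer_to_record({"saturday", "evening", "guests"}, important))  # True
print(offer_to_record({"tuesday", "alone"}, important))               # False
```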
  • As one of ordinary skill in the art will appreciate, the terms “operably coupled” and “communicatively coupled,” as may be used herein, include direct coupling and indirect coupling via another component, element, circuit, or module where, for indirect coupling, the intervening component, element, circuit, or module does not modify the information of a signal but may adjust its current level, voltage level, and/or power level. As one of ordinary skill in the art will also appreciate, inferred coupling (i.e., where one element is coupled to another element by inference) includes direct and indirect coupling between two elements in the same manner as “operably coupled” and “communicatively coupled.”
  • Although the present invention has been described in terms of GPS coordinates and navigational information communication involving mobile phones and computers, it must be clear that the present invention also applies to other types of devices including mobile devices, laptops with a browser, a hand held device such as a PDA, a television, a set-top-box, a media center at home, robots, robotic devices, vehicles capable of navigation, and a computer communicatively coupled to the network.
  • The present invention has also been described above with the aid of method steps illustrating the performance of specified functions and relationships thereof. The boundaries and sequence of these functional building blocks and method steps have been arbitrarily defined herein for convenience of description. Alternate boundaries and sequences can be defined so long as the specified functions and relationships are appropriately performed. Any such alternate boundaries or sequences are thus within the scope and spirit of the claimed invention.
  • The present invention has been described above with the aid of functional building blocks illustrating the performance of certain significant functions. The boundaries of these functional building blocks have been arbitrarily defined for convenience of description. Alternate boundaries could be defined as long as the certain significant functions are appropriately performed. Similarly, flow diagram blocks may also have been arbitrarily defined herein to illustrate certain significant functionality. To the extent used, the flow diagram block boundaries and sequence could have been defined otherwise and still perform the certain significant functionality. Such alternate definitions of both functional building blocks and flow diagram blocks and sequences are thus within the scope and spirit of the claimed invention.
  • One of average skill in the art will also recognize that the functional building blocks, and other illustrative blocks, modules and components herein, can be implemented as illustrated or by discrete components, application specific integrated circuits, processors executing appropriate software and the like or any combination thereof.
  • Moreover, although described in detail for purposes of clarity and understanding by way of the aforementioned embodiments, the present invention is not limited to such embodiments. It will be obvious to one of average skill in the art that various changes and modifications may be practiced within the spirit and scope of the invention, as limited only by the scope of the appended claims.

Claims (20)

What is claimed is:
1. A mobile electronic system network, comprising:
a plurality of user's mobile devices;
a cloud based central support and services server system; and
the plurality of user's mobile devices performs a multitude of autonomous functionalities, comprising:
wardrobe management, comprising:
clothes, garments and shoes sizing;
waist, height, body volume and BMI (Body Mass Index) measurements;
predicting weight gain or loss;
taking stock of clothes in the wardrobe;
suggesting clothes for different types of occasions; and
interfacing with online retailers for new clothes or shoes purchasing; and
the cloud based central support and services server system provides support and services to the plurality of user's mobile devices.
2. The mobile electronic system network of claim 1, wherein the multitude of autonomous functionalities further comprising:
premises safety, comprising:
smoke detection;
fire inspection;
alarm triggering;
identifying the source of smoke, unusual odor and fire;
alerting about smoke, unusual odor and fire; and
extinguishing the fire.
3. The mobile electronic system network of claim 1, wherein the multitude of autonomous functionalities further comprising:
sentry job, comprising:
keeping a vigil on indoors of the premises;
identifying intruders, indoors and outdoors;
patrolling indoor and outdoor, within the premises;
investigating sounds or flash lights outdoors of the premises;
identifying the source of intrusion; and
waking up and alerting the users.
4. The mobile electronic system network of claim 1, wherein the multitude of autonomous functionalities further comprising:
self-recharging, comprising:
identifying typical electric outlet;
plugging itself in;
identifying recharging station; and
plugging itself in.
5. The mobile electronic system network of claim 1, wherein the multitude of autonomous functionalities further comprising:
self-learning, to predict the users' behaviors; and
applying the learnt knowledge in the future interactions.
6. The mobile electronic system network of claim 1, wherein the multitude of
autonomous functionalities further comprising providing AI (Artificial Intelligence) assistance.
7. The mobile electronic system network of claim 1, wherein the multitude of autonomous functionalities further comprising rendering personalized music.
8. The mobile electronic system network of claim 1, wherein the multitude of autonomous functionalities further comprising diagnosing the users' illnesses.
9. The mobile electronic system network of claim 1, wherein the cloud based central support and services server system further comprising an artificial intelligence analyst module that collects data about a plurality of users' preferences, choices and reactions, under a variety of contexts, and applies this knowledge to new and emerging situations.
10. The mobile electronic system network of claim 1, wherein the user's mobile devices comprising vacuum cleaners.
11. The mobile electronic system network of claim 1, wherein the user's mobile devices comprising lawn mowers.
12. The mobile electronic system network of claim 1, wherein the user's mobile devices comprising drones.
13. A mobile electronic network infrastructure, comprising:
a plurality of user's mobile devices;
a cloud based central support and services server system; and
the plurality of user's mobile devices performs Day-In-Life recording functionalities, comprising:
predicting users' behaviors;
identifying routine and salient features of the day ahead of time;
identifying important events ahead of time, by observing the user contexts; and
electronically recording and storing the routine events, salient features and important events; and
the cloud based central support and services server system provides support and services to the plurality of user's mobile devices.
14. The mobile electronic network infrastructure of claim 13, wherein the predicting user's behavior comprising:
observing the user's behavior through the day;
developing a schedule for the observed routine behaviors; and
developing a schedule for anniversaries and repeating events.
15. The mobile electronic network infrastructure of claim 13, wherein the Day-In-Life recording functionalities further comprising recording and storing news of the day.
16. The mobile electronic network infrastructure of claim 13, wherein the Day-In-Life recording functionalities further comprising recording and storing weather conditions of the day.
17. The mobile electronic network infrastructure of claim 13, wherein the Day-In-Life recording functionalities further comprising time stamping the recordings.
18. A method performed by a user's device, to produce its autonomous functioning and decision making, the method comprising:
following the user and observing the pattern of behaviors;
analyzing the pattern of behaviors;
analyzing the scheduled activities;
factoring in the weather conditions;
developing a schedule of autonomous operation; and
operating autonomously, in accordance with the scheduled activities.
19. The method of claim 18, wherein the observing the pattern of behaviors comprising recording the user behavior and comparing it with the present behavior for any similarities.
20. The method of claim 18, wherein the observing the pattern of behaviors comprising recording the user behavior contexts and comparing it with the present behavior contexts for any similarities.
US15/622,707 2016-06-15 2017-06-14 Multifunction mobile units Abandoned US20170364828A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/622,707 US20170364828A1 (en) 2016-06-15 2017-06-14 Multifunction mobile units

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662350187P 2016-06-15 2016-06-15
US15/622,707 US20170364828A1 (en) 2016-06-15 2017-06-14 Multifunction mobile units

Publications (1)

Publication Number Publication Date
US20170364828A1 true US20170364828A1 (en) 2017-12-21

Family

ID=60659657

Family Applications (7)

Application Number Title Priority Date Filing Date
US15/622,516 Active 2038-01-21 US10726103B2 (en) 2016-06-15 2017-06-14 Premises composition and modular rights management
US15/622,707 Abandoned US20170364828A1 (en) 2016-06-15 2017-06-14 Multifunction mobile units
US15/622,531 Abandoned US20170364091A1 (en) 2016-06-15 2017-06-14 Modular multitier mobile units
US15/622,554 Active US10127362B2 (en) 2016-06-15 2017-06-14 Pool mobile units
US15/622,573 Abandoned US20170364924A1 (en) 2016-06-15 2017-06-14 Mobile units for furnishing, repairing and refurbishing residences
US16/114,379 Active US10942989B2 (en) 2016-06-15 2018-08-28 Pool mobile units
US16/919,506 Abandoned US20200334340A1 (en) 2016-06-15 2020-07-02 Premises composition & modular rights management

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/622,516 Active 2038-01-21 US10726103B2 (en) 2016-06-15 2017-06-14 Premises composition and modular rights management

Family Applications After (5)

Application Number Title Priority Date Filing Date
US15/622,531 Abandoned US20170364091A1 (en) 2016-06-15 2017-06-14 Modular multitier mobile units
US15/622,554 Active US10127362B2 (en) 2016-06-15 2017-06-14 Pool mobile units
US15/622,573 Abandoned US20170364924A1 (en) 2016-06-15 2017-06-14 Mobile units for furnishing, repairing and refurbishing residences
US16/114,379 Active US10942989B2 (en) 2016-06-15 2018-08-28 Pool mobile units
US16/919,506 Abandoned US20200334340A1 (en) 2016-06-15 2020-07-02 Premises composition & modular rights management

Country Status (1)

Country Link
US (7) US10726103B2 (en)

ES2706729T3 (en) * 2005-12-02 2019-04-01 Irobot Corp Robot system
US20080084317A1 (en) * 2006-10-06 2008-04-10 Kimberly-Clark Worldwide, Inc. RFID-based methods and systems to enhance personal safety
DE102007053311A1 (en) * 2007-06-21 2008-12-24 Robert Bosch Gmbh Drive system for a robotic vehicle
US8245617B2 (en) * 2007-08-07 2012-08-21 Engineering Science Analysis Corporation Non-lethal restraint device with diverse deployability applications
US9811849B2 (en) * 2007-09-28 2017-11-07 Great-Circle Technologies, Inc. Contextual execution of automated workflows
US7839291B1 (en) * 2007-10-02 2010-11-23 Flir Systems, Inc. Water safety monitor systems and methods
US8237574B2 (en) * 2008-06-05 2012-08-07 Hawkeye Systems, Inc. Above-water monitoring of swimming pools
KR101040193B1 (en) * 2008-10-09 2011-06-09 한국전자통신연구원 Method for offering service in pervasive computing environement and apparatus thereof
US9305445B1 (en) * 2009-07-10 2016-04-05 Jeffrey L. Hanning Alarm system for passageways
US9105169B2 (en) * 2009-07-10 2015-08-11 Jeffrey L. Hanning Alarm system for passageways
US8295979B2 (en) * 2010-01-06 2012-10-23 Deere & Company Adaptive scheduling of a service robot
JP2013524352A (en) * 2010-03-31 2013-06-17 セキュリティー ファースト コーポレイション System and method for securing data in motion
CN106639392A (en) * 2010-12-10 2017-05-10 亨沃工业公司 Power supplies for pool and spa equipment
US10803724B2 (en) * 2011-04-19 2020-10-13 Innovation By Imagination LLC System, device, and method of detecting dangerous situations
US20140282786A1 (en) * 2013-03-12 2014-09-18 Time Warner Cable Enterprises Llc Methods and apparatus for providing and uploading content to personalized network storage
EP3008707A4 (en) * 2013-07-10 2017-04-05 Seal Innovation, Inc. Water safety monitoring systems and related methods
ES2613138T3 (en) * 2013-08-23 2017-05-22 Lg Electronics Inc. Robot cleaner and method to control it
US9641285B2 (en) * 2014-03-06 2017-05-02 Samsung Electronics Co., Ltd. Ultra low power (ULP) decoder and decoding processing
US9633547B2 (en) * 2014-05-20 2017-04-25 Ooma, Inc. Security monitoring and control
EP3155790A1 (en) * 2014-06-13 2017-04-19 Zodiac Pool Systems, Inc. Controlling aspects of pools and spas
US9672716B2 (en) * 2014-07-01 2017-06-06 Clarke V Carroll Swim-A-Sure system and device
WO2016007858A2 (en) * 2014-07-11 2016-01-14 Matko Michelle Anna Alert system for children within proximity of a pool or water
US20210256614A1 (en) * 2014-09-22 2021-08-19 State Farm Mutual Automobile Insurance Company Theft identification and insurance claim adjustment using drone data
WO2016049079A1 (en) * 2014-09-22 2016-03-31 Homdna, Inc. Apparatus, system and method for electronic interrelating of a home and the goods and services within it
CA2968018A1 (en) * 2014-11-18 2016-05-26 Jalousier Ou Device and method for blind control and automation
KR102314637B1 (en) * 2015-03-23 2021-10-18 엘지전자 주식회사 Robot cleaner, and robot cleaning system
US9944366B2 (en) * 2015-05-19 2018-04-17 Rujing Tang Unmanned aerial vehicle system and methods for use
US20170102681A1 (en) * 2015-10-13 2017-04-13 Google Inc. Coordinating energy use of disparately-controlled devices in the smart home based on near-term predicted hvac control trajectories
EP3375146B1 (en) * 2015-11-11 2020-09-16 Telefonaktiebolaget LM Ericsson (publ) Systems and methods relating to a smart home manager
US20170167151A1 (en) * 2015-12-10 2017-06-15 Elazar Segal Lifesaving system and method for swimming pool
US9963230B2 (en) * 2016-01-11 2018-05-08 The Procter & Gamble Company Aerial drone cleaning device and method of cleaning a target surface therewith
ES2797774T3 (en) * 2016-01-26 2020-12-03 Maytronics Ltd Method to operate an interactive pool cleaning robot
US10452037B2 (en) * 2016-03-30 2019-10-22 Lenovo (Singapore) Pte. Ltd. Apparatus, method, and program product for controlling appliances
WO2017182047A1 (en) * 2016-04-18 2017-10-26 Nec Europe Ltd. Method for operating one or more service systems
US10726103B2 (en) * 2016-06-15 2020-07-28 James Duane Bennett Premises composition and modular rights management
US20180188695A1 (en) * 2016-12-30 2018-07-05 Qualcomm Incorporated User location and activity based smart reminders
US11504607B2 (en) * 2019-02-05 2022-11-22 Deep Innovations Ltd. System and method for using a camera unit for the pool cleaning robot for safety monitoring and augmented reality games

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11130409B1 (en) 2017-11-30 2021-09-28 Hydro-Gear Limited Partnership Automatic performance learning system for utility vehicles
US10935943B2 (en) 2018-08-08 2021-03-02 International Business Machines Corporation Combining home automation and UAV technology
US20210369068A1 (en) * 2018-10-16 2021-12-02 Samsung Electronics Co., Ltd. Robotic cleaner and controlling method therefor
US11457558B1 (en) 2019-05-15 2022-10-04 Hydro-Gear Limited Partnership Autonomous vehicle navigation
US11849668B1 (en) 2019-05-15 2023-12-26 Hydro-Gear Limited Partnership Autonomous vehicle navigation
US20230237430A1 (en) * 2020-02-07 2023-07-27 eMeasurematics Inc. Drone-based inventory management methods and systems
US11073362B1 (en) * 2020-08-24 2021-07-27 King Abdulaziz University Distributed airborne acoustic anti drone system (DAAADS)
US11118870B1 (en) * 2020-08-24 2021-09-14 King Abdulaziz University Blimp-deployed anti-drone system
US11307003B2 (en) * 2020-08-24 2022-04-19 King Abdulaziz University Blimp-based aerial UAV defense system
US20220214145A1 (en) * 2020-08-24 2022-07-07 King Abdulaziz University Method to identify routes of unmanned aerial vehicles approaching a protected site
US11421965B2 (en) * 2020-08-24 2022-08-23 King Abdulaziz University Method to identify routes of unmanned aerial vehicles approaching a protected site

Also Published As

Publication number Publication date
US20170364667A1 (en) 2017-12-21
US10726103B2 (en) 2020-07-28
US20170364924A1 (en) 2017-12-21
US10127362B2 (en) 2018-11-13
US20180365394A1 (en) 2018-12-20
US10942989B2 (en) 2021-03-09
US20200334340A1 (en) 2020-10-22
US20170365150A1 (en) 2017-12-21
US20170364091A1 (en) 2017-12-21

Similar Documents

Publication Publication Date Title
US20170364828A1 (en) Multifunction mobile units
US20240062081A1 (en) Private artificial intelligence (ai) model of a user for use by an autonomous personal companion
JP6984004B2 (en) Continuous selection of scenarios based on identification tags that describe the user's contextual environment for the user's artificial intelligence model to run through an autonomous personal companion.
Skouby et al. Smart cities and the ageing population
US20200167631A1 (en) Human-Robots: The New Specie
US20190102667A1 (en) Modular hierarchical vision system of an autonomous personal companion
US20210097893A1 (en) Technical solutions for customized tours
Honig et al. Toward socially aware person-following robots
Zielonka et al. Smart homes: How much will they support us? A research on recent trends and advances
US10405745B2 (en) Human socializable entity for improving digital health care delivery
CN110139732A (en) Social robot with environmental Kuznets Curves feature
CN107431649B (en) The method and system realized for resident's strategy in intelligent household's environment
JP7346019B2 (en) Systems and methods in object history association
WO2016202524A1 (en) Device for assisting a user in a household
JP7281198B2 (en) Autonomous behavioral robot that acts based on experience
JP7375770B2 (en) Information processing device, information processing method, and program
JP7160092B2 (en) Information processing device, information processing method, program, and autonomous action robot control system
US20230196691A1 (en) Extended reality 3d space monitoring
CA3104823C (en) Network activity validation
US11557142B1 (en) Home wildlife deterrence
CN208337600U (en) Cloud wired home brain
US20230196692A1 (en) Coordinating extended reality 3d space
US20230196681A1 (en) Extended reality 3d space creation and management
Wang et al. The Future Home in the 5G Era: Next Generation Strategies for Hyper-connected Living
Sung Towards the human-centered design of everyday robots

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION