US20230196144A1 - Robotic h matrix creation - Google Patents

Robotic h matrix creation

Info

Publication number
US20230196144A1
US20230196144A1 (application US18/112,349)
Authority
US
United States
Prior art keywords
robot
training
frequency response
data
module
Prior art date
Legal status
Pending
Application number
US18/112,349
Inventor
Michel Allegue Martinez
Negar Ghourchian
David Grant
Francois Morel
Pascal Paradis-Theberge
Current Assignee
Aerial Technologies Inc
Original Assignee
Aerial Technologies Inc
Priority date
Filing date
Publication date
Application filed by Aerial Technologies Inc filed Critical Aerial Technologies Inc
Priority to US18/112,349
Assigned to AERIAL TECHNOLOGIES INC. reassignment AERIAL TECHNOLOGIES INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MOREL, FRANCOIS, GHOURCHIAN, Negar, GRANT, DAVID, MARTINEZ, MICHEL ALLEGUE, PARADIS-THEBERGE, PASCAL
Publication of US20230196144A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 5/00: Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S 5/01: Determining conditions which influence positioning, e.g. radio environment, state of motion or energy consumption
    • G01S 5/017: Detecting state or type of motion
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 5/00: Computing arrangements using knowledge-based models
    • G06N 5/04: Inference or reasoning models
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1628: Programme controls characterised by the control loop
    • B25J 9/163: Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 5/00: Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S 5/02: Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S 5/0284: Relative positioning
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00: Machine learning

Definitions

  • FIG. 1 illustrates an exemplary network environment in which a system for robotic training regarding Wi-Fi motion detection may be implemented.
  • FIG. 2 illustrates an exemplary profile database.
  • FIG. 3 illustrates an exemplary training database.
  • FIG. 4 is a flowchart illustrating an exemplary method of robotic training.
  • FIG. 5 is a flowchart illustrating an exemplary method of training correction.
  • FIG. 6 is a flowchart illustrating an exemplary method of detecting training variation.
  • FIG. 7 is an illustration of an exemplary method of execution module, according to various embodiments.
  • FIG. 8 is a flowchart illustrating an exemplary method of speed sensitivity testing.
  • FIG. 9 is a flowchart illustrating an exemplary method of location sensitivity testing.
  • FIG. 10 illustrates an exemplary IoT device database.
  • FIG. 11 is a flowchart illustrating an exemplary method of IoT device management for Wi-Fi motion detection.
  • FIG. 12 is a flowchart illustrating an exemplary method of IoT device installation.
  • FIG. 13 is a flowchart illustrating an exemplary method of IoT location tracking.
  • FIG. 14 is a flowchart illustrating an exemplary method of IoT activity tracking.
  • a simple robot can be put in a room and programmed not to move except in accordance with specific programmed commands. Such commands may be sent to the robot to move at a certain rate, so that the resulting change can be seen in the channel response.
  • a data set may be built over time, where the robot is programmed to move at specific times, for specific durations, and by specific amounts. Such robot motion may also be iterated.
  • the algorithm records the impulse response changes associated with the robot's movements, and a database may be built based on such recorded and associated changes. A minimal sketch of this data collection appears below.
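  • By way of illustration only, the following Python sketch shows how scripted robot commands might be paired with recorded channel-response snapshots to build such a data set. The patent discloses no source code; every name here (TrainingSample, record_channel_response) is a hypothetical stand-in.

```python
# Hypothetical sketch: pair a scripted robot action with the channel
# response recorded while the robot performs it.
import time
from dataclasses import dataclass, field

@dataclass
class TrainingSample:
    action_name: str                      # e.g. "wave hand"
    start_time: float
    end_time: float = 0.0
    channel_response: list = field(default_factory=list)

def record_channel_response():
    # Placeholder for reading impulse/frequency response data from the
    # Wi-Fi chipset; a real agent would return measured CSI values.
    return [0.0] * 64

def run_training_pass(action_name, duration_s=2.0):
    sample = TrainingSample(action_name, start_time=time.time())
    t_end = time.time() + duration_s
    while time.time() < t_end:            # monitor only while the robot moves
        sample.channel_response.append(record_channel_response())
        time.sleep(0.05)
    sample.end_time = time.time()
    return sample

dataset = [run_training_pass("wave hand"), run_training_pass("jump in one place")]
```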
  • FIG. 1 illustrates an exemplary network environment in which a system for robotic training regarding Wi-Fi motion detection may be implemented.
  • Such network environment may include a wireless access point 102 , which may be a Wi-Fi access point.
  • the wireless access point 102 is an 802.11n access point.
  • the wireless transceiver of the wireless access point 102 is in communication with the further stationary device over a corresponding further one of the at least one radio frequency communication link.
  • the wireless access point 102 is configured to record a further channel state information data set for the further one of the at least one radio frequency communication link at a corresponding time.
  • determining the activity of the person in the environment includes determining the activity of the person in the environment based on a comparison of the further channel state information data set to each of the at least one channel state information profile of each of the plurality of activity profiles. In an embodiment, the activity is determined based on a sum of a similarity measurement of the channel state information data set and a similarity measurement of the further channel state information data set.
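  • A minimal sketch of the similarity-sum comparison described above follows, assuming cosine similarity as the measure (the patent does not specify one):

```python
# Hypothetical sketch: the activity whose profiles maximize the summed
# similarity of the two channel state information (CSI) data sets wins.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def identify_activity(csi, further_csi, activity_profiles):
    # activity_profiles: {"walking": (profile_csi, further_profile_csi), ...}
    best_activity, best_score = None, float("-inf")
    for activity, (p1, p2) in activity_profiles.items():
        score = cosine_similarity(csi, p1) + cosine_similarity(further_csi, p2)
        if score > best_score:
            best_activity, best_score = activity, score
    return best_activity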
  • a central processing unit (CPU) 104 is the electronic circuitry within a computer that carries out the instructions of a computer program by performing the basic arithmetic, logic, controlling and input/output (I/O) operations specified by the instructions.
  • a graphics processing unit (GPU) 106 is a specialized electronic circuit designed to rapidly manipulate and alter memory to accelerate the creation of images in a frame buffer intended for output to a display device. GPUs 106 are used in embedded systems, mobile phones, personal computers, workstations, and game consoles. Modern GPUs 106 are very efficient at manipulating computer graphics and image processing. The highly parallel structure makes such processors more efficient than general-purpose CPUs 104 for algorithms that process large blocks of data in parallel.
  • a digital signal processor (DSP) 108 is a specialized microprocessor (or a SIP block), with associated architecture optimized for the operational needs of digital signal processing.
  • the DSP 108 may measure, filter or compress continuous real-world analog signals.
  • An application program interface (API) 110 is a set of routines, protocols, and tools for building software applications. Basically, an API 110 specifies how software components could interact. Additionally, APIs 110 are used when programming graphical user interface (GUI) components. The API 110 may provide access to the channel state data to the agent 114 .
  • a wireless access point 102 compliant with either 802.11b or 802.11g, using an omnidirectional antenna might have a range of 100 m (0.062 mi).
  • the same radio 112 with an external semi-parabolic antenna (15 dB gain) and a similarly equipped receiver at the far end might have a range of over 20 miles.
  • An agent 114 may be a separate device or an integrated module executable to collect data from the Wi-Fi chipset of wireless access point 102 , filter the incoming data, and pass such data to the cloud server 120 for activity identification. Depending on the configuration, the activity identification can be done on the edge, at the level of the agent 114 , or in the cloud server 120 , or some combination of the two.
  • a local profile database 116 is utilized when at least a portion of the activity identification is done on the edge.
  • the activity identification could be a simple motion/no-motion determination profile, or a plurality of profiles for identifying activities, objects, individuals, biometrics, etc.
  • An activity identification module 118 distinguishes between walking activities and in-place activities.
  • a walking activity causes significant pattern changes of the impulse response amplitude over time, since walking generally involves significant body movements and location changes.
  • an in-place activity (such as watching TV from a sofa) involves only relatively small body movements and may not cause significant amplitude changes, instead presenting certain repetitive patterns within the impulse response measurements.
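  • As a purely illustrative reading of this distinction, the following sketch uses amplitude variance to flag walking and autocorrelation to flag repetitive in-place activity; the thresholds are assumptions, not values from the patent:

```python
# Hypothetical walking vs. in-place heuristic over amplitude samples.
def autocorrelation(x, lag):
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    if var == 0 or lag >= n:
        return 0.0
    return sum((x[i] - mean) * (x[i + lag] - mean)
               for i in range(n - lag)) / ((n - lag) * var)

def classify_motion(amplitudes, var_threshold=0.5, repeat_threshold=0.6):
    mean = sum(amplitudes) / len(amplitudes)
    variance = sum((a - mean) ** 2 for a in amplitudes) / len(amplitudes)
    if variance > var_threshold:
        return "walking"              # large amplitude pattern changes
    lags = range(1, len(amplitudes) // 2)
    if lags and max(autocorrelation(amplitudes, lag) for lag in lags) > repeat_threshold:
        return "in-place"             # small but repetitive movements
    return "no motion"
```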
  • a cloud server 120 may analyze and create profiles describing various activities. As illustrated, the cloud server 120 may include a profile database 122 , device database 124 , profile module 126 , training database 128 , and training module 130 (which may further include correction module 132 and variation module 134 ).
  • a profile database 122 may be utilized when at least a portion of the activity identification is done in the cloud server 120 or when a profile is sent to cloud server 120 for storage.
  • the activity identification could be a simple motion/no-motion determination profile, or a plurality of profiles for identifying activities, objects, individuals, biometrics, etc.
  • a device database 124 may store the device ID of all connected wireless access points 102 .
  • one or more of the devices may be IoT devices (e.g., IoT device database of FIG. 10 ).
  • a profile module 126 monitors the data set resulting from continuous monitoring of a target environment to identify multiple similar instances of an activity without a matching profile in such a data set, to combine such data with user feedback, to label the resulting clusters, and to define new profiles that are then added to the profile database 122 .
  • a training database 128 may store programming for a training robot 136 , and the stored programming may include executable commands each representing a different human action. These stored programs or executable commands are sent to a training robot 136 to make the training robot 136 perform certain actions. For example, the training database 128 may store programming or executable commands for a training robot 136 to “walk slowly” or “jump in one place”. The training database 128 could store the actual code or programming required to control the movements of the training robot 136 , thereby limiting the need for programs or code to be stored on the training robot 136 . Alternatively, the training database 128 may store just a command to execute a program stored in the memory of the training robot 136 . Furthermore, the training database 128 also stores the signal data that is monitored during the tests, including the speed sensitivity data and location sensitivity data.
  • a training module 130 may select an action from the training database 128 and send the selected programming to the training robot 136 .
  • the training robot 136 may then perform the programmed response.
  • the training module 130 then monitors the signal of the target environment of the training robot 136 .
  • the monitored signal data is then compared to the profile database 122 to see if there is similar monitored signal data stored in the profile database 122 , and the training module 130 determines whether the action by the training robot 136 matches the action associated with data in the profile database 122 based on the comparison. If the monitored data and action do not match, the correction module 132 is initiated and the new data and action are added to the profile database 122 .
  • the training module 130 selects the first program from the training database 128 , which makes the training robot 136 “walk” at a predetermined standard pace. As the training robot 136 is performing the programmed motions the signal data is monitored and then compared to the profile database 122 to see if there is a matching “walking” action. If there is no “walking” action in the profile database 122 , a “walking” action may be added. If there is a match, the new data is added to the profile database 122 to update the “walking” profile with the new signal data.
  • a correction module 132 is initiated when the action performed by the training robot 136 has no matching action in the profile database 122 but there is matching signal data. For example, a training robot 136 performs a predefined walking function and that signal data is captured, but there is no matching “walking” action in the profile database 122 while the new signal data matches the action “jumping” in the profile database 122 . To prevent duplicate signals, the “walking” action and associated signal data are added to the profile database 122 , and the correction module 132 performs, captures, and updates the signal data for a “jumping” action.
  • a variation module 134 is executed by the training module 130 and re-runs the exact same training program.
  • the variation module 134 evaluates the difference between the first training run by the training module 130 and the second identical training run by the variation module 134 to identify and train the system on any possible variations in the frequency response data.
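  • A minimal sketch of such a run-to-run comparison, assuming mean absolute difference as the variation metric (an illustrative choice, not the patent's):

```python
# Hypothetical variation measurement between two recordings of the
# identical training program.
def variation_between_runs(first_run, second_run):
    n = min(len(first_run), len(second_run))
    if n == 0:
        return 0.0
    return sum(abs(a - b) for a, b in zip(first_run, second_run)) / n
```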
  • a training robot 136 is designed to mimic the human body and human movements.
  • the training robot 136 receives programmed commands from the training module 130 .
  • the program tells the training robot 136 to make a series of movements that mimic human motions. Unlike a human, the training robot 136 can perform a task repeatedly in exactly the same way to produce a better data set. Alternatively, the training robot 136 could be programmed to make slight alterations to its movements to create a more human-like movement.
  • a memory 138 stores software and program code that controls the training robot 136 .
  • the memory 138 may store all of the programming required for controlling the training robot 136 or can store programming sent from a remote user or the training module 130 .
  • An antenna 140 receives commands and data from a remote user or directly from the training module 130 .
  • An execution module 142 receives the program from the training module 130 or correction module 132 and executes the received program.
  • the execution module 142 is stored on the memory 138 of the training robot 136 .
  • the execution module 142 further sends a signal to the training module 130 or correction module 132 upon completion of program execution.
  • a speed sensitivity module 144 is called by the execution module 142 to vary the speed of the training robot 136 (e.g., moving from slow to fast) to determine how the monitored Wi-Fi signal changes as the speed of movement deviates from a predefined speed. As such, the system can test for and eliminate speed sensitivity.
  • the speed sensitivity module 144 can be stored and executed, either from the training robot 136 or from the cloud server 120 .
  • a location sensitivity module 146 may be called by the execution module 142 , and the location of the training robot 136 may be varied as the programmed movement is performed. Specifically, the location sensitivity module 146 could move the training robot 136 away from and towards the wireless access point 102 . Such movement may allow for determination of when the monitored signal changes. As such, the system can test for and eliminate location sensitivity.
  • the location sensitivity module 146 can be stored and executed, either from the training robot 136 or from the cloud server 120 .
  • FIG. 2 illustrates an exemplary profile database (e.g., profile database 122 ).
  • the profile database 122 is utilized when at least a portion of the activity identification is done in the cloud server 120 .
  • the profile database 122 may store a profile program number 200 and a profile action name 202 that describes the activity or action that is represented by the frequency response data.
  • the profile database 122 may store the frequency response data that is associated with the action as the profile signal data 204 .
  • Profile database 122 may further store speed sensitivity profile data and location sensitivity profile data.
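  • For concreteness, a hypothetical SQLite sketch of these profile database fields follows; the patent does not disclose an actual schema:

```python
# Illustrative schema only; column names mirror items 200, 202, and 204.
import sqlite3

profile_db = sqlite3.connect(":memory:")
profile_db.execute("""
    CREATE TABLE profile (
        profile_program_number    INTEGER PRIMARY KEY,  -- item 200
        profile_action_name       TEXT NOT NULL,        -- item 202
        profile_signal_data       BLOB,                 -- item 204
        speed_sensitivity_data    BLOB,
        location_sensitivity_data BLOB
    )
""")
```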
  • FIG. 3 illustrates an exemplary training database (e.g., training database 128 ).
  • the training database 128 is utilized to store programming for a training robot 136 and the monitored frequency response data.
  • the training database 128 stores software code or an execution command representing each different action that can be sent to the training robot 136 . These actions represent human movements.
  • the training database 128 further stores the name of the action or movement performed (e.g., wave hand, jump, kick leg out, etc.), the execution command or program code, and the frequency response data for three different tests: a normal test, a speed sensitivity test, and a location sensitivity test.
  • training database 128 may store data regarding training program number, training action name, training program or execution code, normal training data, speed sensitivity training data, location sensitivity training data, normal variation data, speed variation data, and location variation data.
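  • A corresponding hypothetical SQLite sketch of the training database columns listed above:

```python
# Illustrative schema only; the patent does not disclose a schema.
import sqlite3

training_db = sqlite3.connect(":memory:")
training_db.execute("""
    CREATE TABLE training (
        training_program_number   INTEGER PRIMARY KEY,
        training_action_name      TEXT NOT NULL,  -- e.g. "wave hand"
        training_program_code     TEXT,           -- e.g. "wavehand.exe"
        normal_training_data      BLOB,
        speed_sensitivity_data    BLOB,
        location_sensitivity_data BLOB,
        normal_variation_data     BLOB,
        speed_variation_data      BLOB,
        location_variation_data   BLOB
    )
""")
```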
  • FIG. 4 is a flowchart illustrating an exemplary method of robotic training.
  • the process begins with initiation of the training module 130 (e.g., based on user input received via a user interface of a user device) in step 400 .
  • the user interface could be associated with a remote terminal or a mobile device.
  • the training program in the training database 128 is selected at step 402 .
  • the selected training program is sent to the execution module 142 on the training robot 136 .
  • such selection may be “wavehand.exe” at step 404 .
  • the training module 130 then begins monitoring the frequency response signal data at the target environment where the training robot 136 is located.
  • the frequency response data is only monitored while the training robot 136 performs the training program at step 406 . While monitoring frequency response data, the training module 130 waits for a command from the training robot 136 that a “normal” program has been completed. The command tells the training module 130 to stop monitoring the frequency response data. For example, the training program for mimicking waving a hand (e.g., “wave hand”) is sent to the training robot 136 . The training module 130 could then immediately begin to monitor the frequency response data until the “completed normal program” command is received.
  • the monitored data could now represent the motion of waving a hand at step 408 .
  • the training module 130 then stores the monitored frequency response data into the training database 128 with the program that was originally selected in step 410 .
  • the frequency response data that was monitored from the time the training program was executed to receiving the “completed normal program” command from the training robot 136 may be stored in the training database 128 with the “wave hand” training program.
  • the “wave hand” training program could now have frequency response data stored in the training database 128 under the normal training signal data at step 410 .
  • the training module 130 then begins monitoring the frequency response signal data again at the target environment where the training robot 136 is located. The frequency response data may only be monitored while the training robot 136 performs the training program that was sent.
  • One aspect of monitoring the signal data again is associated with the training robot 136 performing the program but at different speeds at step 412 . While monitoring frequency response data, the training module 130 waits for a command from the training robot 136 that a “speed sensitivity” program has been completed.
  • the command tells the training module 130 to stop monitoring the frequency response data at step 414 .
  • the training module 130 then stores the monitored frequency response data into the training database 128 with the program that was originally selected. For example, the frequency response data that was monitored between the time the training program was executed and the time that the “completed speed sensitivity program” command was received from the training robot 136 is stored in the training database 128 with the “wave hand” training program.
  • the “wave hand” training program could now have frequency response data stored in the training database 128 under the speed sensitivity training data at step 416 .
  • the training module 130 may then begin monitoring the frequency response signal data again at the target environment where the training robot 136 is located. The frequency response data is only monitored while the training robot 136 performs the training program that was sent.
  • One aspect of monitoring the signal data includes the training robot 136 performing the program but at different distances from the access point 102 at step 418 . While monitoring frequency response data, the training module 130 waits for a command from the training robot 136 that a “location sensitivity” program has been completed. The command tells the training module 130 to stop monitoring the frequency response data at step 420 . The training module 130 then stores the monitored frequency response data into the training database 128 with the program that was originally selected. For example, the frequency response data that was monitored between the time the training program was executed and the time the “completed location sensitivity program” command was received from the training robot 136 is stored in the training database 128 with the “wave hand” training program.
  • the “wave hand” training program could now have frequency response data stored in the training database 128 under the location sensitivity training data at step 422 .
  • the variation module 134 is then executed.
  • the variation module 134 is executed at the end of the training module 130 and before the training module 130 moves on to the next training program.
  • the variation module 134 re-runs the same test program as the training module 130 , monitors the data during the identical test and then compares the frequency response data from training module 130 to the frequency response data collected while running the variation module 134 .
  • the training module 130 then updates variations in the two data sets and updates the training database 128 at step 424 .
  • the number of the training program “n” is sent to the variation module regarding which training program to run at step 426 .
  • the training module 130 then compares the training action name from the training database 128 with the profile action names 202 in the profile database 122 .
  • the action names are compared to see if there is already a similar action in the profile database 122 .
  • the training module 130 could compare the action name “wave hand”, which was the training program that was executed, to see if there was a matching action in the profile database 122 at step 428 .
  • the training module 130 then checks to see if the selected training action name matches any profile action names (e.g., “wave hand”) at step 430 . If the training action name and a profile action name 202 match, the training module 130 updates the profile signal data 204 with the training signal data in the profile database 122 . For example, if there is already a “wave hand” action in the profile database 122 , the new signal data collected during the training program is added to the “wave hand” profile in the profile database 122 at step 432 . If there is no match of the training action name and the profile action names, the training module 130 then compares the stored training signal data with the profile signal data 204 to see if there is similar signal data stored under a different action name in the profile database 122 . One consideration is to make sure that there are no duplicates of the signal data in the profile database 122 for the same action, even if the action names do not match.
  • the training module 130 selects the next program (n+1) in the training database 128 until there are no more programs to run from the training database 128 at step 438 .
  • the system adds the data from the training database 128 to the profile database 122 to create a new action (e.g., wave hand) at step 440 .
  • the training module 130 selects the profile action name 202 where the training signal data matched the profile signal data 204 at step 442 .
  • the profile action name 202 that was selected is then sent to the correction module 132 , and the correction module is executed at step 444 .
  • the correction module 132 runs and corrects signal data in the profile database 122 that matched the training signal data. For example, if the “wave hand” training signal data matches the profile signal data 204 for “kick out leg”, then the correction module 132 could find the corresponding training program in the training database 128 , run the training program for “kick out leg” with the training robot 136 , monitor the new signal, and update “kick out leg” with the new signal data.
  • One aspect is to eliminate any duplicate signal data at step 446 .
  • the correction module 132 sends the corrected signal and action data back, and such data is updated in the profile database 122 at step 448 .
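  • For orientation, the following Python sketch condenses the FIG. 4 flow described above; the helper callables and dictionary layout are assumptions, and the step numbers in comments refer to the description:

```python
# Hypothetical condensation of the FIG. 4 training loop.
def train_all(training_db, profile_db, run_and_monitor, correct):
    for program in training_db:                           # steps 402-426
        signal = run_and_monitor(program)                 # normal, speed, location runs
        program["normal_training_data"] = signal
        profile = profile_db.get(program["action_name"])  # steps 428-430
        if profile is not None:
            profile["signal_data"].append(signal)         # step 432: update profile
        elif not any(signals_match(signal, p["signal_data"])
                     for p in profile_db.values()):
            profile_db[program["action_name"]] = {"signal_data": [signal]}  # step 440
        else:
            correct(program, signal)                      # steps 442-448

def signals_match(candidate, stored, tol=0.1):
    # Crude illustrative similarity test against the latest stored recording.
    return bool(stored) and abs(sum(candidate) - sum(stored[-1])) < tol
```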
  • FIG. 5 is a flowchart illustrating an exemplary method of training correction.
  • the process begins with the correction module 132 receiving from the training module 130 the profile action name 202 from the profile database 122 where the training and profile signals matched at step 500 .
  • the correction module 132 compares the received profile action name 202 with the training database 128 to find the matching training action name. For example, if the training data for “wave hand” matches the profile data for “kick out leg”, the correction module 132 could look for the same training action name (“kick out leg”) in the training database 128 at step 502 .
  • the training program is selected from the training database 128 (e.g., “Kick leg out”) at step 504 .
  • the selected training program is sent to the execution module 142 on the training robot 136 .
  • the program could send the “kickoutleg.exe” training program for execution at step 506 .
  • the frequency response data from the Wi-Fi signal from the environment which the training robot 136 is placed is monitored while the training robot 136 executes the program at step 508 .
  • the correction module 132 stops monitoring the frequency response data once a message is received from the training robot 136 that the program is complete at step 510 .
  • the correction module 132 then stores the monitored signal data into the training database 128 with the program that was originally selected. In this example, the signal data is stored with the “kick out leg” program at step 512 .
  • the variation module 134 is then executed at the end of the training module 130 and before the training module 130 moves on to the next training program.
  • the variation module 134 re-runs the same test program as the training module 130 , monitors the data during the identical test, and then compares the frequency response data from training module 130 to the frequency response data collected while running the variation module 134 .
  • the correction module 132 then updates variations in the two data sets and updates the training database 128 at step 514 .
  • the number of the training program “n” is sent to the variation module regarding which training program to run at step 516 .
  • the stored training data is then sent back to the training module 130 to be used to update the profile database 122 at step 518 .
  • the correction module 132 ends and returns to the training module 130 at step 520 .
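  • A hypothetical sketch of this FIG. 5 correction flow, using the same illustrative data layout as the FIG. 4 sketch above:

```python
# Hypothetical correction: find the conflicting action's training
# program, re-run it, and refresh its stored signal data.
def correction_module(profile_action_name, training_db, run_and_monitor):
    program = next(p for p in training_db
                   if p["action_name"] == profile_action_name)  # step 502
    fresh_signal = run_and_monitor(program)                     # steps 506-510
    program["normal_training_data"] = fresh_signal              # step 512
    return fresh_signal                                         # sent back at step 518
```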
  • FIG. 6 is a flowchart illustrating an exemplary method of detecting training variation.
  • the process begins with execution of the variation module 134 to receive from the training module 130 the training program that it has just run. The variation module 134 then sends that training program to the training robot 136 at step 602 . The variation module 134 begins to monitor the impulse response of the channel while the training robot 136 is performing the training program at step 604 . The variation module 134 stops monitoring the signal when the command is received indicating that the training robot 136 has completed the normal run of the training program at step 606 .
  • the monitored data is then stored in the training database 128 as normal variation data at step 608 .
  • the variation module 134 then begins monitoring the impulse response of the channel again as the training robot 136 performs the training program at different speeds at step 610 .
  • the variation module 134 stops monitoring the signal when the command is received indicating that the training robot 136 has completed the speed sensitivity test of the training program at step 612 .
  • the monitored data is then stored in the training database 128 as speed variation data at step 614 .
  • the variation module 134 then begins monitoring the impulse response of the channel again as the training robot 136 performs the training program at different distances from the wireless access point 102 at step 616 .
  • the variation module 134 stops monitoring the signal when the command is received indicating that the training robot 136 has completed the location sensitivity test of the training program at step 618 .
  • the monitored data is then stored in the training database 128 as location variation data at step 620 .
  • the stored training data is then compared to the variation data of the same training program.
  • the normal training data is compared to the normal variation data, as are the speed and location sensitivity data. As the data is compared, the variation module 134 checks for major variations in the data.
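  • What counts as a “major variation” is not quantified in the patent; the following check, with an assumed 20% relative threshold, is one illustrative possibility:

```python
# Hypothetical major-variation check between paired recordings.
def has_major_variation(training_data, variation_data, threshold=0.2):
    pairs = list(zip(training_data, variation_data))
    if not pairs:
        return False
    rel = [abs(a - b) / max(abs(a), 1e-9) for a, b in pairs]
    return sum(rel) / len(rel) > threshold
```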
  • FIG. 7 is an illustration of an exemplary method of execution module 142 , according to various embodiments.
  • the process begins with execution of the training robot 136 to receive from the training module 130 a program or command.
  • the training robot 136 then executes the program or command from memory 138 .
  • the instructions for controlling the training robot 136 may be sent to the training robot 136 .
  • Some embodiments allow for the code that controls the training robot 136 to be stored in memory 138 and executed at step 702 .
  • upon completion of the program, a signal is sent back to the training module 130 regarding completion of the program at step 704 .
  • the execution module 142 then executes the speed sensitivity module 144 , which runs the training program at different speeds, first running the program at double or triple the defined normal speed and then at either two or three times slower than normal at step 706 .
  • a signal is sent back to the training module 130 regarding a normal completion of the program at step 708 .
  • the execution module 142 then executes the location sensitivity module 146 , which runs the training program at different distances from the wireless access point 102 at step 710 .
  • upon completion of the program, a signal may be sent back to the training module 130 regarding completion of the program at step 712 .
  • the program may end at step 714 .
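  • A hypothetical sketch of the FIG. 7 execution flow as it might run on the robot; the callables stand in for the stored program and the sensitivity modules:

```python
# Hypothetical on-robot execution flow (FIG. 7).
def execution_module(program, notify, speed_module, location_module):
    program()                                           # step 702: normal run
    notify("completed normal program")                  # step 704
    speed_module(program)                               # step 706: varied speeds
    notify("completed speed sensitivity program")       # step 708
    location_module(program)                            # step 710: varied distances
    notify("completed location sensitivity program")    # step 712
```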
  • FIG. 8 is a flowchart illustrating an exemplary method of speed sensitivity testing.
  • the process begins with the speed sensitivity module 144 , which may be initiated by the execution module 142 .
  • the speed sensitivity module 144 may start to re-run the training program that the execution module 142 had just run at step 800 .
  • the speed sensitivity module 144 then completes the training program at step 804 .
  • the training program is then re-run at step 806 .
  • the speed sensitivity module 144 then completes the training program at step 810 .
  • the program ends and returns to the execution module 142 at step 812 .
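  • A minimal sketch of the speed sensitivity re-runs; the factors of three mirror the “double or triple” and “two or three times slower” language above but remain illustrative:

```python
# Hypothetical speed sensitivity re-runs (steps 800-810): the same
# movement, once faster and once slower than the normal speed.
def speed_sensitivity_module(run_program_at):
    for speed_factor in (3.0, 1.0 / 3.0):   # e.g. triple speed, then 3x slower
        run_program_at(speed_factor)
```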
  • FIG. 9 is a flowchart illustrating an exemplary method of location sensitivity testing.
  • the process begins with the location sensitivity module 146 being initiated by the execution module 142 at step 900 .
  • the location sensitivity module 146 may then determine where the wireless access point 102 is located.
  • One aspect of the location sensitivity module 146 allows the training robot 136 to be moved either closer to or farther away from the access point 102 at step 902 .
  • After determining the direction and/or location of the wireless access point 102 , the training robot 136 is moved closer to the access point 102 .
  • the location sensitivity module 146 may monitor the signal strength of the wireless access point 102 as an indicator of how close the training robot 136 is to the access point 102 at step 904 .
  • the training program that was initially run by the execution module 142 is now re-run, but with the training robot 136 closer to the wireless access point 102 at step 906 .
  • the training program may run and be completed at a location near to the wireless access point 102 at step 908 .
  • the training robot 136 then moves a distance away from the wireless access point 102 .
  • training robot 136 may continue to move farther away from the wireless access point 102 while monitoring the signal strength at step 910 .
  • the training program is then re-run again but this time with the training robot 136 at a distance from the wireless access point 102 at step 912 .
  • the training program is then run and completed at a distance from the wireless access point 102 at step 914 .
  • the location sensitivity module 146 then ends and returns to the execution module 142 at step 916 .
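  • A hypothetical sketch of the FIG. 9 flow, using RSSI as the proximity indicator as described above; the dBm thresholds and movement callables are assumptions:

```python
# Hypothetical location sensitivity flow: approach the access point,
# run the program, retreat, and run it again (steps 902-914).
def location_sensitivity_module(program, read_rssi, step_toward_ap,
                                step_away_from_ap, near_rssi=-40, far_rssi=-75):
    while read_rssi() < near_rssi:    # step 904: approach until signal is strong
        step_toward_ap()
    program()                         # steps 906-908: run near the access point
    while read_rssi() > far_rssi:     # step 910: retreat until signal is weak
        step_away_from_ap()
    program()                         # steps 912-914: run far from the access point
```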
  • an IoT device may be used to improve accuracy of passive Wi-Fi based motion detection systems.
  • An IoT device database of FIG. 10 stores information related to each IoT device connected to the wireless access point 102 . That information includes, but is not limited to, the location information, the function of the device, and the activity associated with operating the device, all of which is provided either by the IoT device or by a user.
  • the IoT device database of FIG. 10 could also include sensor data feed columns for devices with additional capabilities, such as a virtual assistant, that could provide additional context data to further refine the profile associated with the activity.
  • the cloud server 120 may monitor the activity of the wireless access point 102 via the agent 114 , as well as the activities of the IoT devices connected to the wireless access point 102 , in order to trigger the installation module when new IoT devices are connected, and the location and activity modules when data events are detected simultaneously in both the activity identification module 118 and an IoT device.
  • An installation module may connect new IoT devices to the system and register in the IoT device database of FIG. 10 information related to the location of the device, the function of the device, and the activity associated with the function of the IoT device. These definitions are provided either by the IoT device (e.g., if the IoT device has that level of sensor and computational capabilities) or through definition by a user through a training interface.
  • a location module may compare the location associated with the impulse or frequency response of the channel (e.g., in the profile database 122 identified by the activity identification module 118 ) to the location associated with the data event from the IoT device. When the two locations do not match, the data provided by the IoT device is sent to the profile module 126 to be used to improve the profile definitions in the profile database 122 .
  • An activity module may compare the activity associated with the impulse or frequency response of the channel in the profile database 122 identified by the activity identification module 118 to the activity associated with the data event from the IoT device. When the two activities do not match, the data provided by the IoT device is sent to the profile module 126 to be used to improve the profile definitions in the profile database 122 .
  • The system may include at least one IoT device or a group of up to n IoT devices.
  • Consumer connected devices (including smart TVs, smart speakers, toys, wearables, and smart appliances) as well as industrial and enterprise IoT devices (such as smart meters, commercial security systems, and smart city technologies used to monitor traffic and weather conditions) could all be incorporated into this system in various embodiments.
  • FIG. 10 illustrates an exemplary IoT device database.
  • the IoT device database of FIG. 10 contains information related to each IoT device connected to the system. That information includes, but is not limited to, the location information, the function of the device, and the activity associated with operating the device, all of which is provided either by the IoT device or by a user.
  • the IoT device database of FIG. 10 could also include sensor data feed columns for devices with additional capabilities, such as a virtual assistant, that could provide additional context data to further refine the profile associated with the activity.
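  • For concreteness, a hypothetical SQLite sketch of the IoT device database of FIG. 10 ; the column names are assumptions based on the fields described above:

```python
# Illustrative schema only; the patent does not disclose a schema.
import sqlite3

iot_db = sqlite3.connect(":memory:")
iot_db.execute("""
    CREATE TABLE iot_device (
        device_id   TEXT PRIMARY KEY,
        make_model  TEXT,
        location    TEXT,   -- e.g. "bathroom"
        function    TEXT,   -- e.g. "operates bathroom lights"
        activity    TEXT,   -- e.g. "activating the light switch"
        sensor_feed TEXT    -- optional extra context (e.g., virtual assistant)
    )
""")
```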
  • FIG. 11 is a flowchart illustrating an exemplary method of IoT device management for Wi-Fi motion detection. Such method may be performed by cloud server 120 .
  • the process begins with data being received via the agent 114 from the wireless access point 102 at step 1100 .
  • the IoT device database of FIG. 10 may be queried for the presence of a new IoT device being added to the home network at step 1102 . It may be determined if a new IoT device is present at step 1104 . If a new IoT device is detected in the IoT device database of FIG. 10 , the installation module may be launched at step 1106 .
  • the activity identification module 118 may be polled for new motion data at step 1108 . It may be determined if the activity identification module 118 has identified a motion at step 1110 . If there is motion identified, the IoT devices in the IoT device database of FIG. 10 may be polled for new data event(s) at step 1112 . It may be determined if any IoT device data event coincides with the motion data at step 1114 . If there is both motion and IoT data, a call to the location module may determine if the location identified by the activity identification module 118 is in agreement with the location data from the IoT device at step 1116 .
  • a call to the activity module may determine if the activity identified by the activity identification module 118 is in agreement with the activity data from the IoT device at step 1118 . It may be determined if data is still being received from the wireless access point 102 , via the agent 114 at step 1120 . If data is still coming in from the agent 114 , the method may return to step 1102 . If the data is no longer coming from the agent 114 , the method may go to step 1122 .
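  • The following Python sketch condenses the FIG. 11 loop; the objects passed in (agent, iot_db, activity_id) and their methods are hypothetical:

```python
# Hypothetical IoT device management loop (FIG. 11).
def manage_iot(agent, iot_db, activity_id, install, location_mod, activity_mod):
    while agent.has_data():                      # steps 1100, 1120
        for device in iot_db.new_devices():      # steps 1102-1104
            install(device)                      # step 1106
        motion = activity_id.poll()              # steps 1108-1110
        if motion:
            event = iot_db.poll_events()         # steps 1112-1114
            if event:
                location_mod(motion, event)      # step 1116
                activity_mod(motion, event)      # step 1118
```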
  • FIG. 12 is a flowchart illustrating an exemplary method of IoT device installation. Such method may be performed based on execution of an installation module.
  • the process begins with receipt of a prompt from the cloud server 120 at step 1200 .
  • the new IoT device added to the IoT device database of FIG. 10 may be characterized by such parameters as make, model, capabilities, etc., at step 1202 . It may be determined if the IoT device provides data about the associated location; if so, the data provided may be written to the IoT device database of FIG. 10 at step 1204 . If the device does not provide data about location, the user device may prompt a user to input the location of the new device (e.g., identifying that a specified light switch is in the bathroom) at step 1206 .
  • the user-provided location of the new device may be written to the IoT device database of FIG. 10 at step 1208 . It may be determined if the device provides data about an associated function, and the data provided may be written to the IoT device database of FIG. 10 at step 1210 . If the device does not provide data about an associated function, the user may be prompted for input regarding the function of the new device (e.g., identifying that a specified light switch in the bathroom operates the bathroom lights) at step 1212 . The user-provided function of the new device may be written to the IoT device database of FIG. 10 at step 1214 . A prompt may be provided to obtain user input defining the activity, such as activating the light switch.
  • the process for defining this activity in this embodiment may allow the user to provide a natural language definition of the activity.
  • the user could perform the activity in response to a prompt or query from the system, allowing the activity identification module 118 to take a sample of the impulse or frequency response of the channel to the activity being defined as linked to the IoT device at step 1216 .
  • the user-provided activity of the new device may be written to the IoT device database of FIG. 10 at step 1218 .
  • the method may return to the cloud server 120 at step 1220 .
  • FIG. 13 is a flowchart illustrating an exemplary method of IoT location tracking.
  • the method may be performed based on execution of a location module.
  • the process begins with receiving a prompt from the cloud server 120 that there is IoT data that coincides with motion data from the activity identification module 118 at step 1300 .
  • the location of the IoT device from the IoT device database of FIG. 10 may be identified at step 1302 .
  • it may be determined if the location identified by the activity identification module 118 matches the location of the IoT device; if the locations match, the method may return to the cloud server 120 at step 1304 . If the location identified by the activity identification module 118 does not match the location of the IoT device, the location of the IoT device is sent to the profile module 126 to update the profile database 122 . In the future, when the activity identification module 118 sees the current profile in the impulse or frequency response of the channel, the activity identification module 118 may recognize such activity as taking place in the location defined by the IoT device at step 1306 . The method may further return to the cloud server 120 at step 1308 .
  • FIG. 14 is a flowchart illustrating an exemplary method of IoT activity tracking. Such method may be performed based on execution of an activity module. The process begins with receiving a prompt from the cloud server 120 that there is IoT data that coincides with motion data from the activity identification module 118 at step 1400 .
  • the activity associated with the IoT device from the IoT device database of FIG. 10 may be identified at step 1402 . It may be determined if the activity associated with the IoT device matches the activity associated with the profile of the frequency response of the channel identified by the activity identification module 118 . If the activity matches, the method may return to the cloud server 120 at step 1404 . If the activity identified by the activity identification module 118 does not match the activity associated with the IoT device, the activity associated with the IoT device is sent to the profile module 126 to update the profile database 122 . In the future, when the activity identification module 118 sees the current profile in the impulse or frequency response of the channel, the activity identification module 118 may recognize such data as corresponding to the activity defined by the IoT device at step 1406 . The method may return to the cloud server 120 at step 1408 .
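  • Both the FIG. 13 location check and the FIG. 14 activity check reduce to the same agreement test; a hypothetical sketch:

```python
# Hypothetical reconciliation: on a mismatch, forward the IoT-provided
# value to the profile module so the profile database can be refined.
def reconcile(identified_value, iot_value, send_to_profile_module):
    if identified_value != iot_value:        # steps 1306 / 1406
        send_to_profile_module(iot_value)
        return iot_value
    return identified_value                  # steps 1304 / 1404
```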
  • Non-transitory computer-readable storage media refer to any medium or media that participate in providing instructions to a central processing unit (CPU) for execution. Such media can take many forms, including, but not limited to, non-volatile and volatile media such as optical or magnetic disks and dynamic memory, respectively. Common forms of non-transitory computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM disk, digital video disk (DVD), any other optical medium, RAM, PROM, EPROM, a FLASHEPROM, and any other memory chip or cartridge.
  • a bus carries the data to system RAM, from which a CPU retrieves and executes the instructions.
  • the instructions received by system RAM can optionally be stored on a fixed disk either before or after execution by a CPU.
  • Various forms of storage may likewise be implemented as well as the necessary network interfaces and network topologies to implement the same.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computational Linguistics (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

Systems and methods of using a robot to train a system for motion detection are provided. A simple robot can be put in a room and programmed not to move except in accordance with specific programmed commands. Such commands may be sent to the robot to move at a certain rate, so that the resulting change can be seen in the channel response. A data set may be built over time, where the robot is programmed to move at specific times, for specific durations, and by specific amounts. Such robot motion may also be iterated. The algorithm records the impulse response changes associated with the robot's movements, and a database may be built based on such recorded and associated changes.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application is a continuation and claims the priority benefit of U.S. patent application Ser. No. 16/794,668 filed Feb. 19, 2020, now U.S. Pat. No. 11,586,952, which claims the priority benefit of U.S. provisional patent application No. 62/809,356 filed Feb. 22, 2019, and of U.S. provisional patent application No. 62/809,393 filed Feb. 22, 2019, the disclosures of which are incorporated herein by reference in their entirety.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present disclosure is generally related to using a robot and machine learning to train a motion detection system, and specifically to using a robot to perform human motions repeatedly to teach a Wi-Fi motion detection system to identify different human activities within a target environment.
  • 2. Description of the Related Art
  • Motion detection is the process of detecting a change in the position of an object relative to associated surroundings or a change in the surroundings relative to an object. Motion detection is usually performed by a software-based monitoring algorithm that signals when motion is detected. For example, when motion is detected, a surveillance camera may be signaled to begin capturing the event. An advanced motion detection surveillance system can analyze the type of motion to see if an alarm is warranted.
  • Wi-Fi location determination, also known as Wi-Fi localization or Wi-Fi location estimation, refers to methods of translating observed Wi-Fi signal strengths into locations. A “radio map”, consisting of sets of metadata containing information about the frequency response of the channel, and/or phase response of the channel, and/or impulse response of the channel, and/or Received Signal Strength Indicators (RSSI), and/or any other statistic that describes the wireless communication link between paired devices, is stored as a profile to be compared later to a signal scan to recognize the location of the device doing the scanning.
  • There is therefore a need in the art for improved systems and methods of robotic matrix creation.
  • SUMMARY OF THE CLAIMED INVENTION
  • Embodiments of the present invention provide for training a motion detection system in which a robot can replicate movements repeatedly in exactly the same manner to provide a better model of frequency response data for motion detection. A system for training a motion detection system is disclosed that includes a training robot, a training module, a correction module, a variation module, a training database, a profile database, an execution module, a speed sensitivity module, and a location sensitivity module. The training module sends pre-programmed movements to the training robot to perform, then monitors and stores the impulse response data generated while the training robot moves. The new training data is updated in the profile database.
  • Methods are provided for utilizing the context data from internet of things (IoT) devices to augment the data provided by the agent, so that a larger library of profiles can be matched to known activities and locations of those activities. The provided system improves a Wi-Fi motion detection system by using IoT device data. The system includes a Wi-Fi motion detection system, an IoT device database, a cloud server, and an IoT device. The system collects location and activity data and associates such data with data from the IoT devices.
  • BRIEF DESCRIPTIONS OF THE DRAWINGS
  • FIG. 1 illustrates an exemplary network environment in which a system for robotic training regarding Wi-Fi motion detection may be implemented.
  • FIG. 2 illustrates an exemplary profile database.
  • FIG. 3 illustrates an exemplary training database.
  • FIG. 4 is a flowchart illustrating an exemplary method of robotic training.
  • FIG. 5 is a flowchart illustrating an exemplary of training correction.
  • FIG. 6 is a flowchart illustrating an exemplary of detecting training variation.
  • FIG. 7 is an illustration of an exemplary method of execution module, according to various embodiments.
  • FIG. 8 is a flowchart illustrating an exemplary of speed sensitivity.
  • FIG. 9 is a flowchart illustrating an exemplary of location sensitivity.
  • FIG. 10 illustrates an exemplary IoT device database.
  • FIG. 11 is a flowchart illustrating an exemplary method of IoT device management for Wi-Fi motion detection.
  • FIG. 12 is a flowchart illustrating an exemplary method of IoT device installation.
  • FIG. 13 is a flowchart illustrating an exemplary method of IoT location tracking.
  • FIG. 14 is a flowchart illustrating an exemplary method of IoT activity tracking.
  • DETAILED DESCRIPTION
  • Systems and methods of using a robot to train the system for motion detection are provided. A simple robot can be put in a room and programmed not to move except in accordance with specific programmed command. Such commands may be sent to the robot regarding movement(s) at a certain rate than could be seen in the response to a channel. A data set may be built over time, where the robot may be programmed to move such that the robot does change at specific times in duration and amount. Such robot motion may also be iterated. The algorithm records the impulse response changes associated with the robot changes and a database may be built based on such recorded and associated changes.
  • FIG. 1 illustrates an exemplary network environment in which a system for robotic training regarding Wi-Fi motion detection may be implemented. Such network environment may include a wireless access point 102, which may be a Wi-Fi access point. In an embodiment, the wireless access point 102 is an 802.11n access point. The wireless transceiver of the wireless access point 102 is in communication with the further stationary device over a corresponding further one of the at least one radio frequency communication link. The wireless access point 102 is configured to record a further channel state information data set for the further one of the at least one radio frequency communication link at a corresponding time. In an embodiment, determining the activity of the person in the environment includes determining the activity of the person in the environment based on a comparison of the further channel state information data set to each of the at least one channel state information profile of each of the plurality of activity profiles. In an embodiment, the activity is determined based on a sum of a similarity measurement of the channel state information data set and a similarity measurement of the further channel state information data set.
  • A central processing unit (CPU) 104 is the electronic circuitry within a computer that carries out the instructions of a computer program by performing the basic arithmetic, logic, controlling and input/output (I/O) operations specified by the instructions. A graphics processing unit (GPU) 106 is a specialized electronic circuit designed to rapidly manipulate and alter memory to accelerate the creation of images in a frame buffer intended for output to a display device. GPUs 106 are used in embedded systems, mobile phones, personal computers, workstations, and game consoles. Modern GPUs 106 are very efficient at manipulating computer graphics and image processing. The highly parallel structure makes such processors more efficient than general-purpose CPUs 104 for algorithms that process large blocks of data in parallel.
  • A digital signal processor (DSP) 108 is a specialized microprocessor (or a SIP block), with associated architecture optimized for the operational needs of digital signal processing. The DSP 108 may measure, filter or compress continuous real-world analog signals.
  • An application program interface (API) 110 is a set of routines, protocols, and tools for building software applications. Basically, an API 110 specifies how software components could interact. Additionally, APIs 110 are used when programming graphical user interface (GUI) components. The API 110 may provide access to the channel state data to the agent 114. A wireless access point 102 compliant with either 802.11b or 802.11g, using an omnidirectional antenna, might have a range of 100 m (0.062 mi). The same radio 112 with an external semi-parabolic antenna (15 dB gain) and a similarly equipped receiver at the far end might have a range of over 20 miles.
  • An agent 114 may be a separate device or an integrated module executable to collect data from the Wi-Fi chipset of wireless access point 102, filter the incoming data, and pass such data to the cloud server 120 for activity identification. Depending on the configuration, the activity identification can be done on the edge, at the level of the agent 114, in the cloud server 120, or some combination of the two.
  • A local profile database 116 is utilized when at least a portion of the activity identification is done on the edge. The activity identification could be a simple motion/no-motion determination profile, or a plurality of profiles for identifying activities, objects, individuals, biometrics, etc.
  • An activity identification module 118 distinguishes between walking activities and in-place activities. In general, a walking activity causes significant pattern changes of the impulse response amplitude over time, since walking generally involves significant body movements and location changes. In contrast, an in-place activity (such as watching TV from a sofa) only involves relative smaller body movements and may not cause significant amplitude changes, instead presenting certain repetitive patterns within the impulse response measurements.
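  • A minimal sketch of that walking versus in-place distinction, assuming the amplitude of the impulse response has already been extracted into a time series (the threshold value is an illustrative assumption, not specified by the embodiments):

```python
import statistics

def classify_movement(amplitude_over_time, walking_threshold=0.5):
    """Walking produces large amplitude pattern changes over time; in-place
    activity produces smaller, repetitive variations."""
    spread = statistics.pstdev(amplitude_over_time)
    return "walking" if spread > walking_threshold else "in-place"
```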
  • A cloud server 120 may analyze and create profiles describing various activities. As illustrated, the cloud server 120 may include a profile database 122, device database 124, profile module 126, training database 128, and training module 130 (which may further include correction module 132 and variation module 134).
  • The profile module 126 monitors the data set resulting from continuously monitoring a target environment so as to identify multiple similar instances of an activity without a matching profile in such a data set, to combine that data with user feedback, to label the resulting clusters, and to define new profiles that are then added to the profile database 122.
  • A profile database 122 may be utilized when at least a portion of the activity identification is done in the cloud server 120 or when a profile is sent to cloud server 120 for storage. The activity identification could be a simple motion/no-motion determination profile, or a plurality of profiles for identifying activities, objects, individuals, biometrics, etc. A device database 124 may store the device ID of all connected wireless access points 102. In some embodiments, one or more of the devices may be IoT devices (e.g., the IoT device database of FIG. 10).
  • A training database 128 may store programming for a training robot 136, and the stored programming may include executable commands each representing different human actions. These stored programs or executable commands are sent to a training robot 136 to make the training robot 136 perform certain actions. For example, the training database 128 may store programming or executable commands for a training robot 136 to "walk slowly" or "jump in one place". The training database 128 could store the actual code or programming required to control the movements of the training robot 136, thereby limiting the need for programs or code to be stored on the training robot 136. Alternatively, the training database 128 may store just a command to execute a program stored on the memory of the training robot 136. Furthermore, the training database 128 also stores the signal data that is monitored during the test, including the speed sensitivity data and location sensitivity data.
  • A training module 130 may select an action from the training database 128 and send the selected programming to the training robot 136. The training robot 136 may then perform the programmed action. The training module 130 then monitors the signal of the target environment of the training robot 136. The monitored signal data is then compared to the profile database 122 to see if there is similar signal data stored in the profile database 122, and the training module 130 determines whether the action by the training robot 136 matches the action associated with data in the profile database 122 based on the comparison. If the monitored data and action do not match, the correction module 132 is initiated and the new data and action are added to the profile database 122. For example, the training module 130 selects the first program from the training database 128, which makes the training robot 136 "walk" at a predetermined standard pace. As the training robot 136 is performing the programmed motions, the signal data is monitored and then compared to the profile database 122 to see if there is a matching "walking" action. If there is no "walking" action in the profile database 122, a "walking" action may be added. If there is a match, the new data is added to the profile database 122 to update the "walking" profile with the new signal data.
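  • The loop just described might be sketched as follows, with execute_on_robot and monitor_signal as hypothetical callables standing in for the robot link and the signal monitoring path (neither name comes from the embodiments):

```python
def run_training(training_db, profile_db, execute_on_robot, monitor_signal):
    """For each stored program: run it on the robot, monitor the signal,
    then update or create the matching profile."""
    for entry in training_db:                  # e.g. {"action": "walk", "program": "walk.exe"}
        execute_on_robot(entry["program"])     # robot performs the programmed motions
        signal = monitor_signal()              # monitored only during execution
        if entry["action"] in profile_db:
            profile_db[entry["action"]].append(signal)  # update the existing profile
        else:
            profile_db[entry["action"]] = [signal]      # add the new action
```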
  • A correction module 132 is initiated when the action performed by the training robot 136 has no matching action in the profile database 122 but the captured signal data matches signal data already stored there. For example, a training robot 136 performs a predefined walking function, and that signal data is captured, but there is no matching "walking" action in the profile database 122 while the new signal data matches the action "jumping" in the profile database 122. To prevent duplicate signals, the "walking" action and associated signal data are added to the profile database 122, and the correction module 132 performs, captures, and updates the signal data for a "jumping" action.
  • A variation module 134 is executed by the training module 130 and re-runs the exact same training program. The variation module 134 evaluates the difference between the first training run by the training module 130 and the second identical training run by the variation module 134 to identify and train the system on any possible variations in the frequency response data.
  • A training robot 136 is designed to mimic the human body and human movements. The training robot 136 receives programmed commands from the training module 130. The program tells the training robot 136 to make a series of movements that mimic human motions. Unlike a human, the training robot 136 can perform a task repeatedly in the exact same way to produce a better data set. Alternatively, the training robot 136 could be programmed to make slight alterations to robotic movement to create a more human type of movement. A memory 138 stores software and program code that controls the training robot 136. The memory 138 may store all of the programming required for controlling the training robot 136 or can store programming sent from a remote user or the training module 130. An antenna 140 receives commands and data from a remote user or directly from the training module 130.
  • An execution module 142 receives the program from the training module 130 or correction module 132 and executes the received program. The execution module 142 is stored in the memory 138 of the training robot 136. The execution module 142 further sends a signal to the training module 130 or correction module 132 upon completion of program execution. A speed sensitivity module 144 is called by the execution module 142 to vary the speed of the training robot 136 (e.g., moving from slow to fast) to determine when the monitored Wi-Fi signal changes from a predefined speed of movement. As such, the system can test and eliminate speed sensitivity.
  • Furthermore, the speed sensitivity module 144 can be stored and executed either from the training robot 136 or from the cloud server 120. A location sensitivity module 146 may be called by the execution module 142, and the location of the training robot 136 may be varied as the programmed movement is performed. Specifically, the location sensitivity module 146 could move the training robot 136 away from and towards the wireless access point 102. Such movement may allow for determination of when the monitored signal changes. As such, the system can test and eliminate location sensitivity. Furthermore, the location sensitivity module 146 can be stored and executed either from the training robot 136 or from the cloud server 120.
  • FIG. 2 illustrates an exemplary profile database (e.g., profile database 122). One skilled in the art may appreciate that for this and other processes and methods disclosed herein, the functions performed in the processes and methods may be implemented in differing order. Furthermore, the outlined steps and operations are only provided as examples, and some of the steps and operations may be optional, combined into fewer steps and operations, or expanded into additional steps and operations without detracting from the essence of the disclosed embodiments.
  • The profile database 122 is utilized when at least a portion of the activity identification is done in the cloud server 120. This could be a simple motion/no-motion determination profile, or a plurality of profiles for identifying activities, objects, individuals, biometrics, etc. For example, the profile database 122 may store a profile program number 200 and a profile action name 202 that describes the activity or action that is represented by the frequency response data. Additionally, the profile database 122 may store the frequency response data that is associated with the action as the profile signal data 204. Profile database 122 may further store speed sensitivity profile data and location sensitivity profile data.
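  • One illustrative shape for a profile database row, sketched as a Python dataclass (the field names follow the description above and are not a mandated schema):

```python
from dataclasses import dataclass, field

@dataclass
class ProfileRecord:
    profile_program_number: int   # profile program number 200
    profile_action_name: str      # profile action name 202, e.g. "wave hand"
    profile_signal_data: list     # profile signal data 204 (frequency response)
    speed_sensitivity_data: list = field(default_factory=list)
    location_sensitivity_data: list = field(default_factory=list)
```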
  • FIG. 3 illustrates an exemplary training database (e.g., training database 128). The training database 128 is utilized to store programming for a training robot 136 and the monitored frequency response data. The training database 128 stores software code or an execution command that represents different actions that can be sent to the training robot 136. These actions represent human movements. The training database 128 further stores the name of the action or movement performed (e.g., wave hand, jump, kick leg out, etc.), the execution command or program code, and the frequency response data for three different tests: a normal test, a speed sensitivity test, and a location sensitivity test. As illustrated, training database 128 may store data regarding training program number, training action name, training program or execution code, normal training data, speed sensitivity training data, location sensitivity training data, normal variation data, speed variation data, and location variation data.
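  • The corresponding training database row could be sketched in the same illustrative fashion, one field per column named above:

```python
from dataclasses import dataclass, field

@dataclass
class TrainingRecord:
    training_program_number: int
    training_action_name: str   # e.g. "wave hand", "jump", "kick leg out"
    training_program: str       # execution command or program code, e.g. "wavehand.exe"
    normal_training_data: list = field(default_factory=list)
    speed_sensitivity_training_data: list = field(default_factory=list)
    location_sensitivity_training_data: list = field(default_factory=list)
    normal_variation_data: list = field(default_factory=list)
    speed_variation_data: list = field(default_factory=list)
    location_variation_data: list = field(default_factory=list)
```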
  • FIG. 4 is a flowchart illustrating an exemplary method of robotic training. The process begins with initiation of the training module 130 (e.g., based on user input received via a user interface of a user device) in step 400. The user interface could be associated with a remote terminal or a mobile device. Once the training module 130 is initiated, a training program is selected from the training database 128. If this is the first time or part of the initial run of the program, “n” may equal 1 (n=1) where n represents the training program number in the training database 128.
  • The training program in the training database 128 is selected at step 402. The selected training program is sent to the execution module 142 on the training robot 136. For example, if this is the first time the program is running and the first program in the training database 128 is selected, such selection may be "wavehand.exe" at step 404. The training module 130 then begins monitoring the frequency response signal data at the target environment where the training robot 136 is located.
  • The frequency response data is only monitored while the training robot 136 performs the training program at step 406. While monitoring frequency response data, the training module 130 waits for a command from the training robot 136 that a "normal" program has been completed. The command tells the training module 130 to stop monitoring the frequency response data. For example, the training program for mimicking waving a hand (e.g., "wave hand") is sent to the training robot 136. The training module 130 could then immediately begin to monitor the frequency response data until the "completed normal program" command is received, as in the polling sketch below.
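  • That monitor-until-complete behavior might be sketched as a simple polling loop (read_frequency_response and poll_robot_message are assumed helpers, not named by the embodiments):

```python
def monitor_until_complete(read_frequency_response, poll_robot_message,
                           done="completed normal program"):
    """Collect frequency response samples until the robot reports that the
    training program has completed."""
    samples = []
    while poll_robot_message() != done:
        samples.append(read_frequency_response())
    return samples
```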
  • The monitored data could now represent the motion of waving a hand at step 408. The training module 130 then stores the monitored frequency response data into the training database 128 with the program that was originally selected in step 410. For example, the frequency response data that was monitored from the time the training program was executed to receiving the "completed normal program" command from the training robot 136 may be stored in the training database 128 with the "wave hand" training program. The "wave hand" training program could now have frequency response data stored in the training database 128 under the normal training signal data at step 410. The training module 130 then begins monitoring the frequency response signal data again at the target environment where the training robot 136 is located. The frequency response data may only be monitored while the training robot 136 performs the training program that was sent.
  • One aspect of monitoring the signal data again is associated with the training robot 136 performing the program but at different speeds at step 412. While monitoring frequency response data, the training module 130 waits for a command from the training robot 136 that a “speed sensitivity” program has been completed.
  • The command tells the training module 130 to stop monitoring the frequency response data at step 414. The training module 130 then stores the monitored frequency response data into the training database 128 with the program that was originally selected. For example, the frequency response data that was monitored between the time the training program was executed and the time that the "completed speed sensitivity program" command was received from the training robot 136 is stored in the training database 128 with the "wave hand" training program.
  • The “wave hand” training program could now have frequency response data stored in the training database 128 under the speed sensitivity training data at step 416. The training module 130 may then begin monitoring the frequency response signal data again at the target environment where the training robot 136 is located. The frequency response data is only monitored while the training robot 136 performs the training program that was sent.
  • One aspect of monitoring the signal data includes the training robot 136 performing the program but at different distances from the access point 102 at step 418. While monitoring frequency response data, the training module 130 waits for a command from the training robot 136 that a "location sensitivity" program has been completed. The command tells the training module 130 to stop monitoring the frequency response data at step 420. The training module 130 then stores the monitored frequency response data into the training database 128 with the program that was originally selected. For example, the frequency response data that was monitored between the time the training program was executed and the time the "completed location sensitivity program" command was received from the training robot 136 is stored in the training database 128 with the "wave hand" training program.
  • The "wave hand" training program could now have frequency response data stored in the training database 128 under the location sensitivity training data at step 422. The variation module 134 is then executed. The variation module 134 is executed at the end of the training module 130 and before the training module 130 moves on to the next training program. The variation module 134 re-runs the same test program as the training module 130, monitors the data during the identical test, and then compares the frequency response data from the training module 130 to the frequency response data collected while running the variation module 134.
  • The training module 130 then identifies variations between the two data sets and updates the training database 128 at step 424. The number of the training program "n" is sent to the variation module 134 to indicate which training program to run at step 426. The training module 130 then compares the training action name from the training database 128 with the profile action names 202 in the profile database 122. The action names are compared to see if there is already a similar action in the profile database 122. For example, the training module 130 could compare the action name "wave hand", which was the training program that was executed, to see if there was a matching action in the profile database 122 at step 428.
  • The training module 130 then checks to see if the selected training action name matches any profile action names (e.g., "wave hand") at step 430. If the training action name and profile action name 202 match, the training module 130 updates the profile signal data 204 with the training signal data in the profile database 122. For example, if there is already a "wave hand" action in the profile database 122, the new signal data collected during the training program is added to the "wave hand" profile in the profile database 122 at step 432. If there is no match of the training action name and the profile action names, the training module 130 then compares the stored training signal data with the profile signal data 204 to see if there is similar signal data with a different action name in the profile database 122. One consideration is to make sure that there are not any duplicates of the signal data in the profile database 122 for the same action even if the action names do not match.
  • For example, if there is no "wave hand" in the profile database 122, then the signal data that was monitored during the training program is compared to all of the signal data in the profile database 122 at step 434. If the training signal data does not match any of the profile signal data 204 and the action names do not match, a new action name is added to the profile database 122 using the data from the training database 128 at step 436. Once the profile database 122 is updated or new data is added, the training module 130 selects the next program (n+1) in the training database 128 until there are no more programs to run from the training database 128 at step 438. If there is no match of the profile action names 202 but there is a signal match as described at step 418, the system adds the data from the training database 128 to the profile database 122 to create a new action (e.g., wave hand) at step 440.
  • The training module 130 then selects the profile action name 202 where the training signal data matched the profile signal data 204 at step 442. The profile action name 202 that was selected is then sent to the correction module 132, and the correction module 132 is executed at step 444. The correction module 132 runs and corrects signal data in the profile database 122 that matched training signal data. For example, if the "wave hand" training signal data matches profile signal data 204 for "kick out leg", then the correction module 132 could find the corresponding training program in the training database 128, run the training program for "kick out leg" with the training robot 136, monitor the new signal, and update "kick out leg" with the new signal data. One aspect is to eliminate any duplicate signal data at step 446. Once the correction module 132 completes a task, the correction module 132 sends the corrected signal and action data back, and such data is updated in the profile database 122 at step 448.
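  • The name-match and signal-match logic of steps 428 through 448 could be sketched as follows; similarity is any measurement over frequency response data (such as the cosine similarity shown earlier), and the 0.9 threshold is an illustrative assumption:

```python
def reconcile_training_result(action, signal, profile_db, similarity, threshold=0.9):
    """Returns None when no correction is needed, or the colliding profile
    action name that should be handed to the correction module."""
    if action in profile_db:                          # step 430: action names match
        profile_db[action].append(signal)             # step 432: update the profile
        return None
    for other_action, samples in profile_db.items():  # step 434: compare signal data
        if any(similarity(signal, s) >= threshold for s in samples):
            profile_db[action] = [signal]             # step 440: add the new action
            return other_action                       # steps 442-444: trigger correction
    profile_db[action] = [signal]                     # step 436: entirely new action
    return None
```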
  • FIG. 5 is a flowchart illustrating an exemplary method of training correction. The process begins with the correction module 132 receiving from the training module 130 the profile action name 202 from the profile database 122 where the training and profile signals matched at step 500. The correction module 132 compares the received profile action name 202 with the training database 128 to find the matching training action name. For example, if the training data for "wave hand" matches the profile data for "kick out leg", the correction module 132 could look for the same training action name ("kick out leg") in the training database 128 at step 502. The training program is selected from the training database 128 (e.g., "kick leg out") at step 504.
  • The selected training program is sent to the execution module 142 on the training robot 136. For example, the program could send the "kickoutleg.exe" training program for execution at step 506. The frequency response data from the Wi-Fi signal in the environment in which the training robot 136 is placed is monitored while the training robot 136 executes the program at step 508. The correction module 132 stops monitoring the frequency response data once a message is received from the training robot 136 that the program is complete at step 510. The correction module 132 then stores the monitored signal data into the training database 128 with the program that was originally selected. In this example, the signal data is stored with the "kick out leg" program at step 512. The variation module 134 is then executed at the end of the training module 130 and before the training module 130 moves on to the next training program. The variation module 134 re-runs the same test program as the training module 130, monitors the data during the identical test, and then compares the frequency response data from the training module 130 to the frequency response data collected while running the variation module 134.
  • The correction module 132 then updates variations in the two data sets and updates the training database 128 at step 514. The number of the training program "n" is sent to the variation module 134 to indicate which training program to run at step 516. The stored training data is then sent back to the training module 130 to be used to update the profile database 122 at step 518. The correction module 132 ends and returns to the training module 130 at step 520.
  • FIG. 6 is a flowchart illustrating an exemplary method of detecting training variation. In step 600, the process begins with execution of the variation module 134 to receive from the training module 130 the training program that it has just run. The variation module 134 then sends that training program to the training robot 136 at step 602. The variation module 134 begins to monitor the impulse response of the channel while the training robot 136 is performing the training program at step 604. The variation module 134 stops monitoring the signal when the command is received indicating that the training robot 136 has completed the normal run of the training program at step 606.
  • The monitored data is then stored in the training database 128 as normal variation data at step 608. The variation module 134 then begins monitoring the impulse response of the channel again as the training robot 136 performs the training program at different speeds at step 610. The variation module 134 stops monitoring the signal when the command is received indicating that the training robot 136 has completed the speed sensitivity test of the training program at step 612. The monitored data is then stored in the training database 128 as speed variation data at step 614. The variation module 134 then begins monitoring the impulse response of the channel again as the training robot 136 performs the training program at different distances from the wireless access point 102 at step 616.
  • The variation module 134 stops monitoring the signal when the command is received indicating that the training robot 136 has completed the location sensitivity test of the training program at step 618. The monitored data is then stored in the training database 128 as location variation data at step 620. The stored training data is then compared to the variation data of the same training program. In step 622, the normal training data is compared to the normal variation data, as well as the speed and location sensitivity data. As the data is compared, the variation module 134 checks for major variations in the data.
  • Small changes in amplitude on the same slope may be ignored, but fundamental changes in the shape of the H matrix could be considered an anomaly and removed from the signal data at step 624. Any variations or anomalies found in the data may be removed at step 626. The "variation free" data is then stored back in the training database 128 in the respective training columns at step 628. Once the data is stored in the training database 128, the variation module 134 ends execution and returns to the training module 130 at step 630.
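  • One way to realize that rule is to compare runs with a measure that tolerates amplitude offsets but penalizes shape changes, such as Pearson correlation; the 0.8 cutoff below is an assumption for illustration only:

```python
def shape_similarity(a, b):
    """Pearson correlation: small amplitude changes on the same slope score
    high, while fundamental shape changes score low."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var = (sum((x - mean_a) ** 2 for x in a)
           * sum((y - mean_b) ** 2 for y in b)) ** 0.5
    return cov / var if var else 0.0

def remove_anomalies(training_runs, variation_runs, cutoff=0.8):
    """Keep only runs whose shape agrees across the two identical passes;
    disagreements are treated as anomalies and dropped."""
    return [first for first, second in zip(training_runs, variation_runs)
            if shape_similarity(first, second) >= cutoff]
```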
  • FIG. 7 is an illustration of an exemplary method performed by the execution module 142, according to various embodiments. In step 700, the process begins with the training robot 136 receiving a program or command from the training module 130. The training robot 136 then executes the program or command from memory 138. In some cases, the instructions for controlling the training robot 136 may be sent to the training robot 136. Some embodiments allow for the code that controls the training robot 136 to be stored in memory 138 and executed at step 702. Once the training robot 136 has completed the program, a signal is sent back to the training module 130 regarding completion of the program at step 704.
  • The execution module 142 then executes the speed sensitivity module 144, which runs the training program at different speeds, first running the program at double or triple the defined normal speed and then at either two or three times slower than normal at step 706. Once the training robot 136 has completed the program normally, a signal is sent back to the training module 130 regarding a normal completion of the program at step 708. The execution module 142 then executes the location sensitivity module 146, which runs the training program at different distances from the wireless access point 102 at step 710. Once the training robot 136 has completed the program, a signal may be sent back to the training module 130 regarding completion of the program at step 712. The program may end at step 714.
  • FIG. 8 is a flowchart illustrating an exemplary method of speed sensitivity testing. The process begins with the speed sensitivity module 144, which may be initiated by the execution module 142. The speed sensitivity module 144 may start to re-run the training program that the execution module 142 had just run at step 800. In step 802, the speed sensitivity module 144 then speeds up the robotic movements by a factor of "x". For example, the speed sensitivity module 144 may speed up the robotic motion by x=5, which could increase the robotic movements to five times faster than the defined normal speed.
  • The speed sensitivity module 144 then completes the training program at step 804. The training program is then re-run at step 806. In step 808, the speed sensitivity module 144 then slows down the robotic movements by a factor of "x". For example, the speed sensitivity module 144 may slow down the robotic motion by x=5, which could decrease the speed of the robotic movements to five times slower than the defined normal speed. The speed sensitivity module 144 then completes the training program at step 810. The program ends and returns to the execution module 142 at step 812.
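  • A minimal sketch of that fast/slow re-run, assuming run_program is a hypothetical callable that accepts a speed multiplier relative to the defined normal speed:

```python
def run_speed_sensitivity(run_program, factor=5.0):
    """Re-run the same training program sped up by a factor of x, then
    slowed down by the same factor (x=5 shown, as in the example above)."""
    run_program(speed=factor)        # five times faster than normal
    run_program(speed=1.0 / factor)  # five times slower than normal
```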
  • FIG. 9 is a flowchart illustrating an exemplary method of location sensitivity testing. The process begins with the location sensitivity module 146 being initiated by the execution module 142 at step 900. The location sensitivity module 146 may then determine where the wireless access point 102 is located. One aspect of the location sensitivity module 146 allows the training robot 136 to be moved either closer to or farther away from the access point 102 at step 902. After determining the direction and/or location of the wireless access point 102, the training robot 136 is moved closer to the access point 102. The location sensitivity module 146 may monitor the signal strength of the wireless access point 102 as an indicator of how close the training robot 136 is to the access point 102 at step 904.
  • The training program that was initially run by the execution module 142 is now re-run, but with the training robot 136 closer to the wireless access point 102 at step 906. The training program may run and be completed at a location near to the wireless access point 102 at step 908. Once the program is completed, the training robot 136 then moves a distance away from the wireless access point 102. In one embodiment, the training robot 136 may continue to move farther away from the wireless access point 102 while monitoring the signal strength at step 910. The training program is then re-run, but this time with the training robot 136 at a distance from the wireless access point 102 at step 912. The training program is then run and completed at a distance from the wireless access point 102 at step 914. The location sensitivity module 146 then ends and returns to the execution module 142 at step 916.
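  • A sketch of the near/far re-run, assuming read_rssi reports signal strength in dBm and move_robot steps the robot toward (positive) or away from (negative) the access point; both helpers and the dBm thresholds are illustrative assumptions:

```python
def run_location_sensitivity(run_program, move_robot, read_rssi,
                             near_dbm=-40, far_dbm=-70):
    """Signal strength is used as a proxy for distance to the access point."""
    while read_rssi() < near_dbm:  # approach until the signal is strong
        move_robot(+1)
    run_program()                  # complete the program near the access point
    while read_rssi() > far_dbm:   # retreat until the signal is weak
        move_robot(-1)
    run_program()                  # complete the program far from the access point
```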
  • In an embodiment, an IoT device may be used to improve accuracy of passive Wi-Fi based motion detection systems. An IoT device database of FIG. 10 stores information related to each IoT device connected to the wireless access point 102. That information includes, but is not limited to, the location information, the function of the device, and the activity associated with operating the device, all of which is provided either by the IoT device database of FIG. 10 or by a user. The IoT device database of FIG. 10 could also include sensor data feed columns for devices with additional capabilities, such as a virtual assistant, that could provide additional context data to further refine the profile associated with the activity.
  • The cloud server 120 may monitor the activity of the wireless access point 102 via the agent 114, as well as the activities of the IoT devices connected to the wireless access point 102, in order to trigger the installation module when new IoT devices are connected, and the location modules and activity modules when data events are detected simultaneously in both the activity identification module 118 and an IoT device. An installation module may connect new IoT devices to the system and register in the IoT device database of FIG. 10 information related to the location of the device, the function of the device, and the activity associated with the function of the IoT device. These definitions are provided either by the IoT device (e.g., if the IoT device has that level of sensor and computational capabilities) or through definition by a user through a training interface. A location module may compare the location associated with the impulse or frequency response of the channel (e.g., in the profile database 122 identified by the activity identification module 118) to the location associated with the data event from the IoT device. When the two locations do not match, the data provided by the IoT device is sent to the profile module 126 to be used to improve the profile definitions in the profile database 122. An activity module may compare the activity associated with the impulse or frequency response of the channel in the profile database 122 identified by the activity identification module 118 to the activity associated with the data event from the IoT device. When the two activities do not match, the data provided by the IoT device is sent to the profile module 126 to be used to improve the profile definitions in the profile database 122. The system may include at least one IoT device, or a group of up to n IoT devices. Consumer connected devices (including smart TVs, smart speakers, toys, wearables, and smart appliances) and industrial and enterprise IoT devices (such as smart meters, commercial security systems, and smart city technologies used to monitor traffic and weather conditions) could all be incorporated into this system in various embodiments.
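  • The location-module and activity-module comparisons could be sketched together as follows; the dictionary fields and the profile_module method names are hypothetical, introduced purely for illustration:

```python
def reconcile_iot_event(identified, iot_event, profile_module):
    """When the Wi-Fi-identified location or activity disagrees with the
    IoT device's event, forward the IoT data to improve the profile."""
    if identified["location"] != iot_event["location"]:
        profile_module.update_location(identified["profile"], iot_event["location"])
    if identified["activity"] != iot_event["activity"]:
        profile_module.update_activity(identified["profile"], iot_event["activity"])
```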
  • FIG. 10 illustrates an exemplary IoT device database. The IoT device database of FIG. 10 contains information related to each IoT device connected to the system. That information includes, but is not limited to, the location information, the function of the device, and the activity associated with operating the device, all of which is provided either by the IoT device or by a user. The IoT device database of FIG. 10 could also include sensor data feed columns for devices with additional capabilities, such as a virtual assistant, that could provide additional context data to further refine the profile associated with the activity.
  • FIG. 11 is a flowchart illustrating an exemplary method of IoT device management for Wi-Fi motion detection. Such method may be performed by cloud server 120. The process begins with data being received via the agent 114 from the wireless access point 102 at step 1100. The IoT device database of FIG. 10 may be queried for the presence of a new IoT device being added to the home network at step 1102. It may be determined if a new IoT device is present at step 1104. If a new IoT device is detected in the IoT device database of FIG. 10, the installation module may be launched at step 1106.
  • The activity identification module 118 may be polled for new motion data at step 1108. It may be determined if the activity identification module 118 has identified a motion at step 1110. If there is motion identified, the IoT devices in the IoT device database of FIG. 10 may be polled for new data event(s) at step 1112. It may be determined if there is any IoT device data that coincides with the motion data at step 1114. If there is both motion and IoT data, a call to the location module may determine if the location identified by the activity identification module 118 is in agreement with the location data from the IoT device at step 1116.
  • Once the location module has completed, a call to the activity module may determine if the activity identified by the activity identification module 118 is in agreement with the activity data from the IoT device at step 1118. It may be determined if data is still being received from the wireless access point 102, via the agent 114 at step 1120. If data is still coming in from the agent 114, the method may return to step 1102. If the data is no longer coming from the agent 114, the method may go to step 1122.
  • FIG. 12 is a flowchart illustrating an exemplary method of IoT device installation. Such method may be performed based on execution of an installation module. The process begins with receipt of a prompt from the cloud server 120 at step 1200. The new IoT device added to the IoT device database of FIG. 10 may be characterized by such parameters as make, model, capabilities, etc., at step 1202. It may be determined if the IoT device provides data about the associated location, and any data provided may be written to the IoT device database of FIG. 10 at step 1204. If the device does not provide data about location, the user device may prompt a user to input the location of the new device (e.g., identifying that a specified light switch is in the bathroom) at step 1206.
  • The user-provided location of the new device may be written to the IoT device database of FIG. 10 at step 1208. It may be determined if the device provides data about an associated function, and any data provided may be written to the IoT device database of FIG. 10 at step 1210. If the device does not provide data about an associated function, the user device may be prompted for input regarding the function of the new device (e.g., identifying that a specified light switch in the bathroom operates the bathroom lights) at step 1212. The user-provided function of the new device may be written to the IoT device database of FIG. 10 at step 1214. A prompt may be provided to obtain user input defining the activity, such as activating the light switch. The process for defining this activity in this embodiment may allow the user to provide a natural language definition of the activity. In alternate embodiments, the user could perform the activity in response to a prompt or query from the system, allowing the activity identification module 118 to take a sample of the impulse or frequency response of the channel for the activity being defined and link it to the IoT device at step 1216. The user-provided activity of the new device may be written to the IoT device database of FIG. 10 at step 1218. The method may return to the cloud server 120 at step 1220.
  • FIG. 13 is a flowchart illustrating an exemplary method of IoT location tracking. The method may be performed based on execution of a location module. The process begins with receiving a prompt from the cloud server 120 that there is IoT data that coincides with motion data from the activity identification module 118 at step 1300. The location of the IoT device from the IoT device database of FIG. 10 may be identified at step 1302.
  • It may be determined if the location of the IoT device matches the location of the motion identified by the activity identification module 118. If the location matches, the method may return to the cloud server 120 at step 1304. If the location identified by the activity identification module 118 does not match the location of the IoT device, the location of the IoT device is sent to the profile module 126 to update the profile database 122. In the future, when the activity identification module 118 sees the current profile in the impulse or frequency response of the channel, the activity identification module 118 may recognize such activity as taking place in the location defined by the IoT device at step 1306. The method may then return to the cloud server 120 at step 1308.
  • FIG. 14 is a flowchart illustrating an exemplary method of IoT activity tracking. Such method may be performed based on execution of an activity module. The process begins with receiving a prompt from the cloud server 120 that there is IoT data that coincides with motion data from the activity identification module 118 at step 1400.
  • The activity associated with the IoT device from the IoT device database of FIG. 10 may be identified at step 1402. It may be determined if the activity associated with the IoT device matches the activity associated with the profile of the frequency response of the channel identified by the activity identification module 118. If the activity matches, the method may return to the cloud server 120 at step 1404. If the activity identified by the activity identification module 118 does not match the activity associated with the IoT device, the activity associated with the IoT device is sent to the profile module 126 to update the profile database 122. In the future, when the activity identification module 118 sees the current profile in the impulse or frequency response of the channel, the activity identification module 118 may recognize such data as corresponding to the activity defined by the IoT device at step 1406. The method may return to the cloud server 120 at step 1408.
  • The present invention may be implemented in an application that may be operable using a variety of devices. Non-transitory computer-readable storage media refer to any medium or media that participate in providing instructions to a central processing unit (CPU) for execution. Such media can take many forms, including, but not limited to, non-volatile and volatile media such as optical or magnetic disks and dynamic memory, respectively. Common forms of non-transitory computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM disk, digital video disk (DVD), any other optical medium, RAM, PROM, EPROM, a FLASHEPROM, and any other memory chip or cartridge.
  • Various forms of transmission media may be involved in carrying one or more sequences of one or more instructions to a CPU for execution. A bus carries the data to system RAM, from which a CPU retrieves and executes the instructions. The instructions received by system RAM can optionally be stored on a fixed disk either before or after execution by a CPU. Various forms of storage may likewise be implemented as well as the necessary network interfaces and network topologies to implement the same.
  • The foregoing detailed description of the technology has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the technology to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. The described embodiments were chosen in order to best explain the principles of the technology and its practical application, and to enable others skilled in the art to utilize the technology in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the technology be defined by the claims.

Claims (21)

What is claimed is:
1. A system for training a motion detection system, the system comprising:
a wireless access point located within a monitored space;
an agent device that collects frequency response signal data from the wireless access point regarding the monitored space;
a cloud server that generates a robot command corresponding to a human activity; and
a robot that receives the command sent by the cloud server over a communication network and executes the received command to perform an action mimicking the corresponding human activity by moving to a location a specified distance from the wireless access point.
2. The system of claim 1, wherein the cloud server further monitors a set of frequency response signal data associated with the action performed by the robot based on radio signal strength of the wireless access point.
3. The system of claim 2, further comprising a profile database that stores a plurality of sets of frequency response data each corresponding to different human activities, wherein the profile database is updated based on the monitored set of frequency response signal data associated with the action performed by the robot.
4. The system of claim 3, wherein the cloud server further executes a correction module to add a new profile to the profile database when subsequently collected frequency response signal data does not match any of the stored sets of frequency response data.
5. The system of claim 1, wherein the cloud server further receives subsequently collected frequency response signal data sent by the agent device over the communication network and identifies the human activity in the monitored space based on matching the monitored set of frequency response data to the subsequently collected set of frequency response data.
6. The system of claim 1, wherein the cloud server generates the robot command by executing a speed sensitivity module to send signals to the robot indicating one or more varied speeds for the action; and wherein the cloud server further measures a speed sensitivity of the robot at each of the varied speeds by determining that each of the varied speeds corresponds to a different set of collected frequency response signal data.
7. The system of claim 1, wherein the cloud server generates the robot command by executing a location sensitivity module to send signals to the robot indicating one or more varied distances of the robot from the wireless access point; and wherein the cloud server further measures a location sensitivity of the robot at each of the varied distances by determining that each of the varied distances corresponds to a different set of collected frequency response signal data.
8. The system of claim 1, wherein the cloud server executes a variation module to repeat a training program and identify anomalous data associated with the repeated training program in comparison to stored data in a training database regarding one or more other human activities.
9. The system of claim 8, wherein the cloud server further executes the variation module to remove the anomalous data from the training database.
10. The system of claim 1, wherein the wireless access point is in communication with an Internet-of-Things (IoT) device, wherein a frequency response from the IoT device is associated with a location stored in a profile database.
11. A method for training a motion detection system, the method comprising:
receiving frequency response signal data regarding the monitored space, the frequency response signal data collected by an agent device from a wireless access point in the monitored space;
generating a robot command corresponding to a human activity, wherein the robot command is generated by a cloud server; and
sending the command from the cloud server over a communication network to a robot, wherein the robot executes the received command to perform an action mimicking the corresponding human activity by moving to a location a specified distance from the wireless access point.
12. The method of claim 11, further comprising monitoring a set of frequency response signal data associated with the action performed by the robot based on radio signal strength of the wireless access point.
13. The method of claim 12, further comprising storing a plurality of sets of frequency response data in a profile database, each set of frequency response data corresponding to different human activities, and updating the profile database based on the monitored set of frequency response signal data associated with the action performed by the robot.
14. The method of claim 13, further comprising executing a correction module to add a new profile to the profile database when subsequently collected frequency response signal data does not match any of the stored sets of frequency response data.
15. The method of claim 11, further comprising receiving subsequently collected frequency response signal data sent by the agent device over the communication network, and identifying the human activity in the monitored space based on matching the monitored set of frequency response data to the subsequently collected set of frequency response data.
16. The method of claim 11, wherein generating the robot command includes executing a speed sensitivity module to send signals to the robot indicating one or more varied speeds for the action; and further comprising measuring a speed sensitivity of the robot at each of the varied speeds, wherein each of the varied speeds is determined to correspond to a different set of collected frequency response signal data.
17. The method of claim 11, wherein generating the robot command includes executing a location sensitivity module to send signals to the robot indicating one or more varied distances of the robot from the wireless access point, and further comprising measuring a location sensitivity of the robot at each of the varied distances, wherein each of the varied distances is determined to correspond to a different set of collected frequency response signal data.
18. The method of claim 11, further comprising executing a variation module to repeat a training program and identify anomalous data associated with the repeated training program in comparison to stored data in a training database regarding one or more other human activities.
19. The method of claim 18, further comprising executing the variation module to remove the anomalous data from the training database.
20. The method of claim 11, wherein the wireless access point is in communication with an Internet-of-Things (IoT) device, wherein a frequency response from the IoT device is associated with a location stored in a profile database.
21. A non-transitory, computer-readable storage medium, having embodied thereon a program executable by a processor to perform a method for training a motion detection system, the method comprising:
receiving frequency response signal data regarding the monitored space, the frequency response signal data collected by an agent device from a wireless access point in the monitored space;
generating a robot command corresponding to a human activity, wherein the robot command is generated by a cloud server; and
sending the command from the cloud server over a communication network to a robot, wherein the robot executes the received command to perform an action mimicking the corresponding human activity by moving to a location a specified distance from the wireless access point.
US18/112,349 2019-02-22 2023-02-21 Robotic h matrix creation Pending US20230196144A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/112,349 US20230196144A1 (en) 2019-02-22 2023-02-21 Robotic h matrix creation

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201962809393P 2019-02-22 2019-02-22
US201962809356P 2019-02-22 2019-02-22
US16/794,668 US11586952B2 (en) 2019-02-22 2020-02-19 Robotic H matrix creation
US18/112,349 US20230196144A1 (en) 2019-02-22 2023-02-21 Robotic h matrix creation

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/794,668 Continuation US11586952B2 (en) 2019-02-22 2020-02-19 Robotic H matrix creation

Publications (1)

Publication Number Publication Date
US20230196144A1 true US20230196144A1 (en) 2023-06-22

Family

ID=72748078

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/794,668 Active 2040-12-26 US11586952B2 (en) 2019-02-22 2020-02-19 Robotic H matrix creation
US18/112,349 Pending US20230196144A1 (en) 2019-02-22 2023-02-21 Robotic h matrix creation

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/794,668 Active 2040-12-26 US11586952B2 (en) 2019-02-22 2020-02-19 Robotic H matrix creation

Country Status (1)

Country Link
US (2) US11586952B2 (en)


Family Cites Families (98)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2001249994A1 (en) 2000-02-25 2001-09-03 Interval Research Corporation Method and system for selecting advertisements
CA2344743C (en) 2001-04-20 2011-12-06 Elysium Broadband Inc. Point to multi-point communications system
US8078164B2 (en) 2004-01-06 2011-12-13 Vasu Networks Corporation Mobile telephone VOIP/cellular seamless roaming switching controller
WO2006037014A2 (en) 2004-09-27 2006-04-06 Nielsen Media Research, Inc. Methods and apparatus for using location information to manage spillover in an audience monitoring system
US7694212B2 (en) 2005-03-31 2010-04-06 Google Inc. Systems and methods for providing a graphical display of search activity
US20070024580A1 (en) 2005-07-29 2007-02-01 Microsoft Corporation Interactive display device, such as in context-aware environments
US9703892B2 (en) 2005-09-14 2017-07-11 Millennial Media Llc Predictive text completion for a mobile communication facility
US7733224B2 (en) 2006-06-30 2010-06-08 Bao Tran Mesh network personal emergency response appliance
US20080262909A1 (en) 2007-04-18 2008-10-23 Microsoft Corporation Intelligent information display
JP2010016785A (en) 2008-06-03 2010-01-21 Nippon Telegr & Teleph Corp <Ntt> Receiving device and receiving method
US8121633B2 (en) 2009-07-24 2012-02-21 Research In Motion Limited Operator configurable preferred network and radio access technology selection for roaming multi-rat capable devices
US9163946B2 (en) 2009-07-28 2015-10-20 CSR Technology Holdings Inc. Methods and applications for motion mode detection for personal navigation systems
US20110117924A1 (en) 2009-11-18 2011-05-19 Qualcomm Incorporated Position determination using a wireless signal
AU2011219093A1 (en) 2010-02-24 2012-10-18 Performance Lab Technologies Limited Classification system and method
US20110258039A1 (en) 2010-04-14 2011-10-20 Microsoft Corporation Evaluating preferences of users engaging with advertisements
US8073441B1 (en) 2010-08-24 2011-12-06 Metropcs Wireless, Inc. Location-based network selection method for a mobile device
US20120053472A1 (en) 2010-08-30 2012-03-01 Bao Tran Inexpensive non-invasive safety monitoring apparatus
US10034034B2 (en) 2011-07-06 2018-07-24 Symphony Advanced Media Mobile remote media control platform methods
US9154826B2 (en) 2011-04-06 2015-10-06 Headwater Partners Ii Llc Distributing content and service launch objects to mobile devices
US9077458B2 (en) 2011-06-17 2015-07-07 Microsoft Technology Licensing, Llc Selection of advertisements via viewer feedback
US20130028443A1 (en) 2011-07-28 2013-01-31 Apple Inc. Devices with enhanced audio
EP2769485A1 (en) 2011-10-19 2014-08-27 Marvell World Trade Ltd. Systems and methods for suppressing interference in a signal received by a device having two or more antennas
KR101804338B1 (en) 2011-11-08 2017-12-04 엘지전자 주식회사 Mobile terminal
WO2013177592A2 (en) 2012-05-25 2013-11-28 Emotiv Lifesciences, Inc. System and method for providing and aggregating biosignals and action data
WO2013184488A1 (en) 2012-06-05 2013-12-12 Almondnet, Inc. Targeted television advertising based on a profile linked to an online device associated with a content-selecting device
US9219790B1 (en) 2012-06-29 2015-12-22 Google Inc. Determining user engagement with presented media content through mobile device usage
US9445163B2 (en) 2012-07-27 2016-09-13 Echostar Technologies L.L.C. Systems and methods for assessing viewer interest in content and advertisements
US10495725B2 (en) 2012-12-05 2019-12-03 Origin Wireless, Inc. Method, apparatus, server and system for real-time vital sign detection and monitoring
US20140278389A1 (en) 2013-03-12 2014-09-18 Motorola Mobility Llc Method and Apparatus for Adjusting Trigger Parameters for Voice Recognition Processing Based on Noise Characteristics
US20150026708A1 (en) 2012-12-14 2015-01-22 Biscotti Inc. Physical Presence and Advertising
US9344773B2 (en) 2013-02-05 2016-05-17 Microsoft Technology Licensing, Llc Providing recommendations based upon environmental sensing
WO2014175149A1 (en) 2013-04-24 2014-10-30 Nec Corporation Method for use in device-to-device communication, wireless communication system, and architecture
US9014790B2 (en) 2013-06-03 2015-04-21 Fitbit, Inc. Heart rate data collection
US9264862B2 (en) 2013-08-15 2016-02-16 Apple Inc. Determining exit from a vehicle
EP3534318A1 (en) 2013-09-26 2019-09-04 Mark W. Publicover Providing targeted content based on a user's moral values
WO2015057231A1 (en) 2013-10-17 2015-04-23 Intel Corporation Context-aware location-based information distribution
US9516259B2 (en) 2013-10-22 2016-12-06 Google Inc. Capturing media content in accordance with a viewer expression
US9510042B2 (en) 2013-10-30 2016-11-29 Echostar Technologies L.L.C. Set-top box proximity detection
EP3066616A4 (en) 2013-11-08 2017-06-28 Performance Lab Technologies Limited Classification of activity derived from multiple locations
US10431340B2 (en) 2014-02-28 2019-10-01 Eco-Fusion Systems for predicting hypoglycemia and methods of use thereof
US9414115B1 (en) 2014-03-28 2016-08-09 Aquifi, Inc. Use of natural user interface realtime feedback to customize user viewable ads presented on broadcast media
US10440499B2 (en) 2014-06-16 2019-10-08 Comcast Cable Communications, Llc User location and identity awareness
US9258606B1 (en) 2014-07-31 2016-02-09 Google Inc. Using second screen devices to augment media engagement metrics
US9396480B2 (en) 2014-08-21 2016-07-19 Verizon Patent And Licensing Inc. Providing on-demand audience based on network
US9729915B2 (en) 2014-10-31 2017-08-08 Paypal, Inc. Detecting user devices to determine state of watched show
US9510319B2 (en) 2014-12-10 2016-11-29 Texas Instruments Incorporated Method and system for location estimation
US10104195B2 (en) 2015-03-20 2018-10-16 The Trustees Of The Stevens Institute Of Technology Device-free activity identification using fine-grained WiFi signatures
US20160315682A1 (en) 2015-04-24 2016-10-27 The Royal Institution For The Advancement Of Learning / Mcgill University Methods and systems for wireless crowd counting
US9883241B2 (en) 2015-05-17 2018-01-30 Surewaves Mediatech Private Limited System and method for automatic content recognition and audience measurement for television channels and advertisements
US9923937B2 (en) 2015-05-18 2018-03-20 Adobe Systems Incorporated Dynamic personalized content presentation to re-engage users during online sessions
EP3323060A4 (en) 2015-05-21 2019-06-12 Maruthi Siva P. Cherukuri Personalized activity data gathering based on multi-variable user input and multi-dimensional schema
US11439344B2 (en) 2015-07-17 2022-09-13 Origin Wireless, Inc. Method, apparatus, and system for wireless sleep monitoring
US10325641B2 (en) * 2017-08-10 2019-06-18 Ivani, LLC Detecting location within a network
US10231668B2 (en) 2015-11-13 2019-03-19 International Business Machines Corporation Instant messaging status reporting based on smart watch activity
US11137820B2 (en) 2015-12-01 2021-10-05 Amer Sports Digital Services Oy Apparatus and method for presenting thematic maps
KR102500299B1 (en) 2015-12-03 2023-02-16 Samsung Electronics Co., Ltd. User terminal and control method thereof
CN105828289B (en) 2016-04-20 2019-09-03 Zhejiang University of Technology Passive indoor positioning method based on channel state information
US10359924B2 (en) 2016-04-28 2019-07-23 Blackberry Limited Control of an electronic device including display and keyboard moveable relative to the display
US9998856B2 (en) 2016-05-13 2018-06-12 Qualcomm Incorporated Method and/or system for positioning of a mobile device
US10045717B2 (en) 2016-06-10 2018-08-14 The Regents Of The University Of California WiFi-based person-identification technique for use in smart spaces
US10567943B2 (en) 2016-06-15 2020-02-18 Qualcomm Incorporated Methods and systems for handover of an emergency call between different wireless networks
US10413254B2 (en) 2016-07-08 2019-09-17 General Electric Company Dynamic automated adjustment of alarm threshold levels
US10205906B2 (en) 2016-07-26 2019-02-12 The Directv Group, Inc. Method and apparatus to present multiple audio content
EP3496476B1 (en) 2016-09-13 2022-03-30 LG Electronics Inc. Method and wireless device for performing position measurement in NB-IoT
US10356553B2 (en) 2016-09-23 2019-07-16 Apple Inc. Precise indoor localization and tracking of electronic devices
US10779127B2 (en) 2016-11-22 2020-09-15 Aerial Technologies Device free localization methods within smart indoor environments
WO2018112833A1 (en) * 2016-12-22 2018-06-28 Intel Corporation Efficient transferring of human experiences to robots and other autonomous machines
US20180181094A1 (en) 2016-12-23 2018-06-28 Centurylink Intellectual Property Llc Smart Home, Building, or Customer Premises Apparatus, System, and Method
US10341725B2 (en) 2016-12-27 2019-07-02 Rovi Guides, Inc. Methods and systems for determining user engagement based on user interactions during different time intervals
EP4354882A1 (en) 2016-12-27 2024-04-17 Rovi Guides, Inc. Systems and methods for dynamically adjusting media output based on presence detection of individuals
US9854292B1 (en) 2017-01-05 2017-12-26 Rovi Guides, Inc. Systems and methods for determining audience engagement based on user motion
US9985846B1 (en) 2017-01-15 2018-05-29 Essential Products, Inc. Assistant for management of network devices
US10176499B2 (en) 2017-05-15 2019-01-08 International Business Machines Corporation Advertisement selection by use of physical location behavior
US11019389B2 (en) 2017-12-04 2021-05-25 Comcast Cable Communications, Llc Determination of enhanced viewing experiences based on viewer engagement
US20190188756A1 (en) 2017-12-15 2019-06-20 At&T Intellectual Property I, L.P. Methods and devices for determining distraction level of users to select targeted advertisements
CN110113818B (en) 2018-02-01 2023-12-15 Beijing Samsung Telecommunications Technology Research Co., Ltd. Channel state information reporting method, user equipment, base station and computer readable medium
US10374646B1 (en) 2018-02-07 2019-08-06 Cubic Corporation Crowd size estimation based on wireless signal absorption
US10818384B1 (en) * 2018-02-20 2020-10-27 Verily Life Sciences Llc Valence profiling of virtual interactive objects
CN111095859B (en) 2018-04-02 2021-04-13 LG Electronics Inc. Method for transmitting or receiving signal in wireless communication system and apparatus therefor
US10680889B2 (en) 2018-04-02 2020-06-09 Cisco Technology, Inc. Network configuration change analysis using machine learning
US20200036592A1 (en) * 2018-07-30 2020-01-30 Hewlett Packard Enterprise Development Lp User profile environment-automation configurations
US10419880B1 (en) 2018-08-22 2019-09-17 Facebook, Inc. Robotics for indoor data curation
WO2020044192A1 (en) 2018-08-26 2020-03-05 Celeno Communications (Israel) Ltd. Wi-fi radar detection using synchronized wireless access point
US11039278B1 (en) 2018-08-31 2021-06-15 Facebook, Inc. Dynamic location collection
US10860864B2 (en) 2019-01-16 2020-12-08 Charter Communications Operating, Llc Surveillance and image analysis in a monitored environment
US10902714B2 (en) 2019-02-14 2021-01-26 Coho Industries LLC Systems and methods for reducing adverse health events in first responders
US20200303046A1 (en) 2019-02-22 2020-09-24 Aerial Technologies Inc. Wi-fi-based condition monitoring
US11082109B2 (en) 2019-02-22 2021-08-03 Aerial Technologies Inc. Self-learning based on Wi-Fi-based monitoring and augmentation
WO2020170221A1 (en) 2019-02-22 2020-08-27 Aerial Technologies Inc. Handling concept drift in wi-fi-based localization
US11218769B2 (en) 2019-02-22 2022-01-04 Aerial Technologies Inc. Smart media display
US10999705B2 (en) 2019-02-22 2021-05-04 Aerial Technologies Inc. Motion vector identification in a Wi-Fi motion detection system
US11913970B2 (en) 2019-02-22 2024-02-27 Aerial Technologies Inc. Wireless motion detection using multiband filters
US11593837B2 (en) 2019-02-22 2023-02-28 Aerial Technologies Inc. Advertisement engagement measurement
US11017688B1 (en) 2019-04-22 2021-05-25 Matan Arazi System, method, and program product for interactively prompting user decisions
EP3977793A4 (en) 2019-05-30 2023-06-07 Aerial Technologies Inc. Proximity-based model for indoor localization using wireless signals
US11448726B2 (en) 2019-08-28 2022-09-20 Aerial Technologies Inc. System and method for presence and pulse detection from wireless signals
US11523253B2 (en) 2019-09-06 2022-12-06 Aerial Technologies Inc. Monitoring activity using Wi-Fi motion detection
WO2021084519A1 (en) 2019-11-01 2021-05-06 Aerial Technologies Inc. System for multi-path 5g and wi-fi motion detection

Also Published As

Publication number Publication date
US11586952B2 (en) 2023-02-21
US20200327430A1 (en) 2020-10-15

Similar Documents

Publication Publication Date Title
US20230196144A1 (en) Robotic h matrix creation
US11916635B2 (en) Self-learning based on Wi-Fi-based monitoring and augmentation
US10999705B2 (en) Motion vector identification in a Wi-Fi motion detection system
CN107925821B (en) Monitoring
US20220070633A1 (en) Proximity-based model for indoor localization using wireless signals
US11908465B2 (en) Electronic device and controlling method thereof
US11622098B2 (en) Electronic device, and method for displaying three-dimensional image thereof
JP5873864B2 (en) Object tracking and recognition method and apparatus
US20240062861A1 (en) Wi-fi based condition monitoring
CN109325456B (en) Target identification method, target identification device, target identification equipment and storage medium
US10162737B2 (en) Emulating a user performing spatial gestures
US20240095143A1 (en) Electronic device and method for controlling same
JP2017523498A (en) Eye tracking based on efficient forest sensing
CN111670004A (en) Electronic device and method for measuring heart rate
US20220300774A1 (en) Methods, apparatuses, devices and storage media for detecting correlated objects involved in image
CN108875506B (en) Face shape point tracking method, device and system and storage medium
WO2020213099A1 (en) Object detection/tracking device, method, and program recording medium
WO2020039559A1 (en) Information processing device, information processing method, and work evaluation system
CN113190444A (en) Test method, test device and storage medium
CN108292437B (en) Image processing apparatus and method
CN111310595A (en) Method and apparatus for generating information
US20230104775A1 (en) Human robot collaboration for flexible and adaptive robot learning
CN111445499B (en) Method and device for identifying target information
US10832060B2 (en) Resident activity recognition system and method thereof
CN108696722B (en) Target monitoring method, system and device and storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: AERIAL TECHNOLOGIES INC., CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MARTINEZ, MICHEL ALLEGUE;GHOURCHIAN, NEGAR;GRANT, DAVID;AND OTHERS;SIGNING DATES FROM 20200603 TO 20200604;REEL/FRAME:062801/0275

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED