WO2017156188A1 - Systems and methods for a compound sensor system - Google Patents

Systems and methods for a compound sensor system

Info

Publication number
WO2017156188A1
WO2017156188A1 (PCT/US2017/021448)
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
detection range
compound
sensor system
command signal
Prior art date
Application number
PCT/US2017/021448
Other languages
French (fr)
Inventor
Daxiao Yu
Yang Sun
Original Assignee
Tinoq Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tinoq Inc. filed Critical Tinoq Inc.
Priority to CN201780023510.2A priority Critical patent/CN109074721A/en
Priority to EP17764046.3A priority patent/EP3427240A4/en
Publication of WO2017156188A1 publication Critical patent/WO2017156188A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/021Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00Photometry, e.g. photographic exposure meter
    • G01J1/02Details
    • G01J1/0219Electrical interface; User interface
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00Photometry, e.g. photographic exposure meter
    • G01J1/02Details
    • G01J1/0228Control of working procedures; Failure detection; Spectral bandwidth calculation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00Photometry, e.g. photographic exposure meter
    • G01J1/02Details
    • G01J1/0247Details using a charging unit
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00Photometry, e.g. photographic exposure meter
    • G01J1/42Photometry, e.g. photographic exposure meter using electric radiation detectors
    • G01J1/4204Photometry, e.g. photographic exposure meter using electric radiation detectors with determination of ambient light
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/10Radiation pyrometry, e.g. infrared or optical thermometry using electric radiation detectors
    • G01J5/12Radiation pyrometry, e.g. infrared or optical thermometry using electric radiation detectors using thermoelectric elements, e.g. thermocouples
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/12Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/14Systems for determining distance or velocity not using reflection or reradiation using ultrasonic, sonic, or infrasonic waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/04Systems determining presence of a target
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S15/04Systems determining presence of a target
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/04Systems determining the presence of a target
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26Power supply means, e.g. regulation thereof
    • G06F1/32Means for saving power
    • G06F1/3203Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3206Monitoring of events, devices or parameters that trigger a change in power modality
    • G06F1/3231Monitoring the presence, absence or movement of users
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26Power supply means, e.g. regulation thereof
    • G06F1/32Means for saving power
    • G06F1/3203Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234Power saving characterised by the action undertaken
    • G06F1/3287Power saving characterised by the action undertaken by switching off individual functional units in the computer system
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G06F16/5838Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using colour
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/10Protocols in which an application is distributed across nodes in the network
    • H04L67/1097Protocols in which an application is distributed across nodes in the network for distributed storage of data in networks, e.g. transport arrangements for network file system [NFS], storage area networks [SAN] or network attached storage [NAS]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/12Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/50Network services
    • H04L67/52Network services specially adapted for the location of the user terminal
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/50Network services
    • H04L67/56Provisioning of proxy services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/10Protocols in which an application is distributed across nodes in the network
    • H04L67/1001Protocols in which an application is distributed across nodes in the network for accessing one among a plurality of replicated servers
    • H04L67/1004Server selection for load balancing
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Definitions

  • This invention relates generally to the field of low power sensor systems.
  • Description of the Related Art
  • Proximity sensors, such as touch screens for cellphones and touch switches for lights, are widely used in today's consumer and industrial electronics.
  • a proximity sensor typically functions by either (1) responding to a target object or an object's motion when the object is within the sensing range (e.g., a touch screen), or (2) directly detecting the distance between the object and the sensor (e.g., an infrared ranging sensor).
  • the range of interest between the sensor and the target object can be either explicitly or implicitly specified.
  • a touch light switch typically functions only when a hand is placed within around 10 centimeters of the sensor.
  • a three-dimensional (3D) hand gesture detection sensor for smart phones also works in similar ranges.
  • the proximity zone is directional, i.e., objects or motions can only be detected in front of the proximity sensor.
  • proximity sensors can use active emissions. Distance is determined by detecting the reflected emissions from the object. Typical emissions by the proximity sensors include infrared, ultrasonic, or any other suitable electromagnetic signals that can be bounced back from a target object. In some embodiments, the basic working principle of the ranging proximity sensor is similar to radars.
  • Active ranging proximity sensors can accurately sense whether an object is present or a motion happens in its proximity with well-defined range such as 10 centimeters, 35 centimeters, or any other suitable ranges.
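The round-trip ranging principle described above can be shown with a short sketch. The function name, the speed constant, and the example timing are illustrative assumptions (an ultrasonic sensor in air); an infrared ranging sensor would use the speed of light instead.

```python
# Illustrative sketch of active ranging: emit a wave, time the echo,
# and convert the round-trip time to a one-way distance.
# The 343 m/s figure assumes an ultrasonic sensor in air at ~20 °C.

SPEED_OF_SOUND_M_PER_S = 343.0

def distance_from_echo(round_trip_time_s: float) -> float:
    """One-way distance to the target: the emission travels out and
    back, so the measured round-trip time is halved."""
    return SPEED_OF_SOUND_M_PER_S * round_trip_time_s / 2.0

# An echo arriving about 2 ms after emission puts the target ~34 cm away,
# within a "well-defined range such as ... 35 centimeters".
print(round(distance_from_echo(0.002), 3))
```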
  • Traditional proximity sensors typically consume more than 0.5 mA in current, which is less suitable for battery-powered systems.
  • one standard AA battery usually has a capacity of around 1000 mAh and can only support an active ranging proximity sensor for a few months. The design of battery-operated sensor systems often requires a battery life that is longer than a year.
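The "few months" figure can be checked with simple arithmetic using the two numbers from the text (a 1000 mAh AA cell and a continuous 0.5 mA draw):

```python
# Back-of-the-envelope battery-life check using the figures in the text.
battery_capacity_mah = 1000.0   # one standard AA cell
sensor_current_ma = 0.5         # continuous draw of an active ranging sensor

hours = battery_capacity_mah / sensor_current_ma  # capacity / current
months = hours / 24.0 / 30.0                      # ~30-day months

print(int(hours), round(months, 1))  # 2000 hours, roughly 2.8 months
```

At roughly 2.8 months, an always-on active ranging sensor falls well short of the year-plus battery life the text says such systems often require.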
  • a compound sensor system includes a first sensor, a second sensor, a memory that stores a module, and a processor coupled to the first sensor, the second sensor, and the memory.
  • the first sensor is configured to detect a parameter that indicates a likelihood of having a user enter or leave a target area, and, in response, send a first command signal to the processor.
  • the processor is configured to run the module stored in the memory that is configured to cause the processor to receive the first command signal from the first sensor and send a second command signal to the second sensor based on receiving the first command signal.
  • the second sensor is configured to operate at a sleep mode and switch to an active mode upon receiving the second command signal, and during the active mode the second sensor is configured to determine if the user enters or leaves the target area.
  • Disclosed subject matter includes, in another aspect, a method of determining whether a user enters or leaves a target area using a compound sensor system.
  • the method includes detecting, using a first sensor of the compound sensor system, a parameter that indicates a likelihood of having a user enter or leave a target area, and, in response, sending a first command signal to a processor of the compound sensor system.
  • the method includes sending, from the processor, a second command signal to a second sensor of the compound sensor system based on receiving the first command signal.
  • the method includes switching the second sensor from a sleep mode to an active mode upon receiving the second command signal, and determining, using the second sensor, if the user enters or leaves the target area.
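The two-stage method above can be sketched in a few lines. All class and method names below are illustrative, not taken from the patent; the fine sensor's measurement is stubbed out.

```python
# Minimal sketch of the two-stage detection method: a low-power first
# sensor triggers the processor, which wakes the high-power second
# sensor to confirm. Names are hypothetical illustrations.

class FineSensor:
    """Stands in for the accurate second sensor (second sensor 220)."""
    def __init__(self) -> None:
        self.active = False          # starts in sleep mode

    def wake(self) -> None:          # triggered by the second command signal
        self.active = True

    def sleep(self) -> None:
        self.active = False

    def user_in_target_area(self) -> bool:
        # Placeholder for an accurate measurement (e.g., active ranging).
        return True

class CompoundSensorSystem:
    def __init__(self, fine_sensor: FineSensor) -> None:
        self.fine = fine_sensor

    def on_coarse_trigger(self) -> bool:
        """Called when the first sensor reports a likely enter/leave
        event (the 'first command signal')."""
        self.fine.wake()                        # second command signal
        result = self.fine.user_in_target_area()
        self.fine.sleep()                       # back to low power
        return result

system = CompoundSensorSystem(FineSensor())
print(system.on_coarse_trigger())
```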
  • the present disclosure also discloses computer readable media that include executable instructions (e.g., computer program of instructions) operable to cause a device to perform the functions of the apparatuses described above.
  • FIG. 1 illustrates an environment for detecting a user's presence and recording the user's activity in a gym according to certain embodiments of the disclosed subject matter.
  • FIG. 2 illustrates a block diagram of a sensor system according to certain embodiments of the present disclosure.
  • FIG. 3 shows detection ranges of the sensor system according to certain embodiments of the present disclosure.
  • FIG. 4 is a flow chart illustrating a process of detecting whether or not a user enters or leaves a target area of an exercise device according to certain embodiments of the present disclosure.
  • The present disclosure describes a compound sensor system that detects, in a power-efficient way, whether a user is using an exercise device in a gym.
  • the compound sensor includes at least two sensors.
  • One sensor is a low power sensor and can be configured to have a coarse detection.
  • the other sensor is a high power sensor and can be configured to have a fine detection.
  • the high power/fine sensor can normally operate in a low power/sleep mode.
  • when the coarse sensor detects a possible object and/or motion, the fine sensor is switched on from the sleep mode to confirm whether the object and/or the motion is indeed in its proximity. Overall power consumption of the compound sensor system is reduced by utilizing the fine/high power sensor only when necessary.
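The power saving from duty-cycling the fine sensor can be estimated with a weighted average. Only the 0.5 mA fine-sensor figure comes from the text; the coarse-sensor current and the 1% duty cycle are assumptions for the sketch.

```python
# Illustrative average-current estimate for the compound scheme.
coarse_current_ma = 0.01   # always-on low-power sensor (assumed 10 µA)
fine_current_ma = 0.5      # high-power sensor while active (from the text)
fine_duty_cycle = 0.01     # fine sensor awake ~1% of the time (assumed)

# Average current: coarse sensor always on, fine sensor duty-cycled.
avg_ma = coarse_current_ma + fine_current_ma * fine_duty_cycle

# Life of a 1000 mAh AA cell at that average draw, in years.
battery_life_years = 1000.0 / avg_ma / 24.0 / 365.0

print(round(avg_ma, 3), round(battery_life_years, 1))
```

Under these assumed numbers the average draw falls from 0.5 mA to 0.015 mA, stretching the same AA cell from a few months to several years, comfortably past the year-plus requirement mentioned earlier.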
  • FIG. 1 illustrates an environment 100 for detecting a user's presence and recording the user's activity in a gym according to certain embodiments of the disclosed subject matter.
  • the environment 100 can include a communication network 102, a server 104, a sensor system 106, a local network storage medium 108, a remote network storage medium 110, a mobile device 112, a wireless network 114, a camera 116, and a registration system 118.
  • Some or all components of the environment 100 can be coupled directly or indirectly to the communication network 102.
  • the components included in the environment 100 can be further broken down into more than one component and/or combined together in any suitable arrangement.
  • the sensor system 106 and the camera 116 can be combined as one device.
  • the environment 100 can include more than one sensor system 106, more than one camera 116, and/or more than one mobile device 112.
  • the environment 100 can also include a tracking device.
  • the sensor system 106 can be attached to an exercise device.
  • exercise devices include treadmills, ellipticals, exercise bikes, rowing machines, stair climbers, weightlifting benches, weight machines, etc.
  • the sensor system 106 can be attached to an exercise device non-intrusively.
  • the sensor system 106 can be taken off from one exercise device and attached to another exercise device.
  • the sensor system 106 can be configured to communicate wirelessly with at least one mobile device 112, the server 104, and/or other suitable components of the environment 100.
  • the sensor system 106 can detect when a user or his or her mobile device 112 enters or leaves a target area of the sensor system 106 and notify other components of the environment 100, such as the mobile device 112, via the wireless network 114 and/or the communication network 102.
  • the target area can be above the base of the treadmill.
  • the sensor system 106 can sense or detect movements of an exercise device and/or the user using the exercise device.
  • the sensor system 106 can report the detection result to and/or trigger other components of the environment 100.
  • the structure and function of the sensor system 106 are described in more detail below.
  • the mobile device 112 can be connected to the sensor system 106 via the wireless network 114.
  • the mobile device 112 can also be configured to communicate wirelessly with the server 104 and/or other suitable components of the environment 100 via the wireless network 114 and/or the communication network 102.
  • the mobile device can be a tablet computer, a personal digital assistant (PDA), a pager, a mobile or smart phone, a wireless sensor, a wearable device, or any other suitable device.
  • the communication network 102 can include a network or combination of networks that can accommodate private data communication.
  • the communication network 102 can include a local area network (LAN), a virtual private network (VPN) coupled to the LAN, a private cellular network, a private telephone network, a private computer network, a private packet switching network, a private line switching network, a private wide area network (WAN), a corporate network, or any number of private networks that can be referred to as an Intranet.
  • FIG. 1 shows the communication network 102 as a single network; however, the communication network 102 can include multiple interconnected networks listed above.
  • the server 104 can be a single server, a network of servers, or a farm of servers in a data center.
  • the server 104 can be coupled to a network storage system.
  • the network storage system can include two types of network storage devices: a local network storage medium 108 and a remote network storage medium 110.
  • the local network storage medium 108 and the remote network storage medium 110 can each include at least one physical, non-transitory storage medium, flash memory, a magnetic disk drive, an optical drive, a programmable read-only memory (PROM), a read-only memory (ROM), or any other memory or combination of memories.
  • the local network storage medium 108 and the remote network storage medium 110 can be part of the server 104 or can be separated from the server 104.
  • the server 104 can be located within or near a gym or a fitness center. In some embodiments, the server 104 can be located at a remote location. In some embodiments, the server 104 can also include a gateway and/or an access point to direct any signals received from the sensor system 106, the mobile device 112, and/or other components of the environment 100.
  • In some embodiments, the server 104 manages a database of the registered gym members including registered faces gathered from the registration system 118. In some embodiments, the server 104 also stores the face images captured from the camera 116 and performs face recognition.
  • the server 104 manages and stores user exercise data, which is collected by the exercise device with embedded sensors or by sensors attached to the exercise device. In some embodiments, the server 104 stores the exercise data in association with respective users, which can be identified by the face recognition process.
  • if the server 104 determines that the image quality of the face image is not good enough for recognition, it sends commands back to the camera 116 to retake one or more photos and/or video clips.
  • the server 104 may offload some of its computing and/or storage tasks to one or more gateways, as described below.
  • the environment 100 may also include one or more gateways that are separate from the server 104. Multiple gateways can be deployed in one gym. In one embodiment, one or more gateways can be used as a communication hub to connect the camera 116 and/or other components of the environment 100 to the server 104.
  • a gateway can also help share the load of computing and reduce data storage required from the server 104.
  • the advantages include, among others, faster response time and lower cloud computing cost.
  • a gateway detects faces from one or more photos and/or video clips taken by the camera 116, extracts the face features from the photos, and transmits the extracted features together with the photos to the server 104 for face recognition and image storage.
  • the gateway detects faces from the one or more photos and/or video clips taken by the camera 116, extracts the face features from the photos, and performs face recognition locally.
  • the server 104 only stores the photos received from the gateway. If the gateway determines that the image quality is not good enough for face recognition, it sends commands to the camera module to retake one or more photos and restarts the face recognition process.
  • face recognition tasks can be partitioned and shared between the gateway and the server 104, and the partitioning and sharing can be arranged or rearranged dynamically to meet the face recognition system requirements.
  • the camera 116 can be attached to an exercise device. In some embodiments, the camera 116 can be attached to an exercise device non-intrusively. In some embodiments, the camera 116 can be taken off from one exercise device and attached to another exercise device. In some embodiments, the camera 116 can be configured to communicate wirelessly with at least one sensor system 106, at least one mobile device 112, the server 104, and/or other suitable components of the environment 100. In some embodiments, the camera 116 can detect when a user starts to use the exercise device that the camera 116 is attached to and start to acquire one or more photos and/or video clips that contain sufficient facial information of one or more users that are near the camera 116. In some embodiments, each exercise device in a gym will have a dedicated camera 116. In some embodiments, one or more exercise devices can share one camera 116.
  • the registration system 118 is typically located near or at the entrance of a facility; for example, the registration system 118 can be located near or at the entrance of a gym. In some embodiments, when a user enters or leaves a gym, he or she will be registered by the registration system 118. In some embodiments, the registration system 118 also includes a camera, which can be configured to acquire one or more photos and/or video clips of a user who signs in at the gym. In some embodiments, each user may register his or her face multiple times, which in general improves the performance of face recognition algorithms. When a registered user walks in the gym and/or starts on an exercise device, face images of the user captured by the camera 116 associated with the exercise device will be compared against registered faces to identify the correct user.
  • Validation criteria can include one or more of the following: (1) whether the user has a valid membership, and (2) whether the face images captured at the registration system 118 contain sufficient information for recognition purposes.
  • his or her face information can be acquired by one of the following embodiments or any combinations of the following embodiments.
  • the user's face information can be acquired by the camera associated with the registration system 118.
  • the user's face information can be retrieved from the gym's member management system, where previously taken photos of gym members can be stored.
  • the user's face images can be acquired from mobile applications running on the user's mobile device 112 and/or other suitable devices associated with the user.
  • the sensor system 106, the camera 116, the mobile device 112, and/or other components of the environment 100 can communicate with each other through the wireless connection 114.
  • the wireless connection can be WiFi, ZigBee, IEEE 802.15.4, Bluetooth, near field communication (NFC), or another connection using any other suitable wireless protocol standard or combination of standards.
  • the wireless connection 114 can be the same as the communication network 102. In some embodiments, the wireless connection 114 can be different from the communication network 102.
  • FIG. 2 illustrates a block diagram of a sensor system 106 according to certain embodiments of the present disclosure.
  • the sensor system 106 includes a first sensor 210, a second sensor 220, a wireless transceiver 230, a processor 240, a memory 250, a module 260, and a power supply 270.
  • the components included in the sensor system 106 can be further broken down into more than one component and/or combined together in any suitable arrangement.
  • the first sensor 210 can include one or more sensors.
  • the second sensor 220 can include one or more sensors.
  • one or more components can be rearranged, changed, added, and/or removed.
  • the first sensor 210 is configured to detect a parameter that indicates a likelihood of having a user enter or leave a target area, and, in response, send a first command signal to the processor 240.
  • the target area is a specific space of interest that indicates whether a user is using an exercise device.
  • the first sensor 210 is or includes a low power coarse proximity sensor with a coarse detection range, and the parameter to be detected by the first sensor 210 is based on a user entering or leaving the coarse detection range of the coarse proximity sensor.
  • the coarse proximity sensor can be a passive infrared sensor and/or any other suitable sensor.
  • the first sensor 210 is or includes a motion sensor with a detection range, and the parameter to be detected by the first sensor 210 is based on detecting a change of motions and/or vibration within the detection range of the motion sensor.
  • the motion sensor can be an accelerometer and/or any other suitable motion sensor.
  • the first sensor 210 is or includes a temperature sensor with a detection range, and the parameter to be detected by the first sensor 210 is based on detecting a change of temperatures within the detection range of the temperature sensor.
  • the temperature sensor can be an infrared thermopile sensor and/or any other suitable temperature sensor.
  • the first sensor 210 can include more than one type of sensor, such as a proximity sensor (for example, a passive infrared sensor), an ambient light sensor, a photoelectric sensor, an ultrasonic sensor, a time of flight distance sensor, a thermopile sensor, or any other suitable sensors or combination of sensors.
  • the second sensor 220 is configured to more accurately determine whether or not a user enters or leaves a detection area.
  • the second sensor 220 can be configured to send detection results to the processor 240 and/or other suitable components.
  • the second sensor 220 is or includes a fine proximity sensor with a fine detection range, and the fine proximity sensor determines if the user enters or leaves the target area based on detecting if the user enters or leaves the fine detection range.
  • the fine proximity sensor is an active ranging sensor, which measures distance by emitting waves and calculating the distance based on the arrival time of the reflected waves.
  • the fine proximity sensor includes an infrared ranging sensor, an ultrasonic proximity sensor, and/or any other suitable sensor.
  • the second sensor 220 is or includes a motion sensor with a detection range, and the parameter to be detected by the second sensor 220 is based on detecting a change of motions and/or vibration within the detection range of the motion sensor.
  • the motion sensor can be an accelerometer and/or any other suitable motion sensor.
  • the second sensor 220 is or includes a temperature sensor with a detection range, and the parameter to be detected by the second sensor 220 is based on detecting a change of temperatures within the detection range of the temperature sensor.
  • the temperature sensor can be an infrared thermopile sensor and/or any other suitable temperature sensor.
  • the second sensor 220 can include more than one type of sensor, such as a proximity sensor (for example, a passive infrared sensor), an ambient light sensor, a photoelectric sensor, an ultrasonic sensor, a time of flight distance sensor, a thermopile sensor, or any other suitable sensors or combination of sensors.
  • the second sensor 220 has a detection range that is smaller than the detection range of the first sensor 210, but the second sensor 220 can more accurately detect whether or not a user enters or leaves the detection range of the second sensor 220.
  • the wireless transceiver 230 can be configured to transmit any detection results of the sensor system 106 to the mobile device 112, the gateway, the server 104, and/or any other components of the environment 100. In some embodiments, the wireless transceiver 230 can also be configured to receive signals from one or more components of the environment 100. In some embodiments, the wireless transceiver 230 can enable communication with other components of the environment 100 via the wireless network 114. In some embodiments, the wireless transceiver 230 can be used as the interface among various components of the sensor system 106.
  • the processor 240 can include one or more cores and can accommodate one or more threads to run various applications and modules. Software can run on the processor 240, which is capable of executing computer instructions or computer code. The processor 240 might also be implemented in hardware using an application specific integrated circuit (ASIC), a programmable logic array (PLA), a field programmable gate array (FPGA), or any other integrated circuit.
  • the memory 250 can be a non-transitory computer readable medium, flash memory, a magnetic disk drive, an optical drive, a PROM, a ROM, or any other memory or combination of memories.
  • the processor 240 can be configured to run the module 260 stored in the memory 250 that is configured to cause the processor 240 to perform various steps that are discussed in the disclosed subject matter.
  • the processor 240 can be configured to receive the first command signal from the first sensor 210 when the first sensor 210 detects a parameter that indicates a likelihood of having a user enter or leave a target area.
  • the processor 240 can be configured to send a second command signal to the second sensor 220 based on receiving the first command signal.
  • the processor is preferably a low power processor.
  • the processor 240 can operate in an always-on mode.
  • the processor 240 can be configured to be in sleep mode and is only switched to an active mode upon receiving the first command signal from the first sensor 210.
  • the power supply 270 provides power to one or more other components of the sensor system 106.
  • the power supply 270 can be a battery source.
  • the power supply 270 can provide alternating current (AC) and/or direct current (DC) power via an external power source.
  • each of the first sensor 210 and the second sensor 220 has its own power supply.
  • the first sensor 210 is designed to consume less than a few hundred microamps of current.
  • the second sensor 220 generally consumes more power than the first sensor 210.
  • the first sensor 210 can also serve as a power control/power switch for the second sensor 220. For example, only when the first sensor 210 detects a likelihood that a user enters or leaves a target area is the second sensor 220 switched on to more accurately determine whether the user enters or leaves the target area.
  • the second sensor 220 stays in a low power or sleep mode most of the time, and only wakes up to operate when it receives commands from the processor 240.
  • the sleep mode and the low power mode are used interchangeably to refer to a mode that consumes less power than the active mode.
  • the second sensor 220 can also be programmed to be active periodically with a preset timer. When active, it functions for a certain period of time and then goes back to sleep mode. In some embodiments, the second sensor 220 can still receive commands from the processor 240 and/or other components of the sensor system 106 during the sleep mode.
  • the first sensor 210 can be configured in an always-on mode.
  • the first sensor 210 can be configured to have an active or high power mode and a sleep or low power mode, and the first sensor 210 can periodically transition between these two modes.
  • the first sensor 210 is configured to have a high duty cycle so that it can stay in the high power/active mode more often.
  • processor 240 can periodically send a wake up signal to the first sensor 210 to force the first sensor 210 to be in the high power/ active mode.
  • the sensor system 106 can be built as an integrated circuit. In some embodiments, the sensor system 106 can be built as a discrete circuit, and one or more components of the sensor system 106 can be built from commercially available components. In some embodiments, the processor 240 can be a standalone component. In some embodiments, the processor 240 can be embedded in the first sensor 210 and/or the second sensor 220.
  • FIG. 3 shows detection ranges of the sensor system 106 according to certain embodiments of the present disclosure.
  • the sensor system 106 is attached to a treadmill 310.
  • the sensor system 106 includes a first sensor 210 and a second sensor 220.
  • area 320 represents the detection range of the first sensor 210
  • area 330 represents the detection range of the second sensor 220.
  • the target area of the sensor system 106 is the space on top of the belt of the treadmill 310.
  • although the target area in FIG. 3 is shown to be the same as area 330, which is the detection range of the second sensor 220, the target area can differ from the detection range of the second sensor 220 in other cases.
  • the detection range of the first sensor 210 usually overlaps with the detection range of the second sensor 220, but the detection range of the first sensor 210 can cover more area than the target area and the detection range of the second sensor 220.
  • the detection range of the first sensor 210 can be larger than the detection range of the second sensor 220 because the first sensor 210 is designed to be a coarse and low power sensor, so it can sometimes respond to activity outside the target area.
  • when the first sensor 210 is an accelerometer, it may respond to vibration sources coming from any direction, not necessarily within the target area of the exercise device.
  • FIG. 4 is a flow chart illustrating a process 400 of detecting whether or not a user enters or leaves a target area of an exercise device according to certain embodiments of the present disclosure.
  • the process 400 is mainly illustrated from the perspective of the components of the sensor system 106.
  • the process 400 can be modified by, for example, having steps rearranged, changed, added, and/or removed.
  • the first sensor 210 is configured to detect a parameter that indicates a likelihood of having a user enter or leave a target area.
  • the target area is a specific space of interest that indicates whether a user is using an exercise device.
  • the first sensor 210 is or includes a low power coarse proximity sensor with a coarse detection range, and the parameter to be detected by the first sensor 210 is based on a user entering or leaving the coarse detection range of the coarse proximity sensor.
  • the coarse proximity sensor can be a passive infrared sensor and/ or any other suitable sensor.
  • the first sensor 210 is or includes a motion sensor with a detection range, and the parameter to be detected by the first sensor 210 is based on detecting a change of motions and/or vibration within the detection range of the motion sensor.
  • the motion sensor can be an accelerometer and/or any other suitable motion sensor.
  • the first sensor 210 is or includes a temperature sensor with a detection range, and the parameter to be detected by the first sensor 210 is based on detecting a change of temperatures within the detection range of the temperature sensor.
  • the temperature sensor can be an infrared thermopile sensor and/or any other suitable temperature sensor.
  • the first sensor 210 can include more than one type of sensor, such as a proximity sensor (for example, a passive infrared sensor), an ambient light sensor, a photoelectric sensor, an ultrasonic sensor, a time of flight distance sensor, a thermopile sensor, or any other suitable sensors or combination of sensors.
  • the first sensor 210, in response to detecting a parameter that indicates a likelihood of having a user enter or leave a target area, sends a first command signal to the processor 240.
  • the process 400 then proceeds to step 406.
  • the processor 240 receives the first command signal from the first sensor 210 and sends a second command signal to the second sensor 220.
  • the processor 240 can normally operate in a low power mode and switch to a high power/active mode upon receiving the first command signal from the first sensor 210.
  • the process 400 then proceeds to step 408.
  • the second sensor 220, which is normally in a low power/sleep mode, is switched to an active mode upon receiving the second command signal from the processor 240.
  • the process 400 then proceeds to step 410.
  • the second sensor 220 determines if the user enters or leaves the target area of the exercise device that the sensor system 106 is attached to.
  • the process 400 then proceeds to step 412.
  • the second sensor 220 is or includes a fine proximity sensor with a fine detection range, and the fine proximity sensor determines if the user enters or leaves the target area based on detecting if the user enters or leaves the fine detection range.
  • the fine proximity sensor is an active ranging sensor, which measures distance by emitting waves and calculating the distance based on the arrival time of the reflected waves.
  • the fine proximity sensor includes an infrared ranging sensor, an ultrasonic proximity sensor, and/or any other suitable sensor.
  • the second sensor 220 is or includes a motion sensor with a detection range, and the parameter to be detected by the second sensor 220 is based on detecting a change of motions and/or vibration within the detection range of the motion sensor.
  • the motion sensor can be an accelerometer and/or any other suitable motion sensor.
  • the second sensor 220 is or includes a temperature sensor with a detection range, and the parameter to be detected by the second sensor 220 is based on detecting a change of temperatures within the detection range of the temperature sensor.
  • the temperature sensor can be an infrared thermopile sensor and/or any other suitable temperature sensor.
  • the second sensor 220 can include more than one type of sensor, such as a proximity sensor (for example, a passive infrared sensor), an ambient light sensor, a photoelectric sensor, an ultrasonic sensor, a time of flight distance sensor, a thermopile sensor, or any other suitable sensors or combination of sensors.
  • the second sensor 220 has a detection range that is smaller than the detection range of the first sensor 210, but the second sensor 220 can more accurately detect whether or not a user enters or leaves the detection range of the second sensor 220.
  • the second sensor 220 sends the detection result back to the processor 240.
  • the sensor system 106 can send that information to other components of the environment 100.
  • one or more components of the environment 100 can use that information to start or finish recording the user's exercise data and/or the exercise device's operation data.
  • one or more components of the environment 100 can use that information to start or finish its operation.
  • one or more components of the environment 100 can use that information to toggle between different power modes, such as between an active mode and a low power mode.
  • the sensor system 106 can send battery information, such as a brownout event, to the gateway of the server 104, so that gym operators can be timely informed to replace the battery of the sensor system.
  • the sensor system 106 can periodically report its run-time status and statistics to the gateway, for book-keeping and diagnosis purposes of the sensor system 106 and/or the exercise device.
  • the sensor system 106 can receive commands from the gateway, such as flashing an LED included in the sensor system 106 to identify itself, so that a gym operator can easily identify the sensor system 106.
  • the server 104 may provide a front-end user interface (UI), such as a website, a dedicated PC, or a mobile application, for gym operators and/or trainers to access the users' exercise activities, so that proper guidance, advice, and/or training can be provided to the users.
  • a mobile and/or web user interface can also be provided to users on mobile devices, so that they can monitor and track their exercise activities, as described above.
  • a user's detailed exercise information is collected and stored in the server 104.
  • the information includes, but is not limited to, start/end time and date, equipment type, duration, sets and repeats (for pin-loaded equipment, workbenches, and power racks), and break intervals in all sessions recorded by the mobile device 112 and/or the sensor system 106 associated with the exercise device.
  • the data can be organized and displayed in many ways through the front-end user interface (UI).
  • the aggregated data of all members collected through mobile devices 112 can be combined to track the equipment usage, improve operation efficiency of gyms, and provide more insights to optimize members' exercise routines.
  • the same type of equipment can be grouped together. For a certain group, its total number of visiting members, total number of visits, and total operation time can be compared against those of other groups. If one group has significantly more users than another group, the gym can look into the scenarios and decide which groups need more or fewer pieces of equipment.
  • individual equipment can be compared against others of the same type, particularly when they are physically close. If one specific exercise device always has fewer member accesses than others, or none at all, the gym operators may be informed to check the device. This may indicate that the exercise device has certain issues, such as a defect, being close to an environment that is not user-friendly, or something else that needs the gym operators' attention.
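The group-level comparisons described in the last two items amount to simple aggregation over per-session records. The sketch below is a hypothetical illustration — the record layout, field names, and sample data are assumptions for demonstration, not the actual schema used by the server 104:

```python
from collections import defaultdict

# Hypothetical per-session records as the server might store them:
# (equipment_type, device_id, member_id, minutes)
sessions = [
    ("treadmill", "t1", "m1", 30),
    ("treadmill", "t1", "m2", 20),
    ("treadmill", "t2", "m1", 25),
    ("rowing",    "r1", "m3", 15),
]

def usage_by_group(records):
    """Aggregate total visits, unique members, and total operation
    minutes per equipment type, as a basis for comparing groups and
    deciding where to add or remove equipment."""
    stats = defaultdict(lambda: {"visits": 0, "members": set(), "minutes": 0})
    for etype, _device, member, minutes in records:
        group = stats[etype]
        group["visits"] += 1
        group["members"].add(member)
        group["minutes"] += minutes
    # Replace the member sets with counts for a compact summary.
    return {etype: {"visits": g["visits"],
                    "members": len(g["members"]),
                    "minutes": g["minutes"]}
            for etype, g in stats.items()}

# treadmill: 3 visits by 2 members, 75 minutes; rowing: 1 visit, 15 minutes.
print(usage_by_group(sessions))
```

Per-device comparisons within a group follow the same pattern, keyed on `device_id` instead of `equipment_type`.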

Abstract

A compound sensor system includes a first sensor, a second sensor, a memory that stores a module, and a processor coupled to the first sensor, the second sensor, and the memory. The first sensor is configured to detect a parameter that indicates a likelihood of having a user enter or leave a target area, and, in response, send a first command signal to the processor. The processor is configured to receive the first command signal from the first sensor and send a second command signal to the second sensor based on receiving the first command signal. The second sensor is configured to operate at a sleep mode and switch to an active mode upon receiving the second command signal, and during the active mode the second sensor is configured to determine if the user enters or leaves the target area.

Description

SYSTEMS AND METHODS FOR A COMPOUND SENSOR SYSTEM
RELATED APPLICATION
[0001] This application claims benefit under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/305,504, filed on March 8, 2016, which is explicitly incorporated by reference herein in its entirety. This application claims benefit under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/310,524, filed on March 18, 2016, which is explicitly incorporated by reference herein in its entirety. This application also claims benefit under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 62/314,993, filed on March 30, 2016, which is explicitly incorporated by reference herein in its entirety. This application also relates to U.S. Patent Application No. 15/262,494, filed on September 12, 2016, which is incorporated herein in its entirety. This application also relates to U.S. Patent Application No. 15/331,238, filed on October 21, 2016, which is incorporated herein in its entirety. This application also relates to U.S. Patent Application No.
15/448,194, filed on March 2, 2017, which is incorporated herein in its entirety.
BACKGROUND OF THE INVENTION
Technical Field
[0002] This invention relates generally to the field of low power sensor systems.
Description of the Related Art
[0003] Proximity sensors, such as touch screens for cellphones and touch switches for lights, are widely used in today's consumer and industrial electronics. A proximity sensor typically functions by either (1) responding to a target object or an object's motion when the object is within the sensing range (e.g., a touch screen), or (2) directly detecting the distance between the object and the sensor (e.g., an infrared ranging sensor).
[0004] In the application of proximity sensors, the range of interest between the sensor and the target object can be either explicitly or implicitly specified. For example, a touch light switch typically functions only when a hand is placed within around 10 centimeters of the sensor. A three-dimensional (3D) hand gesture detection sensor for smart phones also works in similar ranges. Often, the proximity zone is directional, i.e., objects or motions can only be detected in front of the proximity sensor.
[0005] To measure the distance of an object, proximity sensors can use active emissions. Distance is determined by detecting the reflected emissions from the object. Typical emissions by the proximity sensors include infrared, ultrasonic, or any other suitable electromagnetic signals that can be bounced back from a target object. In some embodiments, the basic working principle of the ranging proximity sensor is similar to radars.
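The ranging principle just described — emit a wave and time its reflection — reduces to simple arithmetic. The helper below is a hypothetical illustration of that calculation, not part of the disclosed system or any particular sensor's API:

```python
def distance_from_echo(round_trip_s: float, wave_speed_m_s: float) -> float:
    """Distance to a target from the round-trip time of a reflected pulse.

    The emitted wave travels to the object and back, so the one-way
    distance is half the round trip. wave_speed_m_s is about 343 m/s
    for ultrasound in air and about 3e8 m/s for infrared light.
    """
    return wave_speed_m_s * round_trip_s / 2.0

# An ultrasonic echo arriving 2 ms after emission implies ~0.34 m:
print(distance_from_echo(0.002, 343.0))
```

The much higher propagation speed of infrared light is one reason optical time-of-flight sensors need far finer timing resolution than ultrasonic ones for the same distance range.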
[0006] Active ranging proximity sensors can accurately sense whether an object is present or a motion happens in its proximity within a well-defined range such as 10 centimeters, 35 centimeters, or any other suitable range. Traditional proximity sensors, however, typically consume more than 0.5 mA in current, which is less suitable for battery-powered systems. For example, one standard AA battery usually has a capacity of around 1000 mAh and can only support an active ranging proximity sensor for a few months. Design of battery-operated sensor systems often requires a battery life that is longer than a year.
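The battery-life arithmetic in the paragraph above can be made concrete. The sketch below uses the figures from this paragraph (a 1000 mAh AA cell and a 0.5 mA sensor); the function name is a hypothetical illustration:

```python
def battery_life_days(capacity_mah: float, avg_current_ma: float) -> float:
    """Runtime in days for a battery of the given capacity discharged
    at a constant average current (capacity / current gives hours)."""
    return capacity_mah / avg_current_ma / 24.0

# A 1000 mAh AA cell driving a 0.5 mA always-on sensor lasts about
# 83 days -- under three months, far short of a one-year design target.
print(battery_life_days(1000.0, 0.5))
```

Reversing the formula, a one-year life on the same cell would require an average draw of roughly 0.11 mA, which motivates the compound coarse/fine sensor arrangement described below.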
[0007] Therefore, it is desirable to provide methods and systems for a sensor system that consumes low power.
SUMMARY
[0008] In accordance with the disclosed subject matter, systems and methods are provided for a low power compound sensor system.
[0009] Disclosed subject matter includes, in one aspect, a compound sensor system that includes a first sensor, a second sensor, a memory that stores a module, and a processor coupled to the first sensor, the second sensor, and the memory. The first sensor is configured to detect a parameter that indicates a likelihood of having a user enter or leave a target area, and, in response, send a first command signal to the processor. The processor is configured to run the module stored in the memory that is configured to cause the processor to receive the first command signal from the first sensor and send a second command signal to the second sensor based on receiving the first command signal. The second sensor is configured to operate at a sleep mode and switch to an active mode upon receiving the second command signal, and during the active mode the second sensor is configured to determine if the user enters or leaves the target area.
[0010] Disclosed subject matter includes, in another aspect, a method of determining whether a user enters or leaves a target area using a compound sensor system. The method includes detecting, using a first sensor of the compound sensor system, a parameter that indicates a likelihood of having a user enter or leave a target area, and, in response, sending a first command signal to a processor of the compound sensor system. The method includes sending, from the processor, a second command signal to a second sensor of the compound sensor system based on receiving the first command signal. The method includes switching the second sensor from a sleep mode to an active mode upon receiving the second command signal, and determining, using the second sensor, if the user enters or leaves the target area.
[0011] The present disclosure also discloses computer readable media that include executable instructions (e.g., computer program of instructions) operable to cause a device to perform the functions of the apparatuses described above.
[0012] There has thus been outlined, rather broadly, the features of the disclosed subject matter in order that the detailed description thereof that follows may be better understood, and in order that the present contribution to the art may be better appreciated. There are, of course, additional features of the disclosed subject matter that will be described hereinafter and which will form the subject matter of the claims appended hereto.
[0013] In this respect, before explaining at least one embodiment of the disclosed subject matter in detail, it is to be understood that the disclosed subject matter is not limited in its application to the details of construction and to the arrangements of the components set forth in the following description or illustrated in the drawings. The disclosed subject matter is capable of other embodiments and of being practiced and carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting.
[0014] As such, those skilled in the art will appreciate that the conception, upon which this disclosure is based, may readily be utilized as a basis for the designing of other structures, methods, and systems for carrying out the several purposes of the disclosed subject matter. It is important, therefore, that the claims be regarded as including such equivalent constructions insofar as they do not depart from the spirit and scope of the disclosed subject matter.
[0015] These together with the other objects of the disclosed subject matter, along with the various features of novelty which characterize the disclosed subject matter, are pointed out with particularity in the claims annexed to and forming a part of this disclosure. For a better understanding of the disclosed subject matter, its operating advantages and the specific objects attained by its uses, reference should be made to the accompanying drawings and descriptive matter in which there are illustrated preferred embodiments of the disclosed subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] Various objects, features, and advantages of the disclosed subject matter can be more fully appreciated with reference to the following detailed description of the disclosed subject matter when considered in connection with the following drawings, in which like reference numerals identify like elements.
[0017] FIG. 1 illustrates an environment for detecting a user's presence and recording the user's activity in a gym according to certain embodiments of the disclosed subject matter.
[0018] FIG. 2 illustrates a block diagram of a sensor system according to certain embodiments of the present disclosure.
[0019] FIG. 3 shows detection ranges of the sensor system according to certain embodiments of the present disclosure.
[0020] FIG. 4 is a flow chart illustrating a process of detecting whether or not a user enters or leaves a target area of an exercise device according to certain embodiments of the present disclosure.
DETAILED DESCRIPTION
[0021] In the following description, numerous specific details are set forth regarding the systems and methods of the disclosed subject matter and the environment in which such systems and methods may operate, etc., in order to provide a thorough understanding of the disclosed subject matter. It will be apparent to one skilled in the art, however, that the disclosed subject matter may be practiced without such specific details, and that certain features, which are well known in the art, are not described in detail in order to avoid complication of the disclosed subject matter. In addition, it will be understood that the examples provided below are exemplary, and that it is contemplated that there are other systems and methods that are within the scope of the disclosed subject matter.
[0022] In some embodiments of the present disclosure, a compound sensor system is disclosed to detect whether a user is using an exercise device in a gym in a power efficient way. In some embodiments, the compound sensor includes at least two sensors. One sensor is a low power sensor and can be configured to have a coarse detection. The other sensor is a high power sensor and can be configured to have a fine detection. To save power, the high power/fine sensor can normally operate at a low power/sleep mode. When the low power/coarse sensor detects an object or a motion, the fine sensor is switched on from the sleep mode to confirm whether the object and/or the motion is indeed in its proximity. Overall power consumption of the compound sensor system is reduced by only utilizing the fine/high power sensor when necessary.
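The coarse/fine gating described in this paragraph can be summarized as a small state machine. The sketch below is illustrative only: the class name, the callback functions, and the polling structure are hypothetical stand-ins, since the disclosure does not specify the firmware implementation:

```python
class CompoundSensor:
    """Coarse low-power sensor gates a fine high-power sensor.

    The fine (high power) sensor stays in sleep mode until the coarse
    (low power) sensor reports possible activity; it then wakes just
    long enough to confirm or reject the event and goes back to sleep.
    """

    def __init__(self, coarse_detect, fine_confirm):
        # coarse_detect() -> bool: cheap, always-on check
        # fine_confirm() -> bool: accurate but power-hungry check
        self.coarse_detect = coarse_detect
        self.fine_confirm = fine_confirm
        self.fine_awake = False

    def poll(self) -> bool:
        """One duty cycle; True only when the fine sensor confirms
        that a user entered or left the target area."""
        if not self.coarse_detect():
            return False            # fine sensor is never woken
        self.fine_awake = True      # analogous to the second command signal
        confirmed = self.fine_confirm()
        self.fine_awake = False     # back to sleep mode
        return confirmed
```

A coarse false positive costs only one brief wake-up of the fine sensor, and a coarse miss keeps the fine sensor asleep; the fine sensor's duty cycle therefore tracks how often the coarse sensor fires, which is what keeps the average power low.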
[0023] FIG. 1 illustrates an environment 100 for detecting a user's presence and recording the user's activity in a gym according to certain embodiments of the disclosed subject matter. The environment 100 can include a communication network 102, a server 104, a sensor system 106, a local network storage medium 108, a remote network storage medium 110, a mobile device 112, a wireless network 114, a camera 116, and a registration system 118. Some or all components of the environment 100 can be coupled directly or indirectly to the communication network 102. The components included in the environment 100 can be further broken down into more than one component and/or combined together in any suitable arrangement. For example, in some embodiments, the sensor system 106 and the camera 116 can be combined as one device. Further, one or more components can be rearranged, changed, added, and/or removed. For example, the environment 100 can include more than one sensor system 106, more than one camera 116, and/or more than one mobile device 112. In some embodiments, the environment 100 can also include a tracking device.
[0024] The sensor system 106 can be attached to an exercise device. Non-limiting examples of exercise devices include treadmills, ellipticals, exercise bikes, rowing machines, stair climbers, weightlifting benches, weight machines, etc. In some embodiments, the sensor system 106 can be attached to an exercise device non-intrusively. In some embodiments, the sensor system 106 can be taken off from one exercise device and attached to another exercise device. The sensor system 106 can be configured to communicate wirelessly with at least one mobile device 112, the server 104, and/or other suitable components of the environment 100. The sensor system 106 can detect when a user or his or her mobile device 112 enters or leaves a target area of the sensor system 106 and notify other components of the environment 100, such as the mobile device 112, via the wireless network 114 and/or the communication network 102. For example, when the sensor system 106 is attached to a treadmill, the target area can be above the base of the treadmill. As another example, when the sensor system 106 is attached to a workbench, the target area can be the area where an exerciser sits or lies. In some embodiments, the sensor system 106 can sense or detect movements of an exercise device and/or the user using the exercise device. In some embodiments, once the sensor system 106 detects a user is using an exercise device, it can report the detection result to and/or trigger other components of the environment 100. The structure and function of the sensor system 106 are described in more detail below.
[0025] The mobile device 112 can be connected to the sensor system 106 via the wireless network 114. In some embodiments, the mobile device 112 can also be configured to communicate wirelessly with the server 104 and/or other suitable components of the environment 100 via the wireless network 114 and/or the communication network 102. The mobile device can be a tablet computer, a personal digital assistant (PDA), a pager, a mobile or smart phone, a wireless sensor, a wearable device, or any other suitable device.
[0026] The communication network 102 can include a network or combination of networks that can accommodate private data communication. For example, the communication network 102 can include a local area network (LAN), a virtual private network (VPN) coupled to the LAN, a private cellular network, a private telephone network, a private computer network, a private packet switching network, a private line switching network, a private wide area network (WAN), a corporate network, or any number of private networks that can be referred to as an Intranet. Such networks may be implemented with any number of hardware and software components, transmission media and network protocols. FIG. 1 shows the communication network 102 as a single network; however, the communication network 102 can include multiple interconnected networks listed above.
[0027] The server 104 can be a single server, a network of servers, or a farm of servers in a data center. The server 104 can be coupled to a network storage system. The network storage system can include two types of network storage devices: a local network storage medium 108 and a remote network storage medium 110. The local network storage medium 108 and the remote network storage medium 110 can each include at least one physical, non-transitory storage medium, flash memory, a magnetic disk drive, an optical drive, a programmable read-only memory (PROM), a read-only memory (ROM), or any other memory or combination of memories. The local network storage medium 108 and the remote network storage medium 110 can be part of the server 104 or can be separated from the server 104.
[0028] In some embodiments, the server 104 can be located within or near a gym or a fitness center. In some embodiments, the server 104 can be located at a remote location. In some embodiments, the server 104 can also include a gateway and/or an access point to direct any signals received from the sensor system 106, the mobile device 112, and/or other components of the environment 100. [0029] In some embodiments, the server 104 manages a database of the registered gym members including registered faces gathered from the registration system 118. In some embodiments, the server 104 also stores the face images captured from the camera 116 and performs face recognition.
[0030] In some embodiments, the server 104 manages and stores user exercise data, which is collected by the exercise device with embedded sensors or by sensors attached to the exercise device. In some embodiments, the server 104 stores the exercise data in association with respective users, which can be identified by the face recognition process.
[0031] In some embodiments, if during the face recognition process, the server 104 determines that the image quality of the face image is not good enough for recognition, it sends commands back to the camera 116 to retake one or more photos and/or video clips.
[0032] In some embodiments, the server 104 may offload some of its computing and/or storage tasks to one or more gateways, as described below.
[0033] In some embodiments, the environment 100 may also include one or more gateways that are separate from the server 104. Multiple gateways can be deployed in one gym. In one embodiment, one or more gateways can be used as a communication hub to connect the camera 116 and/or other components of the environment 100 to the server 104.
[0034] In some embodiments, besides serving as the communication hub between the camera 116 and/or other components of the environment 100 on one end and the server 104 on the other end, a gateway can also help share the computing load and reduce the data storage required of the server 104. The advantages include, among others, faster response time and lower cloud computing cost.
[0035] In some embodiments, a gateway detects faces from one or more photos and/or video clips taken by the camera 116, extracts the face features from the photos, and transmits the extracted features together with the photos to the server 104 for face recognition and image storage. [0036] In some embodiments, the gateway detects faces from the one or more photos and/or video clips taken by the camera 116, extracts the face features from the photos, and performs face recognition locally. In this case, the server 104 only stores the photos received from the gateway. If the gateway determines that the image quality is not good enough for face recognition, it sends commands to the camera module to retake one or more photos and restarts the face recognition process.
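The capture-check-retake cycle described above can be sketched as a short loop. The function names, the numeric quality score, and the retake limit below are illustrative stand-ins, not part of the patent's system:

```python
def recognize_with_retakes(capture, quality_of, max_retakes=3, threshold=0.8):
    """Gateway-side sketch: take a photo, score whether it is good enough
    for face recognition, and command a retake otherwise.

    `capture` returns a photo object; `quality_of` scores it in [0, 1].
    Returns (photo, attempts) for the first photo meeting the threshold,
    or (None, attempts) if every retake failed.
    """
    for attempt in range(1, max_retakes + 1):
        photo = capture()
        if quality_of(photo) >= threshold:
            return photo, attempt
    return None, max_retakes

# Simulated camera whose image quality only becomes usable on the third try.
scores = iter([0.4, 0.6, 0.9])
photo, attempts = recognize_with_retakes(lambda: "frame", lambda p: next(scores))
```

In the real system the retake command would travel back over the wireless connection to the camera module rather than being a local function call.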
[0037] Furthermore, face recognition tasks can be partitioned and shared between the gateway and the server 104, and the partitioning and sharing can be arranged or rearranged dynamically to meet the face recognition system requirements.
[0038] The camera 116 can be attached to an exercise device. In some embodiments, the camera 116 can be attached to an exercise device non-intrusively. In some embodiments, the camera 116 can be taken off from one exercise device and attached to another exercise device. In some embodiments, the camera 116 can be configured to communicate wirelessly with at least one sensor system 106, at least one mobile device 112, the server 104, and/or other suitable components of the environment 100. In some embodiments, the camera 116 can detect when a user starts to use the exercise device that the camera 116 is attached to and start to acquire one or more photos and/or video clips that contain sufficient facial information of one or more users that are near the camera 116. In some embodiments, each exercise device in a gym will have a dedicated camera 116. In some embodiments, one or more exercise devices can share one camera 116.
[0039] The registration system 118 is typically located near or at the entrance of a facility; for example, the registration system 118 can be located near or at the entrance of a gym. In some embodiments, when a user enters or leaves a gym, he or she will be registered by the registration system 118. In some embodiments, the registration system 118 also includes a camera, which can be configured to acquire one or more photos and/or video clips of a user who signs in at the gym. In some embodiments, each user may register his or her face multiple times, which in general improves the performance of face recognition algorithms. When a registered user walks in the gym and/or starts on an exercise device, face images of the user captured by the camera 116 associated with the exercise device will be compared against registered faces to identify the correct user. [0040] In some embodiments, during the face registration, registered faces need to be validated by the registration system 118 and/or other suitable components of the environment 100. Validation criteria can include one or more of the following: (1) whether the user has a valid membership, and (2) whether the face images captured at the registration system 118 contain sufficient information for recognition purposes.
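The two validation criteria in paragraph [0040] amount to a simple conjunction. A minimal sketch, where the numeric quality score and its threshold are hypothetical stand-ins for "sufficient information for recognition purposes":

```python
def validate_registration(has_valid_membership: bool, face_quality: float,
                          min_quality: float = 0.8) -> bool:
    """Registration is valid only if both criteria from the text hold:
    (1) a valid membership, and (2) face images with sufficient
    information, modeled here as a quality score in [0, 1]."""
    return has_valid_membership and face_quality >= min_quality
```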
[0041] Each time a user registers at the registration system 118, his or her face information, such as photos or video clips, can be acquired by one of the following embodiments or any combinations of the following embodiments. In one embodiment, the user's face information can be acquired by the camera associated with the registration system 118. In one embodiment, the user's face information can be retrieved from the gym's member management system, where previously taken photos of gym members can be stored. In one embodiment, the user's face images can be acquired from mobile applications running on the user's mobile device 112 and/or other suitable devices associated with the user.
[0042] In some embodiments, the sensor system 106, the camera 116, the mobile device 112, and/or other components of the environment 100 can communicate with each other through the wireless connection 114. The wireless connection can be WiFi, ZigBee, IEEE 802.15.4, Bluetooth, near field communication (NFC), or another connection using any other suitable wireless protocol standard or combination of standards. In some embodiments, the wireless connection 114 can be the same as the communication network 102. In some embodiments, the wireless connection 114 can be different from the communication network 102.
[0043] FIG. 2 illustrates a block diagram of a sensor system 106 according to certain embodiments of the present disclosure. The sensor system 106 includes a first sensor 210, a second sensor 220, a wireless transceiver 230, a processor 240, a memory 250, a module 260, and a power supply 270. The components included in the sensor system 106 can be further broken down into more than one component and/or combined together in any suitable arrangement. For example, the first sensor 210 can include one or more sensors. Similarly, the second sensor 220 can include one or more sensors. Further, one or more components can be rearranged, changed, added, and/or removed. [0044] The first sensor 210 is configured to detect a parameter that indicates a likelihood of having a user enter or leave a target area, and, in response, send a first command signal to the processor 240. As discussed above, generally the target area is a specific space of interest that indicates whether a user is using an exercise device.
[0045] In some embodiments, the first sensor 210 is or includes a low power coarse proximity sensor with a coarse detection range, and the parameter to be detected by the first sensor 210 is based on a user entering or leaving the coarse detection range of the coarse proximity sensor. In some embodiments, the coarse proximity sensor can be a passive infrared sensor and/or any other suitable sensor.
[0046] In some embodiments, the first sensor 210 is or includes a motion sensor with a detection range, and the parameter to be detected by the first sensor 210 is based on detecting a change of motion and/or vibration within the detection range of the motion sensor. In some embodiments, the motion sensor can be an accelerometer and/or any other suitable motion sensor.
[0047] In some embodiments, the first sensor 210 is or includes a temperature sensor with a detection range, and the parameter to be detected by the first sensor 210 is based on detecting a change of temperature within the detection range of the temperature sensor. In some embodiments, the temperature sensor can be an infrared thermopile sensor and/or any other suitable temperature sensor.
[0048] In some embodiments, the first sensor 210 can include more than one type of sensor, such as a proximity sensor (for example, a passive infrared sensor), an ambient light sensor, a photoelectric sensor, an ultrasonic sensor, a time of flight distance sensor, a thermopile sensor, or any other suitable sensors or combination of sensors.
[0049] The second sensor 220 is configured to more accurately determine whether or not a user enters or leaves a detection area. The second sensor 220 can be configured to send detection results to the processor 240 and/or other suitable components. In some embodiments, the second sensor 220 is or includes a fine proximity sensor with a fine detection range, and the fine proximity sensor determines if the user enters or leaves the target area based on detecting if the user enters or leaves the fine detection range. In some embodiments, the fine proximity sensor is an active ranging sensor, which measures distance by emitting waves and calculating the distance based on the arrival time of the reflected waves. In some embodiments, the fine proximity sensor includes an infrared ranging sensor, an ultrasonic proximity sensor, and/or any other suitable sensor.
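The active ranging principle described above — emit a wave and convert the round-trip time of its reflection into a distance — reduces to halving speed multiplied by time. A minimal sketch, using the speed of sound in air as the default, as an ultrasonic proximity sensor would (the helper name and default are illustrative, not from the patent):

```python
def echo_distance(round_trip_s: float, wave_speed_m_s: float = 343.0) -> float:
    """Distance to a target from the round-trip time of a reflected wave.

    The wave travels to the target and back, so the one-way distance is
    half of speed * time. The default speed is sound in air (~343 m/s).
    """
    return wave_speed_m_s * round_trip_s / 2.0
```

An infrared ranging sensor would use the same formula with the speed of light, at which point the timing hardware, not the arithmetic, becomes the hard part.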
[0050] In some embodiments, the second sensor 220 is or includes a motion sensor with a detection range, and the parameter to be detected by the second sensor 220 is based on detecting a change of motion and/or vibration within the detection range of the motion sensor. In some embodiments, the motion sensor can be an accelerometer and/or any other suitable motion sensor.
[0051] In some embodiments, the second sensor 220 is or includes a temperature sensor with a detection range, and the parameter to be detected by the second sensor 220 is based on detecting a change of temperature within the detection range of the temperature sensor. In some embodiments, the temperature sensor can be an infrared thermopile sensor and/or any other suitable temperature sensor.
[0052] In some embodiments, the second sensor 220 can include more than one type of sensor, such as a proximity sensor (for example, a passive infrared sensor), an ambient light sensor, a photoelectric sensor, an ultrasonic sensor, a time of flight distance sensor, a thermopile sensor, or any other suitable sensors or combination of sensors.
[0053] In some embodiments, the second sensor 220 has a detection range that is smaller than the detection range of the first sensor 210, but the second sensor 220 can more accurately detect whether or not a user enters or leaves the detection range of the second sensor 220.
[0054] The wireless transceiver 230 can be configured to transmit any detection results of the sensor system 106 to the mobile device 112, the gateway, the server 104, and/or any other components of the environment 100. In some embodiments, the wireless transceiver 230 can also be configured to receive signals from one or more components of the environment 100. In some embodiments, the wireless transceiver 230 can enable communication with other components of the environment 100 via the wireless network 114. In some embodiments, the wireless transceiver 230 can be used as the interface among various components of the sensor system 106. [0055] The processor 240 can include one or more cores and can accommodate one or more threads to run various applications and modules. Software can run on the processor 240, which is capable of executing computer instructions or computer code. The processor 240 might also be implemented in hardware using an application specific integrated circuit (ASIC), a programmable logic array (PLA), a field programmable gate array (FPGA), or any other integrated circuit.
[0056] The memory 250 can be a non-transitory computer readable medium, flash memory, a magnetic disk drive, an optical drive, a PROM, a ROM, or any other memory or combination of memories.
[0057] The processor 240 can be configured to run the module 260 stored in the memory 250 that is configured to cause the processor 240 to perform various steps that are discussed in the disclosed subject matter. For example, the processor 240 can be configured to receive the first command signal from the first sensor 210 when the first sensor 210 detects a parameter that indicates a likelihood of having a user enter or leave a target area. The processor 240 can be configured to send a second command signal to the second sensor 220 based on receiving the first command signal. In some embodiments, the processor 240 is preferably a low power processor. In some embodiments, the processor 240 can operate in an always-on mode. In some embodiments, the processor 240 can be configured to be in a sleep mode and is only switched to an active mode upon receiving the first command signal from the first sensor 210.
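The processor's role in this chain — sleep until the first sensor's command arrives, wake, and pass a second command on to the second sensor — can be sketched as a small event handler. The class and callback names are hypothetical, chosen only for illustration:

```python
class Processor:
    """Sketch of the processor 240's behavior: stay in sleep mode until
    the first command signal arrives, then switch to active mode and
    issue the second command signal to the second sensor."""

    def __init__(self, wake_second_sensor):
        self.mode = "sleep"
        self._wake_second_sensor = wake_second_sensor

    def on_first_command(self):
        self.mode = "active"          # switch out of sleep mode
        self._wake_second_sensor()    # the "second command signal"

# Record what the processor does when the first sensor triggers it.
woken = []
cpu = Processor(lambda: woken.append("second-sensor-active"))
cpu.on_first_command()
```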
[0058] The power supply 270 provides power to one or more other components of the sensor system 106. In some embodiments, the power supply 270 can be a battery source. In some embodiments, the power supply 270 can provide alternating current (AC) and/or direct current (DC) power via an external power source. In some embodiments, each of the first sensor 210 and the second sensor 220 has its own power supply.
[0059] In some embodiments, the first sensor 210 is designed to consume less than a few hundred microamps of current. The second sensor 220 generally consumes more power than the first sensor 210. To save power, in some embodiments, the first sensor 210 can also serve as a power control/power switch for the second sensor 220. For example, only when the first sensor 210 detects that there is a likelihood that a user enters or leaves a target area is the second sensor 220 switched on to more accurately determine whether a user enters or leaves the target area. In some embodiments, the second sensor 220 stays in a low power or sleep mode most of the time, and only wakes up to operate when it receives commands from the processor 240. In some embodiments, the sleep mode and the low power mode mean the same thing, and they refer to a mode that consumes less power than the active mode. Alternatively, the second sensor 220 can also be programmed to be active periodically with a preset timer. When active, it functions for a certain period of time and then goes back to sleep mode. In some embodiments, the second sensor 220 can still receive commands from the processor 240 and/or other components of the sensor system 106 during the sleep mode.
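The gating behavior just described — the second sensor sleeps by default, wakes on command, runs for a fixed window, then returns to sleep — can be sketched with an abstract tick-based timer. All names and the tick granularity are illustrative assumptions, not the patent's implementation:

```python
class GatedSensor:
    """Sketch of the second sensor's power gating: asleep by default,
    active for a fixed number of ticks after each command, then back
    to sleep. Time is modeled as discrete ticks."""

    def __init__(self, active_ticks: int):
        self.active_ticks = active_ticks
        self.remaining = 0  # ticks of active time left

    @property
    def mode(self) -> str:
        return "active" if self.remaining > 0 else "sleep"

    def command(self):
        """The second command signal from the processor wakes the sensor."""
        self.remaining = self.active_ticks

    def tick(self):
        """One unit of elapsed time."""
        if self.remaining > 0:
            self.remaining -= 1

s = GatedSensor(active_ticks=2)
s.command()                 # processor wakes the sensor
modes = []
for _ in range(3):          # observe the mode over three ticks
    modes.append(s.mode)
    s.tick()
```

The same skeleton covers the preset-timer variant: a periodic timer would simply call `command()` on a schedule instead of the processor.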
[0060] In some embodiments, since the first sensor 210 consumes little power, the first sensor 210 can be configured in an always-on mode. In some embodiments, the first sensor 210 can be configured to have an active or high power mode and a sleep or low power mode, and the first sensor 210 can periodically transition between these two modes. In some embodiments, the first sensor 210 is configured to have a high duty cycle so that it can stay in the high power/active mode more often. In some embodiments, when both the first sensor 210 and the second sensor 220 are configured to switch between a high power/active mode and a low power/sleep mode, the first sensor 210 will stay in the high power/active mode longer than the second sensor 220 because the first sensor consumes less power. In some embodiments, the processor 240 can periodically send a wake-up signal to the first sensor 210 to force the first sensor 210 into the high power/active mode.
[0061] In some embodiments, the sensor system 106 can be built as an integrated circuit. In some embodiments, the sensor system 106 can be built as a discrete circuit, and one or more components of the sensor system 106 can be built from commercially available components. In some embodiments, the processor 240 can be a standalone component. In some embodiments, the processor 240 can be embedded in the first sensor 210 and/or the second sensor 220.
[0062] FIG. 3 shows detection ranges of the sensor system 106 according to certain embodiments of the present disclosure. In FIG. 3, the sensor system 106 is attached to a treadmill 310. As discussed above, the sensor system 106 includes a first sensor 210 and a second sensor 220. In FIG. 3, area 320 represents the detection range of the first sensor 210, and area 330 represents the detection range of the second sensor 220. In FIG. 3, the target area of the sensor system 106 is the space on top of the belt of the treadmill 310. Although the target area in FIG. 3 is shown to be the same as area 330, which is the detection range of the second sensor 220, the target area can be different from the detection range of the second sensor 220 in other cases. As shown in FIG. 3, in some embodiments, the detection range of the first sensor 210 usually overlaps with the detection range of the second sensor 220, but the detection range of the first sensor 210 can cover more area than the target area and the detection range of the second sensor 220. The detection range of the first sensor 210 can be larger than the detection range of the second sensor 220 because the first sensor 210 is designed to be a coarse and low power sensor, so it can sometimes respond to activity outside the target area. For example, when the first sensor 210 is an accelerometer, it may respond to vibration sources coming from any direction, not necessarily in the target area of the exercise device.
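The geometric relationship between the two ranges can be illustrated by modeling each as a circle around the sensor: a user on the belt falls inside both ranges, while a passer-by outside the fine range can still trip the coarse sensor. The radii and coordinates below are made-up illustrative values, not measurements from the patent:

```python
import math

def in_range(point, center, radius):
    """True if `point` lies within a circular detection range."""
    return math.dist(point, center) <= radius

COARSE_R, FINE_R = 3.0, 1.0      # coarse range covers more area than fine
CENTER = (0.0, 0.0)              # sensor position

# A user on the treadmill belt is inside both ranges; a passer-by two
# meters away triggers only the coarse first sensor.
on_belt = (0.5, 0.0)
passer_by = (2.0, 0.0)
```

This is exactly why the coarse sensor alone would produce false wake-ups, and why the fine sensor makes the final determination.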
[0063] FIG. 4 is a flow chart illustrating a process 400 of detecting whether or not a user enters or leaves a target area of an exercise device according to certain embodiments of the present disclosure. The process 400 is mainly illustrated from the perspective of the components of the sensor system 106. In some embodiments, the process 400 can be modified by, for example, having steps rearranged, changed, added, and/or removed.
[0064] At step 402, the first sensor 210 is configured to detect a parameter that indicates a likelihood of having a user enter or leave a target area. As discussed above, generally the target area is a specific space of interest that indicates whether a user is using an exercise device.
[0065] In some embodiments, the first sensor 210 is or includes a low power coarse proximity sensor with a coarse detection range, and the parameter to be detected by the first sensor 210 is based on a user entering or leaving the coarse detection range of the coarse proximity sensor. In some embodiments, the coarse proximity sensor can be a passive infrared sensor and/or any other suitable sensor. [0066] In some embodiments, the first sensor 210 is or includes a motion sensor with a detection range, and the parameter to be detected by the first sensor 210 is based on detecting a change of motion and/or vibration within the detection range of the motion sensor. In some embodiments, the motion sensor can be an accelerometer and/or any other suitable motion sensor.
[0067] In some embodiments, the first sensor 210 is or includes a temperature sensor with a detection range, and the parameter to be detected by the first sensor 210 is based on detecting a change of temperature within the detection range of the temperature sensor. In some embodiments, the temperature sensor can be an infrared thermopile sensor and/or any other suitable temperature sensor.
[0068] In some embodiments, the first sensor 210 can include more than one type of sensor, such as a proximity sensor (for example, a passive infrared sensor), an ambient light sensor, a photoelectric sensor, an ultrasonic sensor, a time of flight distance sensor, a thermopile sensor, or any other suitable sensors or combination of sensors. The process 400 then proceeds to step 404.
[0069] At step 404, the first sensor 210, in response to detecting a parameter that indicates a likelihood of having a user enter or leave a target area, sends a first command signal to the processor 240. The process 400 then proceeds to step 406.
[0070] At step 406, the processor 240 receives the first command signal from the first sensor 210 and sends a second command signal to the second sensor 220. In some embodiments, the processor 240 can normally operate in a low power mode and switch to a high power/active mode upon receiving the first command signal from the first sensor 210. The process 400 then proceeds to step 408.
[0071] At step 408, the second sensor 220, which is normally in a low power/sleep mode, is switched to an active mode upon receiving the second command signal from the processor 240. The process 400 then proceeds to step 410.
[0072] At step 410, the second sensor 220 determines if the user enters or leaves the target area of the exercise device that the sensor system 106 is attached to. The process 400 then proceeds to step 412. In some embodiments, the second sensor 220 is or includes a fine proximity sensor with a fine detection range, and the fine proximity sensor determines if the user enters or leaves the target area based on detecting if the user enters or leaves the fine detection range. In some embodiments, the fine proximity sensor is an active ranging sensor, which measures distance by emitting waves and calculating the distance based on the arrival time of the reflected waves. In some embodiments, the fine proximity sensor includes an infrared ranging sensor, an ultrasonic proximity sensor, and/or any other suitable sensor.
[0073] In some embodiments, the second sensor 220 is or includes a motion sensor with a detection range, and the parameter to be detected by the second sensor 220 is based on detecting a change of motion and/or vibration within the detection range of the motion sensor. In some embodiments, the motion sensor can be an accelerometer and/or any other suitable motion sensor.
[0074] In some embodiments, the second sensor 220 is or includes a temperature sensor with a detection range, and the parameter to be detected by the second sensor 220 is based on detecting a change of temperature within the detection range of the temperature sensor. In some embodiments, the temperature sensor can be an infrared thermopile sensor and/or any other suitable temperature sensor.
[0075] In some embodiments, the second sensor 220 can include more than one type of sensor, such as a proximity sensor (for example, a passive infrared sensor), an ambient light sensor, a photoelectric sensor, an ultrasonic sensor, a time of flight distance sensor, a thermopile sensor, or any other suitable sensors or combination of sensors.
[0076] In some embodiments, the second sensor 220 has a detection range that is smaller than the detection range of the first sensor 210, but the second sensor 220 can more accurately detect whether or not a user enters or leaves the detection range of the second sensor 220.
[0077] At step 412, the second sensor 220 sends the detection result back to the processor 240.
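Steps 402 through 412 can be condensed into a single end-to-end sketch: the coarse sensor's trigger gates everything else, and the fine sensor's verdict is what gets reported back. The callables below are stand-ins for the hardware, not the patent's interfaces:

```python
def process_400(first_sensor_triggered, second_sensor_check):
    """End-to-end sketch of process 400.

    `first_sensor_triggered` models step 402 (coarse detection);
    `second_sensor_check` models step 410 (fine detection). If the
    coarse sensor sees nothing, the fine sensor is never woken.
    """
    if not first_sensor_triggered():      # step 402: no likely activity
        return None                       # second sensor stays asleep
    # Steps 404-408: first command -> processor -> second command -> wake.
    result = second_sensor_check()        # step 410: fine determination
    return result                         # step 412: report to processor

# The coarse trigger fires and the fine sensor confirms the user entered;
# with no coarse trigger, the fine sensor is never consulted.
outcome = process_400(lambda: True, lambda: "user-entered")
idle = process_400(lambda: False, lambda: "user-entered")
```

The power saving comes from the early return: in the common idle case only the cheap coarse check ever runs.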
[0078] In some embodiments, once the sensor system 106 determines that a user enters or leaves an exercise device, it can send that information to other components of the environment 100. In some embodiments, one or more components of the environment 100 can use that information to start or finish recording the user's exercise data and/or the exercise device's operation data. In some embodiments, one or more components of the environment 100 can use that information to start or finish its operation. In some embodiments, one or more components of the environment 100 can use that information to toggle between different power modes, such as between an active mode and a low power mode.
[0079] In some embodiments, if the sensor system 106 is battery-operated, the sensor system 106 can send battery information, such as a brownout event, to the gateway or the server 104, so that gym operators can be informed in time to replace the battery of the sensor system.
[0080] In some embodiments, regardless of whether the exercise device is used or not, the sensor system 106 can periodically report its run-time status and statistics to the gateway for book-keeping and diagnosis purposes of the sensor system 106 and/or the exercise device.
[0081] In some embodiments, the sensor system 106 can receive commands from the gateway, such as flashing an LED included in the sensor system 106 to identify itself, so that a gym operator can easily identify the sensor system 106.
[0082] In some embodiments, the server 104 may provide a front-end user interface (UI), such as a website, a dedicated PC, or a mobile application, for gym operators and/or trainers to access the users' exercise activities, so that proper guidance, advice, and/or training can be provided to the users. In some embodiments, a mobile and/or web user interface can also be provided to users on their mobile devices to monitor and track their exercise activities, as described above.
[0083] In some embodiments, a user's detailed exercise information is collected and stored in the server 104. The information includes, but is not limited to, start/end time and date, equipment type, duration, sets and repeats (for pin-loaded equipment, workbenches, and power racks), and break intervals in all sessions recorded by the mobile device 112 and/or the sensor system 106 associated with the exercise device. The data can be organized and displayed in many ways through the front-end user interface (UI). [0084] In some embodiments, the aggregated data of all members collected through mobile devices 112 can be combined to track the equipment usage, improve the operation efficiency of gyms, and provide more insights to optimize members' exercise routines.
[0085] In some embodiments, the same type of equipment can be grouped together. For a certain group, its total number of visiting members, total number of visits, and total operation time can be compared against those of other groups. If one group has significantly more users than another group, the gym can look into the scenario and decide which group or groups need more, or fewer, pieces of equipment.
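The per-group aggregation described above — visits, operation time, and distinct devices per equipment type — can be sketched over a flat list of session records. The record schema `(device_id, equipment_type, minutes)` is an illustrative assumption, not the patent's data model:

```python
def usage_by_group(sessions):
    """Aggregate per-device session records into per-equipment-type
    totals: number of visits, total operation minutes, and the set of
    devices seen in each group."""
    totals = {}
    for device_id, group, minutes in sessions:
        g = totals.setdefault(group, {"visits": 0, "minutes": 0,
                                      "devices": set()})
        g["visits"] += 1
        g["minutes"] += minutes
        g["devices"].add(device_id)
    return totals

# Two treadmills see three visits; the rower sees one, a candidate for
# the kind of per-device comparison described in the next paragraph.
sessions = [
    ("t1", "treadmill", 30), ("t1", "treadmill", 25),
    ("t2", "treadmill", 40), ("r1", "rower", 20),
]
stats = usage_by_group(sessions)
```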
[0086] In some embodiments, individual equipment can be compared against others of the same type, particularly when they are physically close. If one specific exercise device consistently has fewer member accesses than others, or no member accesses at all, the gym operators may be informed to check the device. This may indicate that the exercise device has certain issues, such as a defect, proximity to an environment that is not user-friendly, or something else that needs the gym operators' attention.
[0087] It is to be understood that the disclosed subject matter is not limited in its application to the details of construction and to the arrangements of the components set forth in the following description or illustrated in the drawings. The disclosed subject matter is capable of other embodiments and of being practiced and carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting.
[0088] As such, those skilled in the art will appreciate that the conception, upon which this disclosure is based, may readily be utilized as a basis for the designing of other structures, systems, methods and media for carrying out the several purposes of the disclosed subject matter. It is important, therefore, that the claims be regarded as including such equivalent constructions insofar as they do not depart from the spirit and scope of the disclosed subject matter.
[0089] Although the disclosed subject matter has been described and illustrated in the foregoing exemplary embodiments, it is understood that the present disclosure has been made only by way of example, and that numerous changes in the details of implementation of the disclosed subject matter may be made without departing from the spirit and scope of the disclosed subject matter, which is limited only by the claims which follow.

Claims

What is claimed is:
1. A compound sensor system, comprising:
a first sensor;
a second sensor;
a memory that stores a module; and
a processor coupled to the first sensor, the second sensor, and the memory, wherein the first sensor is configured to detect a parameter that indicates a likelihood of having a user enter or leave a target area, and, in response, send a first command signal to the processor,
wherein the processor is configured to run the module stored in the memory that is configured to cause the processor to:
receive the first command signal from the first sensor, and send a second command signal to the second sensor based on receiving the first command signal,
wherein the second sensor is configured to operate at a sleep mode and switch to an active mode upon receiving the second command signal, wherein during the active mode the second sensor is configured to determine if the user enters or leaves the target area.
2. The compound sensor system of claim 1, wherein the second sensor, upon receiving the second command signal, is configured to remain in the active mode for an active detection period before switching to the sleep mode.
3. The compound sensor system of claim 2, wherein the first sensor has a first period, wherein during the first period, the first sensor is at a high power mode for a first high power time and at a low power mode for a first low power time, wherein the second sensor has a second period, wherein during the second period, if the second sensor is not in the active detection period, the second sensor is at the active mode for a second high power time and at the sleep mode for a second low power time.
4. The compound sensor system of claim 3, wherein the second period is longer than the first period, and the second low power time is longer than the first low power time.
5. The compound sensor system of claim 3, wherein the first high power time is the same as the first period, and the first low power time is zero.
6. The compound sensor system of claim 1, wherein the second sensor includes a fine proximity sensor with a fine detection range, wherein the fine proximity sensor determines if the user enters or leaves the target area based on detecting if the user enters or leaves the fine detection range.
7. The compound sensor system of claim 6, wherein the fine proximity sensor includes at least one of an infrared ranging sensor or an ultrasonic proximity sensor.
8. The compound sensor system of claim 6, wherein the first sensor includes a coarse proximity sensor with a coarse detection range, wherein the parameter is based on the user entering or leaving the coarse detection range, wherein the coarse detection range is larger than the fine detection range.
9. The compound sensor system of claim 1, wherein the first sensor includes a coarse proximity sensor with a coarse detection range, wherein the parameter is based on the user entering or leaving the coarse detection range.
10. The compound sensor system of claim 9, wherein the coarse proximity sensor is a passive infrared sensor.
11. The compound sensor system of claim 1, wherein the first sensor includes a motion sensor with a detection range, wherein the parameter is based on detecting a change of motions within the detection range.
12. The compound sensor system of claim 11, wherein the motion sensor is an accelerometer.
13. The compound sensor system of claim 1, wherein the first sensor includes a temperature sensor with a detection range, wherein the parameter is based on detecting a change of temperatures within the detection range.
14. The compound sensor system of claim 13, wherein the temperature sensor is an infrared thermopile sensor.
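For illustration only, the two-tier duty-cycling scheme recited in claims 3-5 can be sketched in Python. This is a hypothetical sketch, not the claimed implementation: the class names, the time-stepped `is_active`/`step` interface, and the wake-up flag standing in for the "command signal" are all assumptions introduced here, not taken from the specification.

```python
class DutyCycledSensor:
    """A sensor that, within each fixed period, spends a high-power
    (active) time followed by a low-power (sleep) time (claim 3)."""

    def __init__(self, period, high_power_time):
        assert 0 <= high_power_time <= period
        self.period = period
        self.high_power_time = high_power_time

    def is_active(self, t):
        # Active during the first high_power_time units of each period.
        return (t % self.period) < self.high_power_time


class CompoundSensor:
    """The first (coarse) sensor wakes the second (fine) sensor.

    Per claim 4, the second period is longer and its low-power time
    dominates, so the system idles cheaply until the first sensor
    detects the parameter and issues a wake-up (the command signal)."""

    def __init__(self, coarse, fine):
        self.coarse = coarse
        self.fine = fine
        self.fine_forced_active = False  # stands in for the active detection period

    def step(self, t, coarse_detects):
        if self.coarse.is_active(t) and coarse_detects:
            self.fine_forced_active = True  # command signal to the second sensor
        # Fine sensor runs if woken, or during its own short active window.
        return self.fine_forced_active or self.fine.is_active(t)


# Claim 5 corner case: the first sensor's high power time equals its
# period (always on); the second sensor is mostly asleep (claim 4).
coarse = DutyCycledSensor(period=1, high_power_time=1)
fine = DutyCycledSensor(period=10, high_power_time=1)
system = CompoundSensor(coarse, fine)

print(system.step(t=5, coarse_detects=False))  # False: fine sensor asleep
print(system.step(t=5, coarse_detects=True))   # True: coarse wake-up fires
```

The sketch shows why the arrangement saves power: only the cheap coarse sensor is sampled continuously, while the expensive fine sensor stays in its long sleep phase until commanded awake.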
PCT/US2017/021448 2016-03-08 2017-03-08 Systems and methods for a compound sensor system WO2017156188A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201780023510.2A CN109074721A (en) 2016-03-08 2017-03-08 System and method for multiple-sensor system
EP17764046.3A EP3427240A4 (en) 2016-03-08 2017-03-08 Systems and methods for a compound sensor system

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201662305504P 2016-03-08 2016-03-08
US62/305,504 2016-03-08
US201662310524P 2016-03-18 2016-03-18
US62/310,524 2016-03-18
US201662314993P 2016-03-30 2016-03-30
US62/314,993 2016-03-30

Publications (1)

Publication Number Publication Date
WO2017156188A1 true WO2017156188A1 (en) 2017-09-14

Family

ID=59787505

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/021448 WO2017156188A1 (en) 2016-03-08 2017-03-08 Systems and methods for a compound sensor system

Country Status (4)

Country Link
US (2) US10728694B2 (en)
EP (1) EP3427240A4 (en)
CN (1) CN109074721A (en)
WO (1) WO2017156188A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11525735B2 (en) * 2016-01-11 2022-12-13 Carrier Corporation Infrared presence detector system
GB2560553A (en) * 2017-03-15 2018-09-19 Senceive Ltd Wireless sensing apparatus and method
CN107992336A (en) * 2017-11-28 2018-05-04 深圳市筑泰防务智能科技有限公司 A kind of dual system switching method of enterprises mobile terminal
US11172180B2 (en) * 2018-09-19 2021-11-09 Canon Kabushiki Kaisha Control apparatus, control method and non-transitory computer-readable medium
US11580765B2 (en) * 2019-09-02 2023-02-14 Tobii Ab Method and system for detecting physical presence
CA3099061A1 (en) * 2019-11-15 2021-05-15 Op-Hygiene Ip Gmbh Fluid dispenser with wake up sensor
US11796715B2 (en) 2020-06-24 2023-10-24 Sloan Valve Company Hybrid time-of-flight sensor and IR sensor

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080234935A1 (en) 2007-03-23 2008-09-25 Qualcomm Incorporated MULTI-SENSOR DATA COLLECTION and/or PROCESSING
US20130057894A1 (en) 2011-09-06 2013-03-07 Fuji Xerox Co., Ltd. Power supply control apparatus, image processing apparatus, non-transitory computer readable medium storing power supply control program
US20140107846A1 (en) * 2012-10-12 2014-04-17 Telefonaktiebolaget L M Ericsson (Publ) Method for synergistic occupancy sensing in commercial real estates
US20150006927A1 (en) 2013-06-28 2015-01-01 Fuji Xerox Co., Ltd. Information processing apparatus, information processing method and non-transitory computer readable medium
CN204360454U (en) * 2015-01-28 2015-05-27 四川君逸易视科技有限公司 Infrared array people counting device
US20150260580A1 (en) 2014-03-11 2015-09-17 Google Technology Holdings LLC Display viewing detection
US20160036996A1 (en) 2014-08-02 2016-02-04 Sony Corporation Electronic device with static electric field sensor and related method

Family Cites Families (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6032109A (en) * 1996-10-21 2000-02-29 Telemonitor, Inc. Smart sensor module
KR100205384B1 (en) * 1997-03-14 1999-07-01 구자홍 Infrared sensor and method of temperature compensation
US7114079B1 (en) 2000-02-10 2006-09-26 Parkervision, Inc. Security access based on facial features
US6680745B2 (en) 2000-11-10 2004-01-20 Perceptive Network Technologies, Inc. Videoconferencing method with tracking of face and dynamic bandwidth allocation
GB2378078B (en) 2001-07-27 2005-05-25 Hewlett Packard Co Image capture
US8430749B2 (en) 2001-08-10 2013-04-30 Igt Dynamic casino tracking and optimization
JP4460294B2 (en) * 2001-10-24 2010-05-12 バーテックス ファーマシューティカルズ インコーポレイテッド Inhibitors of serine proteases, particularly hepatitis C virus NS3-NS4A protease, incorporating a condensed ring system
KR100839772B1 (en) 2003-07-15 2008-06-20 오므론 가부시키가이샤 Object decision device and imaging device
US7525570B2 (en) 2003-07-17 2009-04-28 Igt Security camera interface
WO2005071634A2 (en) * 2004-01-27 2005-08-04 Richard Turner Method and apparatus for detection and tracking of objects within a defined area
US8478837B2 (en) 2004-01-28 2013-07-02 Microsoft Corporation Offline global address list
JP2005258860A (en) 2004-03-12 2005-09-22 Matsushita Electric Ind Co Ltd Multiple authentication method and its device
US7889381B2 (en) 2004-05-28 2011-02-15 Fujifilm Corporation Photo service system
US20060018522A1 (en) 2004-06-14 2006-01-26 Fujifilm Software(California), Inc. System and method applying image-based face recognition for online profile browsing
US7904052B2 (en) * 2005-02-23 2011-03-08 Hitachi, Ltd. Sensor net management method
JP4367424B2 (en) 2006-02-21 2009-11-18 沖電気工業株式会社 Personal identification device and personal identification method
EP1998567B1 (en) 2006-03-15 2016-04-27 Omron Corporation Tracking device, tracking method, tracking device control program, and computer-readable recording medium
JP4862447B2 (en) 2006-03-23 2012-01-25 沖電気工業株式会社 Face recognition system
JP5087856B2 (en) 2006-04-05 2012-12-05 株式会社ニコン Electronic camera
US7886940B2 (en) * 2006-07-25 2011-02-15 Lockheed Martin Corporation Storage system for fuel cell gases
KR100836577B1 (en) 2006-08-08 2008-06-10 엘지전자 주식회사 Bluetooth system and method of networking the same
JP4213176B2 (en) * 2006-11-16 2009-01-21 シャープ株式会社 Sensor device, server node, sensor network system, communication path construction method, control program, and recording medium
JP5010905B2 (en) 2006-12-13 2012-08-29 パナソニック株式会社 Face recognition device
US7914420B2 (en) 2007-07-18 2011-03-29 Brunswick Corporation Sensing applications for exercise machines
US9329052B2 (en) 2007-08-07 2016-05-03 Qualcomm Incorporated Displaying image data and geographic element data
KR101469246B1 (en) 2007-08-07 2014-12-12 삼성전자주식회사 Apparatus and method for shooting picture in robot
US8758102B2 (en) 2008-03-25 2014-06-24 Wms Gaming, Inc. Generating casino floor maps
US8489021B2 (en) 2008-04-03 2013-07-16 Polar Electro Oy Communication between portable apparatus and counterpart apparatus
JP5227911B2 (en) 2009-07-22 2013-07-03 株式会社日立国際電気 Surveillance video retrieval device and surveillance system
US20110103643A1 (en) 2009-11-02 2011-05-05 Kenneth Edward Salsman Imaging system with integrated image preprocessing capabilities
JP5605854B2 (en) 2009-11-17 2014-10-15 株式会社 日立産業制御ソリューションズ Authentication system and authentication apparatus using biometric information
US8462622B2 (en) 2009-12-08 2013-06-11 Qualcomm Incorporated Detection of co-located interference in a multi-radio coexistence environment
US8544033B1 (en) 2009-12-19 2013-09-24 Cisco Technology, Inc. System and method for evaluating content in a digital signage environment
US9398231B2 (en) 2010-03-15 2016-07-19 Omron Corporation Surveillance camera terminal
US8888660B1 (en) 2010-11-02 2014-11-18 Strength Companion, LLC Energy harvester for exercise equipment
KR101818092B1 (en) 2010-11-10 2018-01-12 나이키 이노베이트 씨.브이. Systems and methods for time-based athletic activity measurement and display
US8930734B1 (en) 2011-03-30 2015-01-06 Google Inc. Managing power states of a computing device
US8939007B2 (en) * 2011-04-27 2015-01-27 Panasonic Corporation Inertial force sensor and zero point correction method used therein
US9336456B2 (en) 2012-01-25 2016-05-10 Bruno Delean Systems, methods and computer program products for identifying objects in video data
US20130208952A1 (en) 2012-02-13 2013-08-15 Geoffrey Auchinleck Method and Apparatus for Improving Accuracy of Biometric Identification in Specimen Collection Applications
US9152868B2 (en) 2012-03-23 2015-10-06 Microsoft Technology Licensing, Llc Personal identification combining proximity sensing with biometrics
US8457367B1 (en) 2012-06-26 2013-06-04 Google Inc. Facial recognition
US8437513B1 (en) 2012-08-10 2013-05-07 EyeVerify LLC Spoof detection for biometric authentication
US8856541B1 (en) 2013-01-10 2014-10-07 Google Inc. Liveness detection
JP5975293B2 (en) 2013-02-22 2016-08-23 富士ゼロックス株式会社 Authentication apparatus and program
US20140274031A1 (en) 2013-03-13 2014-09-18 Qualcomm Incorporated Sharing data among proximate mobile devices with short-range wireless signals
WO2014179707A1 (en) 2013-05-02 2014-11-06 Rolley David System and method for collecting, analyzing and reporting fitness activity data
US9826923B2 (en) 2013-10-31 2017-11-28 Roshanak Houmanfar Motion analysis method
EP3114542B1 (en) 2014-03-06 2023-08-02 Polar Electro Oy Device power saving during exercise
US9830631B1 (en) 2014-05-02 2017-11-28 A9.Com, Inc. Image recognition result culling
US9669261B2 (en) 2014-05-21 2017-06-06 IncludeFitness, Inc. Fitness systems and methods thereof
CN104636751A (en) 2014-12-11 2015-05-20 广东工业大学 Crowd abnormity detection and positioning system and method based on time recurrent neural network
WO2016138042A2 (en) 2015-02-23 2016-09-01 Praveen Kashyap Method and system for virtual fitness training and tracking services
US9300925B1 (en) 2015-05-04 2016-03-29 Jack Ke Zhang Managing multi-user access to controlled locations in a facility
US9959728B2 (en) 2015-06-04 2018-05-01 International Business Machines Corporation Managing a smart appliance with a mobile device
US10863003B2 (en) 2015-09-10 2020-12-08 Elliot Berookhim Methods, devices, and systems for determining a subset for autonomous sharing of digital media
KR102411842B1 (en) * 2015-11-27 2022-06-22 삼성전자 주식회사 Apparatus and method for determining presence of user
US9992429B2 (en) 2016-05-31 2018-06-05 Microsoft Technology Licensing, Llc Video pinning
US10586433B2 (en) 2017-02-13 2020-03-10 Google Llc Automatic detection of zones of interest in a video
US10762640B2 (en) 2017-05-22 2020-09-01 Creavision Technologies, Ltd. Systems and methods for user detection, identification, and localization within a defined space


Non-Patent Citations (1)

Title
See also references of EP3427240A4

Also Published As

Publication number Publication date
US11350235B2 (en) 2022-05-31
US20200359165A1 (en) 2020-11-12
US10728694B2 (en) 2020-07-28
EP3427240A1 (en) 2019-01-16
US20170265034A1 (en) 2017-09-14
CN109074721A (en) 2018-12-21
EP3427240A4 (en) 2019-10-30

Similar Documents

Publication Publication Date Title
US11350235B2 (en) Systems and methods for a compound sensor system
US10909355B2 (en) Systems and methods for efficient face recognition
US10970525B2 (en) Systems and methods for user detection and recognition
KR101745684B1 (en) Dynamic sampling
US20170209743A1 (en) System and method for linking oscillating movements of exercise equipment to a user of the exercise equipment in a database
US20210373919A1 (en) Dynamic user interface
CN106019968A (en) Internet-of-things detection system and method for detecting number of people in rooms and states of rooms
Belapurkar et al. Building data-aware and energy-efficient smart spaces
Ghosh et al. UltraSense: A non-intrusive approach for human activity identification using heterogeneous ultrasonic sensor grid for smart home environment
Bayındır A survey of people-centric sensing studies utilizing mobile phone sensors
JP2017096566A (en) Dust collection system and dust collection method
KR20160121756A (en) The acquisition management method of exercise information and the acquisition management terminal of exercise information
US9907999B2 (en) Systems and methods for tracking, collecting, and analyzing user data for gyms
Zhan et al. Mosen: Activity modelling in multiple-occupancy smart homes
Sie et al. Integrating cloud computing, internet-of-things (Iot), and community to support long-term care and lost elderly searching
US20170224254A1 (en) Analyzing system and analyzing method for evaluating calorie consumption by detecting the intensity of wireless signal
WO2020208922A1 (en) Lower limb muscle strength estimation system, lower limb muscle strength estimation method, and program
Mayton Wristque: a personal sensor wristband for smart infrastructure and control
Wu et al. Case studies of WSN-CPS applications
US20240005648A1 (en) Selective knowledge distillation
US20230011337A1 (en) Progressive deep metric learning
Nivas FALL DETECTION SYSTEM
Wu et al. Distributed and Portable Fall Detection Using Wireless Body Area Network at Nursing Home
Issakar Semantic sensor network
JP2014106630A (en) Automatic registration device, automatic registration method, and program

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2017764046

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2017764046

Country of ref document: EP

Effective date: 20181008

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17764046

Country of ref document: EP

Kind code of ref document: A1