WO2015157757A2 - Interactive systems and methods for data capture - Google Patents


Info

Publication number
WO2015157757A2
Authority
WO
WIPO (PCT)
Prior art keywords
data
target area
individual
processing module
time
Prior art date
Application number
PCT/US2015/025548
Other languages
French (fr)
Other versions
WO2015157757A3 (en)
Inventor
Carl Brown
James Martin
Original Assignee
Deja View Concepts, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Deja View Concepts, Inc. filed Critical Deja View Concepts, Inc.
Publication of WO2015157757A2 publication Critical patent/WO2015157757A2/en
Publication of WO2015157757A3 publication Critical patent/WO2015157757A3/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/52 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects

Definitions

  • the present systems and methods relate generally to data capture of an individual(s) or events, and more particularly to a real-time interactive data capture and tracking system based on activities across a range of environments.
  • Concurrent systems include tags which are worn by each park patron during their visit to the park or other entertainment facility.
  • Various readers distributed throughout the park or entertainment facility are able to read the RFID tags.
  • the unique identifier numbers can be conveniently read and provided to an associated photo/video capture system for purposes of providing indexing of captured photo/video images according to the unique identifiers of all individuals standing within the field of view of the camera.
  • an interactive system and method for data capture may utilize a plurality of data capture devices to track and capture an individual engaged in an activity in a defined environment.
  • the instant system and method is directed towards tracking a participant or activity and capturing that activity as well as the routes used during that activity.
  • the system and accompanying apparatuses are able to track the location of the participant and react with a predetermined action, in one instance being taking photographs or video footage.
  • the system will be configured to track the participant over a large area to ensure that the entire event is captured.
  • a series of cameras and a series of beacons are housed in the same unit which will be referred to as the recording unit.
  • additional embodiments may feature separately housed cameras and beacons.
  • the system detects when a user enters a target area by detecting the user's tracking device with a plurality of beacons located throughout the target area.
  • the detection of the user by each beacon may occur by Bluetooth®, GPS, or a combination of the systems.
  • the system determines the best time to capture the user while in the target area based on the user's movements in the target area.
  • the system may capture the user through a plurality of recording units located in the target area, wherein the system may capture photographs, videos or a combination of both while the user is located within the target area.
  • the system may process the data and subsequently send the captured photographs and videos directly to a user's device, which may be the tracking device itself or another separate user device.
  • the system determines and records the signal strength (RSSI) of the user's tracking device in relation to each beacon in the target area in order to approximate the range of the user from each beacon.
  • the system will continually detect and record the strength of the signal from the user's tracking device in relation to the beacon until a minimum threshold signal level is reached. Once the signal strength of the tracking device has reached the minimum predetermined level, the system then records this level as a start time for the beacon in the given target area.
  • the user and the user's associated tracking device do not "trigger" a capture event (i.e. start capturing and then stop capturing) in the target area. Rather, the system analyzes the signal strength of the tracking device against the threshold levels set by the system to determine a start and stop capture period of the user in the target area.
  • the present system allows for tracking and data capture related to a user by (1) detecting that a person has entered an area of interest using BTLE sensors, (2) detecting how the person behaves in that detection area to calculate the best time to capture the event, and (3) avoiding fixed wiring by using a device carried on the person of the user, wherein the device is configured to connect wirelessly with the system to capture the event in an open area without walls or buttons.
  • the system utilizes Bluetooth® low energy sensors in conjunction with the present system's unique algorithm, which monitors a user's behavioral patterns within the target area in order to create a unique experience.
  • This marriage of technologies may be accomplished by using a gyroscope and/or accelerometers to trigger an event, in conjunction with the beacon detection. For example, when a user enters an area of interest, pauses for a period of time, and then begins moving again, the system triggers an event and records video of that event to capture the 'moment of acceleration' or peak moment of movement.
  • the movement may be qualified as merely a simple motion or it may be a sequence of events to capture an action.
  • the monitoring of a sequence of events instead of just a single event, as in concurrent systems, ensures that the best/exact moment to capture a video of the event is chosen.
  • the advantage of using the accelerometer or gyroscope in this manner lies in the convenience of operation. In the present method, the user just does what they would normally do. Ergo, the user is never forced to stop, open the phone, press a button, and put the phone away in order to capture the best moment for a picture or video of the event. The system passively captures the best moments.
  • the present invention provides a data capture system.
  • the data capture system includes a plurality of recording units located within a target area.
  • the target area is configured for data capture of an individual.
  • the data capture system also includes a tracking device that is configured to communicate with each of the recording units. The tracking device is used to detect when an individual enters the target area for data capture.
  • the data is stored in an on-site storage module.
  • Each recording unit that is part of the data capture system includes a camera and a beacon housed within the recording unit.
  • Each beacon is able to communicate with the tracking device through GPS, Bluetooth, or a combination thereof located within the user's computing device.
  • the data processing module also includes a time processing module.
  • the time processing module is configured to identify an individual's entry time and an exit time within the target area.
  • the data processing module has an assembly module.
  • the assembly module is configured to assemble the captured data into a plurality of video data. The video data can then be viewed on a user's computing device.
  • Some computing devices a user may possess include a smartphone, tablet, laptop computer, desktop computer, PDA or any other computing device.
  • the data processing module may further include a computer program product for creating and sending an assembled video to the user's computing device from the captured data of the individual in the target area.
  • the computer program may have a non-transitory computer usable medium. This usable medium may also have a computer readable program.
  • the computer readable program causes the data processing module to capture data of an individual in the target area from the recording device. It is also capable of receiving a first signal above a minimum threshold and a second signal below the minimum threshold from the tracking device in the target area. Further, the computer readable program causes the data processing module to transmit the first and second signals to the data processing module in order to determine a timeframe when the individual was within the threshold levels in the target area.
  • the data processing module also pulls the captured data from the on-site storage module to a time processing module in the data processing module for the timeframe the individual is within the threshold levels in the target area.
  • the captured data is then assembled into a video by an assembly module and the assembled video is transmitted to the user's computing device.
  • the data processing module has a computer program product for capturing data of the individual while in the target area.
  • the computer program may have a non-transitory computer usable medium.
  • the non-transitory computer usable medium includes a computer readable program.
  • the computer readable program allows the data processing module to determine whether the tracking device is receiving a device signal from a beacon. It further determines whether the device signal is above a minimum threshold for signal strength and records a time when the device signal received is above the threshold limit. The time is further recorded when the device signal received is at a peak and when the device signal received is below the minimum threshold for signal strength. Then the data processing module transmits the time data and signal strength data to an off-site server module.
  • the data capture system may also include a local server.
  • the local server is configured to collect data from the target area and the recording units.
  • the data capture system may have an off site server.
  • the off site server is in data communication with the local server and is configured to receive the captured data of the individual entering and exiting the target area.
  • Another aspect of the present invention includes a computer program for creating a video sequence product.
  • the program may have a computer usable medium having a computer program code recorded thereon and configured to cause a processing device to perform many different functions. Some of the functions include: capturing data from each of the recording units where each recording unit covers a specific target area defined by a range of a beacon in the recording unit; receiving time data of an individual entering and exiting the target area; retrieving the captured data corresponding to the time data of the individual in the target area; and assembling the captured data into a video sequence.
  • the receiving time data may also include determining an individual's entry time into a target area. This is determined by the signal strength of a tracking device when the tracking device reaches a minimum threshold.
  • the receiving time data further includes determining an individual's exit time out of the target area. This occurs when the signal strength of the tracking device falls below the minimum threshold.
  • the capturing data is capable of determining the location of the individual based on the tracking device. Then, the capturing data compares the determined location with the coverage areas of the plurality of recording units to determine which specific recording unit will capture data from the determined target area.
  • the assembling the captured data into a video sequence includes selecting captured data from the determined recording units for assembly into a video.
  • Storing the captured data from the plurality of recording units positioned within a target area may include storing the captured data as data segments and receiving timing information associated with the tracking device. Each data segment has an associated target area and timing data.
  • Another aspect of the present invention includes a method of capturing an individual in a target area. The method includes the steps of capturing a plurality of data of an individual in a target area by a plurality of recording units located within the target area. An on-site storage module then stores the captured data of the individual in the target area. The captured data is then pulled from the on-site storage module in order to process the captured data to a data processing module. An individual's entry time and exit time into and out of the target area are identified by a time processing module. The captured data of the individual between his entry and exit time in and out of the target area is then assembled into video data by an assembly module. The assembled video is then transmitted to the individual's computing device by the assembly module for an individual to view.
  • Additional steps include the capturing of an individual in a target area by receiving a first signal above a minimum threshold and a second signal below the minimum threshold from the tracking device in the target area.
  • the first and second signals are transmitted to the data processing module to determine a time frame of when the individual is within the threshold levels in the target area.
  • the captured data is pulled from the on-site storage module to a time processing module in the data processing module for the timeframe that the individual is within the threshold levels in the target area.
  • the captured data is then assembled into a video by an assembly module and the assembled video is then transmitted to the user's computing device.
  • Each recording unit has a camera and a beacon housed within the recording unit.
  • Each beacon may communicate with the tracking device through GPS, Bluetooth, or a combination thereof.
  • the user's computing device may include a variety of devices including a smartphone, a tablet, a laptop computer, a desktop computer, a PDA or any other like device.
  • Figure 1 illustrates a diagram of an exemplary configuration of one embodiment of the system utilized on a ski mountain;
  • Figure 2 illustrates a diagram of another exemplary configuration of one embodiment of the system on a ski mountain;
  • Figure 3A illustrates a perspective diagram of an individual user of the system entering a target area for one embodiment of data capture;
  • Figure 3B illustrates a perspective diagram of an individual user of the system exiting a target area for one embodiment of data capture;
  • Figure 3C illustrates a diagram of an exemplary configuration of one embodiment of the system utilized in an amusement park;
  • Figure 4A illustrates a block diagram of one embodiment of the hardware architecture utilized to capture, track and/or process a plurality of data of the individual user in a target area;
  • Figure 4B illustrates a flow diagram of a method to collect and process the captured data of an individual user of the system while in the target area;
  • Figure 5 illustrates a flow diagram of one embodiment of a method to process the data from an individual user of the system that has been collected while in the target area;
  • Figure 6 illustrates a block diagram of one embodiment of a tracking beacon to be utilized to determine the location of an individual user of the system;
  • Figure 7 illustrates a flow diagram of one embodiment of a method of smoothing a signal received from the tracking beacon;
  • Figure 8 illustrates a block diagram of an exemplary computing system to track and capture data of an individual user in a target area;
  • Figure 9A illustrates a diagram of a signal reception from a tracking beacon when the individual user enters a target area;
  • Figure 9B illustrates a diagram of a signal reception from a tracking beacon during peak signal strength when the individual is in the target area;
  • Figure 9C illustrates a diagram of a signal reception from a tracking beacon when the individual user exits the target area;
  • Figure 10 illustrates a diagram of one embodiment of a user-controlled recording system to track and capture data;
  • Figure 11 illustrates a block diagram of the hardware architecture where an individual user of the system may be a receiver to allow monitoring for a tracking beacon in the vicinity of the individual user; and
  • Figure 12 illustrates a block diagram of the hardware architecture where an individual user of the system is utilized as an active beacon for data communication with nearby receivers.
  • the present systems and methods relate to interactive data capture of an individual user.
  • the system captures images of users who are partaking in a memorable activity.
  • the system is particularly useful when the user is partaking in an activity that would make it difficult or impossible for them to capture data of themselves performing or engaged in the activity.
  • FIG. 1 illustrates a diagram of one environment of a sample target area that may be utilized to deploy the present system.
  • a target area would be a ski mountain, however in other embodiments, the system may be utilized in such environments including, but not limited to amusement parks, recreational facilities such as golf courses and skate parks, music festivals, tourist attractions or any other suitable environment where individuals desire to obtain images, videos and/or any other data capture while engaged in observing and/or performing an activity in any number of environments such as the exemplary embodiments described above.
  • a target area 120 includes multiple trails along the side of a mountain.
  • the trails in this embodiment include multiple ski lifts 124 supported by upright members 126 such as columns or poles.
  • There may be a variety of trails in the target area 120.
  • These trails may feature obstacles such as moguls or jumps to heighten the excitement and challenge for skiers. Any or all of these trails may be skied by skiers as long as the skier possesses the requisite skill.
  • alternative ski areas are not limited to the particular arrangement illustrated in this embodiment.
  • Alternative ski areas may feature any of a variety of ski lifts, trails, and obstacles or other features, or even span multiple ski areas. As such, in these areas it may be exceedingly difficult to capture content like video or photo recordings of an individual. Numerous variables such as participant speed, improvised paths, and obstacles make the task of capturing content even more challenging.
  • the specification is directed toward a system and method for tracking a participant or activity and capturing that activity as well as the routes used during that activity.
  • the system is enabled to detect when a participant of the system has entered a target area, and then subsequently track that participant through the target area with a predetermined action such as taking photographs or video footage.
  • the system will be able to track the participant over a large area to ensure that the entire event is captured.
  • FIG. 2 illustrates an embodiment of the system wherein a plurality of recording units 132 may be located within a given environment, in this embodiment along a ski trail, in order to capture an individual user's activity while traversing the target area 120.
  • each recording unit 132 may be comprised of a plurality of cameras 203 (see Fig. 4A) and a plurality of corresponding beacons 601 (see FIG. 3A-3B) each corresponding pair of which are housed in the same recording unit 132.
  • additional embodiments may feature separate cameras 203 and beacons 601.
  • This embodiment is merely described as a rudimentary example designed to illustrate how a plurality of recording units 132 may be located in the environment to provide coverage and define the target area 120. After reading this description, one of ordinary skill in the art would be able to discern how alternative recording unit 132 configurations with similar or different environmental layouts and target areas 120 may be accomplished consistent with the principles of the systems described in the specification.
  • the plurality of cameras 203 and the plurality of beacons 601 may operate using technology including, but not limited to, Bluetooth® Low Energy (BTLE), Global Positioning Systems (GPS), and Radio Frequency Identification (RFID) or other similar tracking technology.
  • the recording units 132 are both attached to the ski lift columns 126 and mounted onto other structures, however they may be placed in other areas or be free standing.
  • the recording units 132 may be arranged to suit the need of the environment that they are operating in to provide for suitable coverage of the target area 120; the dashed lines indicate the field of view of each recording unit 132.
  • Each camera 203 may be arranged facing across the trail or towards the trail; in this environment, facing a camera 203 towards the trail may be the most beneficial positioning for the purposes of capturing photographs or video of an individual user of the system. Conversely, facing a camera 203 perpendicular to the trail may make it more difficult to capture photographs due to the shorter opportunity timeframe.
  • Each recording unit 132 (or if the beacon 601 and camera 203 are separate) may be mounted in a fixed position or with the ability to pan or otherwise move based on a predetermined algorithm. In this embodiment, it may be more important for the recording units 132 that are fixed to be positioned so that their field of view is open to the greatest area of the trail.
  • Each camera 203 utilized in the system is not limited to any particular type of camera. For instance, different resolutions, lenses or other accessories may be attributed to each camera 203 to customize the user's experience to the area.
  • the placement of the recording units 132 is not limited to a stationary position; for instance, recording units 132 may be placed on mobile platforms such as ski lifts, vehicles, or even other skiers.
  • Fig. 3A illustrates a diagram, wherein a user 502 enters a target area 508 with a corresponding tracking device 330.
  • the target area 508 may be defined by a range 504 of a beacon 601 and a first edge 506 and a second edge 507 caused by the beacon's shielding.
  • a directional beacon 601 is utilized, but other embodiments of beacons 601 may be utilized depending on the activity being captured and the target area 508.
  • the user 502 may be holding, wearing or carrying the tracking device 330 in front of them; in other embodiments the user 502 may have the tracking device 330 in their pocket or somewhere else on their person.
  • the tracking device 330 is being utilized by a user 502, however in other embodiments the device 330 may be attached to another object to capture the user 502 while they are in the target area 508.
  • the tracking device 330 receives a signal from the beacon 601.
  • the system preferably will have a predetermined threshold level, which is the strength of the signal from the beacon 601 at which the camera 203 in the target area 508 of the user 502 may begin recording based on the signal strength data. Near the first edge 506 of the target area 508 the signal picked up by the tracking device 330 is near the threshold level to start a new event.
  • the system may record the threshold time as well as the threshold level and the beacon identification number at this time.
  • the system records the highest signal level from the beacon 601 as well as the time it is collected, which is referred to as the peak time (see below description) .
  • FIG. 3B illustrates a diagram of one embodiment of a user 502 and the tracking device 330 leaving the target area 508.
  • Figure 3C illustrates an embodiment of the system on a roller coaster.
  • As the user's tracking device 330 enters the target area 504 it receives a signal from a beacon 601. The tracking device 330 then records the times that it is in the target area 504. A camera 203 records whenever a car passes through the area. Once the tracking device 330 no longer receives a signal from a beacon 601 it closes the event. The times that the tracking device 330 was in the target area 504 are then transmitted to an offsite server where the relevant video data is compiled and sent to the user 502.
  • BTLE transmissions use a short burst transmission signal in a circular direction, up to 300 feet in diameter, to uniquely connect with other BTLE devices.
  • Bluetooth ® operates between the 2.4 and 2.485 GHz spectrum using spread spectrum frequency hopping.
  • the device maintains a connection by communicating with the transmitter, and both the device and the transmitter use a pseudo-random code to "hop" between the same frequencies together for pseudo-random amounts of time, in sync.
  • this invention initiates communication when the device is directly in front of the beacon 601.
  • FIG. 4A illustrates a block diagram of an embodiment of a system 200 for processing data captured from the plurality of cameras 203.
  • the plurality of cameras 203 are always recording data of the target area 508, however in other embodiments the plurality of cameras 203 may only be active when the user 502 is in the target area.
  • the data recorded by the plurality of cameras 203 is preferably stored in an on-site storage module 204, however in other embodiments the place of storage may vary.
  • When the user's tracking device 330 receives an ID signal from the beacon 601 with a received signal strength (RSSI) above the threshold level, followed by a device signal 201 that is below the threshold level, the beacon 601 sends a signal to a data processing server 210.
  • the signal sent to the data processing server 210 contains the identification number of the beacon 601 that was read by the user's 502 tracking device 330 as well as the time the threshold level was met and the time the signal fell below the minimum threshold level.
  • the system 200 pulls the captured data from the on-site storage module 204 for the timeframes that the user 502 is within the threshold levels.
  • the system 200 processes the raw data using processing code 202 to make it readable on the user's 502 tracking device 330, however the data may be processed in a variety of ways depending on the computation environment.
  • the relevant video data is then pushed to a user's computing device 208; in one embodiment the user's computing device 208 includes, but is not limited to, a smartphone, tablet, laptop computer, desktop computer, PDA or any other like device.
  • FIG. 4A illustrates a system for processing, compiling and sending the data collected by the plurality of cameras 203 to a data processing module 210.
  • the data collected by the plurality of cameras 203 is sent to the data processing module 210.
  • the data processing environment 210 contains a data storage module 204 and a data processing and assembly module 202.
  • the data processing environment 210 also includes a data storage module for assembled sequences 206 and a communications interface 208 to transmit the assembled data.
  • Although they are shown as separate modules, one of ordinary skill in the art recognizes that the data storage module 204 and the data storage module for assembled sequences 206 can be combined into a single storage area.
  • One of ordinary skill in the art would also recognize that other physical or logical partitioning of storage can be applied.
  • FIG. 4B illustrates a flow diagram for one embodiment for image collection of the user 502 while in the target area 508.
  • the system 200 determines whether the user's tracking device 330 is receiving a device signal 201 from the beacon 601. If the system 200 receives a device signal 201, then at step 404 the system 200 determines whether the signal 201 received is above the predetermined threshold limit. When the signal received is above the threshold level, the system moves to step 406 where it records the time that the threshold level was reached and the level of the signal that was above the threshold level. At step 408 the highest signal strength received, as well as the time that it is received, is recorded. At step 410 the system records the time and the strength of the beacon signal when the event ends. At step 412 the event has ended and the time data as well as the signal strength data are sent to an off-site server 301. At step 414 video data is received on the user's computing device 208 from an off-site server 301.
  • FIG. 5A illustrates an embodiment of an exemplary hardware architecture for an off site server module 300 for use in gathering data of a user 502 while they are in a target area 508.
  • the user's tracking device 330 possesses multiple security procedures, including passing a Crypto API encryption protocol 316 and a secure authorization 304, before communicating with the off site server 301, however in other embodiments the system may have a plurality of security protocols or none.
  • the off site server 301 receives a plurality of data from the user's tracking device 330 which may include, but is not limited to, password locations, password configurations 302, BTLE events, social pushing and future processing 314.
  • a local server 324 collects data from the target area 508 and the recording units 132. Information from the local server 324, which is in data communication with off site server 301 is transmitted once the recording units 132 have captured the user 502 entering and exiting a target area 508.
  • the server 301 may send a plurality of data, including history data, content IDs, videos, payment history, account history 312, content streams, images, secure content, and ID based streams 312 to the tracking device 330.
  • the server 301 may also upload data to social media either automatically or at the request of the user 502.
  • the back end server 301 also may communicate with a variety of payment systems 318 such as PayPal through an internal system accounting program 320.
  • the system determines whether the accounting is up to date 332 and whether a secure authorization has been established 304. Once both have been confirmed, information collected during the activity is pushed to the queue 306. Data in the queue is then pushed out as events.
  • FIG. 5B illustrates one embodiment of a high level process flow of the data gathering system 300.
  • the beacon 601 communicates with the user's tracking device 330 preferably using Bluetooth ® signals.
  • an encryption system ensures that the user 502 is authorized to use the system 302 via secure authentication 304 using crypto-pass and privileges 316.
  • Encryption protocols 318 are also used to ensure that payments are safely processed and delivered to the accounting system 320.
  • Once the user 502 has passed the encryption protocols 318 they may access the events 310 that the system 302 pushes to the user's device 330.
  • the user interface features access to the user's account history, videos, photos, social media links, and BTLE Event controls 308.
  • when a tracking device 330 enters a target area 508 it will receive a signal from one of the beacons 601 in the recording units 132. Initially at step 332, when the accounting system 320 is up to date, the system 302 will determine at step 304 whether secure authorization between the user 502 and the system 302 has been established. Once the authorization has been established, the system 302 may push information, coupons and events to the user's tracking device 330 at steps 306 and 308. Additionally, the local server 324 may deliver the captured data of the user 502 in the target area 508 to the tracking device 330. BTLE events 310 are also monitored and received from the mobile device 114 and sent to the queue 106.
  • the captured data of the user 502 which may consist of photos and videos 338 are uploaded to the server 324 from the cameras 203; the server 324 may be accessed from the tracking device 330, where videos and images may be pulled and processed, converted to RAW Video content 334, and streamed to the consumer's tracking device 330.
  • FIG. 6 illustrates a diagram of one embodiment of a direction-shielded Bluetooth ® Low Energy (“BTLE”) Beacon 601.
  • An exterior shielded container 600 may be made of a material, such as tin, with more than a -10 dB differential, with a single opening at one end 608 to surround the beacon 601.
  • a BTLE emitter 606 is housed in an inner enclosure with its own power source 604.
  • the inside of the housing may be made of an insulating and reflective material to prevent any energy from the BTLE emitter 606 from escaping in any direction other than the opening at the end of the housing 608.
  • FIG. 7 illustrates an example of a computing module 700, which may be, for example, computing or processing capabilities found in desktop, laptop, and notebook computers; handheld computing devices; mainframes, supercomputers, workstations or servers; or any other type of special or general purpose computing devices as may be desirable or appropriate for a given application or environment.
  • Computing module 700 may also represent computing capabilities embedded within or otherwise available to a given device.
  • a computing module might be found in other electronic devices such as, for example, digital cameras, navigation systems, cellular telephones, modems, routers, WAPs, and any other electronic device that might include some form of processing capability.
  • Computing module 700 may include one or more processors or processing devices, such as a processor 704.
  • processor 704 might be implemented using a general-purpose or special purpose processing engine such as for example, a microprocessor, controller, or other control logic.
  • processor 704 is connected to a bus or other communication medium to facilitate interaction with other components of computing module 700.
  • Computing module 700 may also include one or more memory modules, referred to as main memory 708. Main memory 708, for example random access memory (“RAM”) or other dynamic memory, might be used for storing information and instructions to be executed by processor 704. Main memory 708 may also be used for storing temporary variables or other intermediate information during execution of instructions to be executed by the processor 704. Computing module 700 might likewise include a read only memory (“ROM”) or other static storage device coupled to bus 702 for storing static information and instructions for the processor 704.
  • the computing module 700 may also include one or more various forms of information storage mechanism 710, which may include but is not limited to a media drive 712 and a storage unit interface 720.
  • the media drive 712 may include a drive or other mechanism to support fixed or removable storage media 714 including but not limited to a hard disk drive, a floppy disk drive, a magnetic tape drive, an optical disk drive, a CD or DVD drive (R or RW) or other removable or fixed media drive.
  • storage media 714 might include a hard disk drive, a floppy disk drive, a magnetic tape drive, an optical disk drive, a CD or DVD drive or other removable or fixed media drive that is read by, written to, or accessed by the media drive 712.
  • the storage device 714 can include a computer usable storage medium having stored therein particular computer software or data.
  • information storage mechanism 710 may include other similar instrumentalities for allowing computer programs or other instructions or data to be loaded into computing module 700.
  • Such instrumentalities may include, but are not limited to, a fixed storage unit 722 and an interface 720.
  • Examples of such storage units 722 and interfaces 720 may include, but are not limited to, a program cartridge and cartridge interface, a removable memory (such as a flash memory or other memory module) and memory slot, a PCMCIA slot and card, and other fixed or removable storage units 722 and interfaces 720 that allow software and data to be transferred from the storage unit 722 to computing module 700.
  • Computing module 700 might also include a communications interface 724.
  • Communications interface 724 might be used to allow software and data to be transferred between computing module 700 and external devices including but not limited to a modem or soft modem, a network interface (such as Ethernet, network interface card, WiMedia, 802.XX or other interface), a communications port (for instance a USB port, IR port, RS232 port, Bluetooth® interface, or other port), or other communications interface.
  • Software and data transferred via communications interface 724 may typically be carried on signals which may be, but are not limited to, electronic, electromagnetic, optical or other signals capable of being exchanged by a given communications interface 724. These signals might be provided to communications interface 724 via a channel 728.
  • This channel 728 may carry signals and might be implemented using a wire or wireless medium including but not limited to a phone line, a cellular phone link, an RF link, an optical link, a network link, a local or wide area network, and other wired or wireless communications channels.
  • FIG. 8 illustrates a flow diagram for one embodiment of a method for a data capture smoothing system used by this embodiment to ensure that there is no misreading of any possible faulty data and to provide correct feedback.
  • the system is a feedback program designed to note the RSSI and compare it to more RSSI data as it is collected.
  • the system constantly monitors for a beacon signal 902.
  • Each beacon 601 has a unique beacon identification number that it transmits. Once the system detects a beacon 601 signal it determines if the device is already in an active event for the beacon identification number received 904. The system then posts the RSSI from the beacon 601 and the beacon identification number to the smoothing accumulator 928.
  • the smoothing accumulator 928 determines a “Flicker” value, which is used to determine if the beacon's RSSI over time is the appropriate value to start an event.
  • the smoothing accumulator 928 also generates a “Linger” value, which is used to determine if an event should be closed based on the minimum threshold RSSI value over time. (A code sketch of this smoothing and event logic appears after this list.)
  • the system determines whether the RSSI from the beacon 601 is above the threshold limit 918. If the RSSI value is below the threshold limit the system returns to monitoring for a beacon 902. If the RSSI is above the threshold the system determines whether the predetermined “squelch” time has passed 920. The smoothed value of the RSSI (sRSSI) is compared to the Flicker value, which is pulled from the Smoothing Accumulator 928. If the sRSSI value is below the Flicker value the system returns to monitoring for beacon signals 902 mode. When the sRSSI is above the Flicker value a new event activates and the system records the time, the sRSSI, and the beacon identification number 924. The system then continues to monitor for beacon signals 902.
  • the system continues to determine whether the device is in an active event for any beacon identification number it receives 904. If the system is in an active event the peak sRSSI value is recorded as well as the time the system has reached that value 906. The system compares the sRSSI value to the Linger value determined by the Smoothing Accumulator 928. If the sRSSI value is above the Linger value the system checks to see if a predetermined time limit has been reached 910. If the time limit has not been reached the system continues monitoring for beacon signals 902. If either the sRSSI value is below the Linger value or the event time limit has expired the event closes and the closing time is recorded 912.
  • the data collected is written to the queue 914.
  • Data in the queue is pushed to a server 916.
  • the push to the server is retried once per time period, however the push may be retried more or less frequently.
  • FIG. 9A illustrates an example of a graph of data that may be recorded when a user's tracking device 330 enters a target area 508.
  • In the graph, 806 is the strength of the sRSSI received by the user's tracking device 330.
  • the value of the entrance threshold limit 802 is recorded when the sRSSI reaches that value.
  • the time 804 that the threshold limit 802 is reached is recorded when that value is met.
  • FIG. 9B illustrates an example of a graph of data that may be recorded when a user device 330 enters the peak sRSSI value 808.
  • the time 810 is recorded when the peak sRSSI value 808 has been reached.
  • FIG. 9C illustrates a graph of data that may be recorded when a user device 330 approaches the exit threshold level 812.
  • the time 814, as well as the sRSSI value 812, when the exit threshold level 812 is reached is recorded by the system and the event closes. Once the event has closed the data collected is pushed to the queue and then pushed to the backend server.
  • FIG. 10 illustrates another embodiment of the invention.
  • the illustrative example allows the user to control whether they are recorded.
  • a directional recording device 132 is used, but other embodiments of beacons 601 may be used depending on the activity being captured and the target area 508.
  • the user 502 is holding their tracking device 330 in front of them but in other embodiments the user may have the device in their pocket or somewhere else on their person.
  • when the user's tracking device 330 is in the target area 508 the tracking device 330 notifies the user 502 with a pop up notification 900.
  • the notification is an image with the text “You Are in a Recording Area!”, however in other embodiments the notification 900 may vary, for instance using a variety of images, text, or video.
  • the tracking device 330 offers a button 902 for the user to push if they would like to be recorded.
  • FIG. 11 illustrates one mode of the BTLE/Accelerometer architecture where the person is the receiver of the signal.
  • the BTLE Movement Detection system 1010 has a Bluetooth ® Low Energy Transmitter 1012 which emits a transmission that is duly received by the BTLE receiver 1014.
  • the transmission received by the receiver is fed into the System I/O Bus 1020 that connects the sharing and caching component 1026 and the movement detection system 1010.
  • the System I/O Bus 1020 is also connected to the RAM 1016 and Custom Software 1018 combination, the accelerometer 1022, and the CPU.
  • the device allows a person to monitor for a beacon nearby and send the data via the Internet to the backend server 1030 for storage.
  • FIG. 12 is quite similar to FIG. 11, as a person wears both devices; however, it differs in that the person is now the transmitter instead of the receiver.
  • a transmission is sent from the Bluetooth® Low Energy Transmitter 1034 and received by the BTLE Transmitter and Receiver 1036.
  • the System I/O Bus 1042 connects the RAM 1038 and Custom Software 1040 combination with the Accelerometer 1044 and the CPU 1046, performing essentially the same functions as in FIG. 11.
  • the System I/O bus transfers information from the Movement Detection system.
  • this device turns a person into an active beacon. They transmit a low energy signal to nearby receivers and provide movement data. They may also include a receiver to look for nearby events to expedite processing.
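The smoothing and event flow described for FIG. 8 (steps 902 through 928 above) can be summarized in a short sketch. The Python below is a minimal illustration only: it assumes an exponential moving average for the smoothing accumulator and uses arbitrary placeholder values for the Flicker, Linger, and time-limit parameters; none of these names, values, or class structures come from the application itself.

```python
import time
from dataclasses import dataclass
from typing import Dict, List, Optional

# Placeholder parameters -- the application does not give concrete values.
FLICKER_DBM = -65.0         # smoothed RSSI needed to open (start) an event
LINGER_DBM = -75.0          # smoothed RSSI below which an open event closes
EVENT_TIME_LIMIT_S = 120.0  # maximum event duration before a forced close
ALPHA = 0.3                 # exponential-smoothing factor (an assumption)

@dataclass
class BeaconEvent:
    beacon_id: str
    start_time: float
    start_srssi: float
    peak_time: float
    peak_srssi: float
    end_time: Optional[float] = None

class SmoothingAccumulator:
    """Keeps a smoothed RSSI (sRSSI) per beacon to suppress signal flicker."""

    def __init__(self, alpha: float = ALPHA) -> None:
        self.alpha = alpha
        self.srssi: Dict[str, float] = {}

    def update(self, beacon_id: str, rssi: float) -> float:
        previous = self.srssi.get(beacon_id, rssi)
        smoothed = self.alpha * rssi + (1.0 - self.alpha) * previous
        self.srssi[beacon_id] = smoothed
        return smoothed

class EventTracker:
    """Opens, updates, and closes capture events from smoothed beacon readings."""

    def __init__(self) -> None:
        self.accumulator = SmoothingAccumulator()
        self.active: Dict[str, BeaconEvent] = {}
        self.closed: List[BeaconEvent] = []

    def on_beacon_reading(self, beacon_id: str, rssi: float,
                          now: Optional[float] = None) -> None:
        now = time.time() if now is None else now
        srssi = self.accumulator.update(beacon_id, rssi)
        event = self.active.get(beacon_id)
        if event is None:
            # No active event: open one once sRSSI clears the Flicker level.
            if srssi >= FLICKER_DBM:
                self.active[beacon_id] = BeaconEvent(
                    beacon_id, now, srssi, now, srssi)
            return
        # Active event: track the peak sRSSI and the time it occurred.
        if srssi > event.peak_srssi:
            event.peak_srssi, event.peak_time = srssi, now
        # Close on a Linger drop-out or when the event time limit expires.
        if srssi < LINGER_DBM or (now - event.start_time) > EVENT_TIME_LIMIT_S:
            event.end_time = now
            self.closed.append(self.active.pop(beacon_id))
```

In this sketch, feeding each (beacon, RSSI) reading into on_beacon_reading reproduces the open, track-peak, and close behavior described above; the closed events carry the start, peak, and end times that would be written to the queue and pushed to the server.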

Abstract

The present system and method is directed towards tracking a participant or activity and capturing that activity as well as the routes used during that activity. The invention is able to track where the participant is and react with a predetermined action such as taking photographs or video footage. Preferably the system will be able to track the participant over a large area to ensure that the entire event is captured. In this embodiment the cameras and the beacons are housed in the same unit which will be referred to as the recording unit, however other embodiments may feature separate cameras and beacons.

Description

INTERACTIVE SYSTEMS AND METHODS FOR DATA CAPTURE
CROSS REFERENCE TO RELATED APPLICATION
This application claims the benefit of and takes priority from United States Provisional Application Serial No. 61/978,474 filed on April 11, 2014, the contents of which are hereby incorporated by reference.
BACKGROUND OF THE INVENTION
Field of the Invention
The present systems and methods relate generally to data capture of an individual(s) or events, and more particularly to a real-time interactive data capture and tracking system based on activities across a range of environments.
Description of the Related Art
Systems and methods for automated photo/video capture and retrieval that utilize RFID (Radio Frequency Identification) tags containing a unique person and/or group identifier number currently exist. These systems allow for automated capturing and indexing of individual or group photo/video images according to each unique person and/or group identifier. Concurrent systems include tags which are worn by each park patron during their visit to the park or other entertainment facility. Various readers distributed throughout the park or entertainment facility are able to read the RFID tags. Thus, the unique identifier numbers can be conveniently read and provided to an associated photo/video capture system for purposes of providing indexing of captured photo/video images according to the unique identifiers of all individuals standing within the field of view of the camera.
SUMMARY OF THE INVENTION
The present apparatuses, systems and accompanying methods, as illustrated herein, are clearly not anticipated, rendered obvious, or even present in any of the prior art mechanisms, either alone or in any combination thereof. In one embodiment, an interactive system and method for data capture may utilize a plurality of data capture devices to track and capture an individual engaged in an activity in a defined environment.
The instant system and method is directed towards tracking a participant or activity and capturing that activity as well as the routes used during that activity. The system and accompanying apparatuses are able to track the location of the participant and react with a predetermined action, in one instance being taking photographs or video footage. Preferably the system will be configured to track the participant over a large area to ensure that the entire event is captured. In this embodiment, a series of cameras and a series of beacons are housed in the same unit which will be referred to as the recording unit. However, additional embodiments may feature separately housed cameras and beacons.
In one embodiment, the system detects when a user enters a target area through a user's tracking device by a plurality of beacons located throughout the target area. In one embodiment, the detection of the user by each beacon may occur by Bluetooth®, GPS or a combination of the systems. Additionally, once the system has detected that a user has entered a target area, the system determines the best time to capture the user while in the target area based on the user's movements in the target area. The system may capture the user through a plurality of recording units located in the target area, wherein the system may capture photographs, videos or a combination of both while the user is located within the target area. Following capture of the user in the target area, the system may process the data and subsequently send the captured photographs and videos directly to a user's device, which may be the tracking device itself or another separate user device.
In yet another embodiment, when a user enters a target area, the system determines and records the signal strength (RSSI) of the user's tracking device in relation to each beacon in the target area in order to approximate the range of the user from each beacon. Preferably, the system will continually detect and record the strength of the signal from the user's tracking device in relation to the beacon until a minimum threshold signal level is reached. Once the signal strength of the tracking device has reached the minimum predetermined level, the system then records this level as a start time for the beacon in the given target area. In a preferred embodiment, the user and the user's associated tracking device do not "trigger" a capture event (i.e. start capturing and then stop capturing) in the target area. Rather, the system analyzes the signal strength of the tracking device against the threshold levels set by the system to determine a start and stop capture period of the user in the target area.
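The application does not state how RSSI is converted into an approximate range; a common approximation (an assumption here, not taken from the specification) is the log-distance path-loss model:

```latex
% Log-distance path-loss model (a standard approximation, not from the application).
% d    : estimated range from the beacon
% d_0  : reference distance (often 1 m)
% P_0  : calibrated RSSI measured at d_0, in dBm
% RSSI : measured received signal strength, in dBm
% n    : path-loss exponent (about 2 in free space, larger in cluttered areas)
\[
  d \;=\; d_0 \cdot 10^{\frac{P_0 - \mathrm{RSSI}}{10\,n}}
\]
```

Under a model of this kind, the minimum threshold signal level described above corresponds to a maximum capture radius around each beacon.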
In yet another embodiment, the present system allows for tracking and data capture related to a user by (1) detecting that a person has entered an area of interest using BTLE sensors, (2) detecting how the person behaves in that detection area to calculate the best time to capture the event, and (3) avoiding fixed wiring by using a device carried upon the person of the user, wherein the device is configured to connect wirelessly with the system to capture the event in an open area without walls or buttons.
In one embodiment, the system utilizes Bluetooth® low energy sensors in conjunction with the present system's unique algorithm, which monitors a user's behavioral patterns within the target area in order to create a unique experience. This marriage of technologies may be accomplished by using a gyroscope and/or accelerometers to trigger an event, in conjunction with the beacon detection. For example, when a user enters an area of interest, pauses for a period of time, and then begins moving again, the system triggers an event and records video of that event to capture the 'moment of acceleration' or peak moment of movement.
In this embodiment, the movement may be qualified as merely a simple motion or it may be a sequence of events to capture an action. The monitoring of a sequence of events, instead of just a single event, as in concurrent systems, ensures that the best/exact moment to capture a video of the event is chosen. The advantage of using the accelerometer or gyroscope in this manner lies in the convenience of operation. In the present method, the user just does what they would normally do. Ergo, the user is never forced to stop, open the phone, press a button, and put the phone away in order to capture the best moment for a picture or video of the event. The system passively captures the best moments.
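As a rough illustration of the pause-then-move trigger described above, the sketch below combines beacon RSSI with accelerometer readings. All thresholds, names, and the sample format are hypothetical; the application does not specify them.

```python
import math
from typing import Iterable, Iterator, Tuple

# Placeholder thresholds -- not specified in the application.
IN_AREA_RSSI_DBM = -70.0  # beacon RSSI indicating the user is inside the area of interest
PAUSE_ACCEL_G = 0.15      # acceleration deviation from 1 g treated as "at rest"
MOVE_ACCEL_G = 0.60       # deviation treated as the start of motion
MIN_PAUSE_S = 2.0         # required pause length before a move can trigger capture

Sample = Tuple[float, float, float, float, float]  # (time, rssi, ax, ay, az)

def accel_deviation(ax: float, ay: float, az: float) -> float:
    """Deviation of the acceleration vector magnitude from 1 g (gravity)."""
    return abs(math.sqrt(ax * ax + ay * ay + az * az) - 1.0)

def pause_then_move_triggers(samples: Iterable[Sample]) -> Iterator[float]:
    """Yield capture-start timestamps from raw (time, rssi, accel) samples.

    A trigger fires when the user is inside the beacon's area, has paused
    for at least MIN_PAUSE_S, and then starts moving again, i.e. the
    "moment of acceleration" described above.
    """
    pause_started = None
    for t, rssi, ax, ay, az in samples:
        if rssi < IN_AREA_RSSI_DBM:
            pause_started = None  # user is not in the area of interest
            continue
        deviation = accel_deviation(ax, ay, az)
        if deviation <= PAUSE_ACCEL_G:
            if pause_started is None:
                pause_started = t
        elif deviation >= MOVE_ACCEL_G and pause_started is not None:
            if t - pause_started >= MIN_PAUSE_S:
                yield t  # begin recording the event at this moment
            pause_started = None
```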
In one aspect, the present invention provides a data capture system. The data capture system includes a plurality of recording units located within a target area. The target area is configured for data capture of an individual. The data capture system also includes a tracking device that is configured to communicate with each of the recording units. The tracking device is used to detect when an individual enters the target area for data capture. When the plurality of recording units captures data of the individual in the target area, the data is stored in an on-site storage module. There is also a data processing module, which is configured to pull the captured data from the on-site storage module. Then, the data processing module processes the captured data.
Each recording unit that is part of the data capture system includes a camera and a beacon housed within the recording unit. Each beacon is able to communicate with the tracking device through GPS, Bluetooth, or a combination thereof located within the user's computing device.
The data processing module also includes a time processing module. The time processing module is configured to identify an individual's entry time and an exit time within the target area. Additionally, the data processing module has an assembly module. The assembly module is configured to assemble the captured data into a plurality of video data. The video data can then be viewed on a user's computing device. Some computing devices a user may possess include a smartphone, tablet, laptop computer, desktop computer, PDA or any other computing device.
The data processing module may further include a computer program product for creating and sending an assembled video to the user's computing device from the captured data of the individual in the target area. The computer program may have a non-transitory computer usable medium. This usable medium may also have a computer readable program. The computer readable program causes the data processing module to capture data of an individual in the target area from the recording device. It is also capable of receiving a first signal above a minimum threshold and a second signal below the minimum threshold from the tracking device in the target area. Further, the computer readable program causes the data processing module to transmit the first and second signals to the data processing module in order to determine a timeframe when the individual was within the threshold levels in the target area. The data processing module also pulls the captured data from the on-site storage module to a time processing module in the data processing module for the timeframe the individual is within the threshold levels in the target area. The captured data is then assembled into a video by an assembly module and the assembled video is transmitted to the user's computing device.
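A minimal sketch of this server-side flow is shown below, assuming clips are held in the on-site storage module indexed by beacon identifier and start/end times. The StoredSegment layout and function names are illustrative assumptions rather than the application's actual implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class StoredSegment:
    """A clip recorded by a recording unit and held in on-site storage."""
    beacon_id: str
    start: float  # epoch seconds when the clip begins
    end: float    # epoch seconds when the clip ends
    path: str     # location of the clip in the on-site storage module

def pull_segments(storage: List[StoredSegment], beacon_id: str,
                  entry_time: float, exit_time: float) -> List[StoredSegment]:
    """Return the clips from this beacon's camera that overlap the timeframe
    during which the individual was within the threshold levels."""
    return [s for s in storage
            if s.beacon_id == beacon_id
            and s.end >= entry_time
            and s.start <= exit_time]

def process_event(storage: List[StoredSegment], beacon_id: str,
                  entry_time: float, exit_time: float) -> List[str]:
    """Use the first (above-threshold) and second (below-threshold) signal
    times as the timeframe, pull matching data, and hand ordered clip paths
    to the assembly step (see the assembly sketch further below)."""
    clips = pull_segments(storage, beacon_id, entry_time, exit_time)
    return [c.path for c in sorted(clips, key=lambda c: c.start)]
```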
The data processing module has a computer program product for capturing data of the individual while in the target area. The computer program may have a non-transitory computer usable medium. The non-transitory computer usable medium includes a computer readable program.
The computer readable program allows the data processing module to determine whether the tracking device is receiving a device signal from a beacon. It further determines whether the device signal is above a minimum threshold for signal strength and records a time when the device signal received is above the threshold limit. The time is further recorded when the device signal received is at a peak and when the device signal received is below the minimum threshold for signal strength. Then the data processing module transmits the time data and signal strength data to an off-site server module.
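The sketch below illustrates one way the recorded time data and signal strength data might be packaged and transmitted to an off-site server module. The endpoint URL and payload field names are hypothetical; only the entry/peak/exit structure follows the description above.

```python
import json
import urllib.request

# Hypothetical endpoint -- the application does not name one.
OFFSITE_SERVER_URL = "https://example.com/api/events"

def transmit_event(beacon_id: str, entry_time: float, entry_rssi: float,
                   peak_time: float, peak_rssi: float,
                   exit_time: float, exit_rssi: float) -> int:
    """Send the entry, peak, and exit times plus signal strengths
    recorded for one beacon to the off-site server module."""
    payload = {
        "beacon_id": beacon_id,
        "entry": {"time": entry_time, "rssi": entry_rssi},
        "peak": {"time": peak_time, "rssi": peak_rssi},
        "exit": {"time": exit_time, "rssi": exit_rssi},
    }
    request = urllib.request.Request(
        OFFSITE_SERVER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status  # e.g. 200 on success
```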
The data capture system may also include a local server. The local server is configured to collect data from the target area and the recording units. Additionally, the data capture system may have an off site server. The off site server is in data communication with the local server and is configured to receive the captured data of the individual entering and exiting the target area.
Another aspect of the present invention includes a computer program for creating a video sequence product. The program may have a computer usable medium having a computer program code recorded thereon and configured to cause a processing device to perform many different functions. Some of the functions include: capturing data from each of the recording units where each recording unit covers a specific target area defined by a range of a beacon in the recording unit; receiving time data of an individual entering and exiting the target area; retrieving the captured data corresponding to the time data of the individual in the target area; and assembling the captured data into a video sequence.
The receiving time data may also include determining an individual's entry time into a target area. The entry time is determined when the signal strength measured by a tracking device reaches a minimum threshold. The receiving time data further includes determining an individual's exit time out of the target area, which occurs when the signal strength of the tracking device falls below the minimum threshold.

Capturing data may include determining the location of the individual based on the tracking device and then comparing the determined location with the coverage areas of the plurality of recording units to determine which specific recording unit will capture data from the determined target area.

Assembling the captured data into a video sequence includes selecting captured data from the determined recording units for assembly into a video.
Storing the captured data from the plurality of recording units positioned within a target area may include storing the captured data as data segments and receiving timing information associated with the tracking device. Each data segment has an associated target area and timing data.

Another aspect of the present invention includes a method of capturing an individual in a target area. The method includes the steps of capturing a plurality of data of an individual in a target area by a plurality of recording units located within the target area. An on-site storage module then stores the captured data of the individual in the target area. The captured data is then pulled from the on-site storage module to a data processing module in order to process the captured data. An individual's entry time and exit time in and out of the target area are identified by a time processing module. The captured data of the individual between the entry and exit times is then assembled into video data by an assembly module. The assembled video is then transmitted by the assembly module to the individual's computing device for the individual to view.
Additional steps include the capturing of an individual in a target area by receiving a first signal above a minimum threshold and a second signal below the minimum threshold from the tracking device in the target area. The first and second signals are transmitted to the data processing module to determine a time frame of when the individual is within the threshold levels in the target area. The captured data is pulled from the on-site storage module to a time processing module in the data processing module for the timeframe that the individual is within the threshold levels in the target area. The captured data is then assembled into a video by an assembly module and the assembled video is then transmitted to the user's computing device.
Each recording unit has a camera and a beacon housed within the recording unit. Each beacon may communicate with the tracking device through GPS, Bluetooth or a combination thereof. The user's computing device may include a variety of devices including a smartphone, a tablet, a laptop computer, a desktop computer, a PDA or any other like device.
There has thus been outlined, rather broadly, the more important features of the versatile interactive systems, apparatuses and accompanying methods for data capture in order that the detailed description thereof that follows may be better understood, and in order that the present contribution to the art may be better appreciated. There are additional features of the invention that will be described hereinafter and which will form the subject matter of the claims appended hereto.
In this respect, before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction and to the arrangements of the components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments and of being practiced and carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting.
These, together with other objects of the invention, along with the various features of novelty which characterize the invention, are pointed out with particularity in the claims annexed to and forming a part of this disclosure. For a better understanding of the interactive systems, apparatuses and accompanying methods, the operating advantages and the specific objects attained by their usage, reference should be made to the accompanying drawings and descriptive matter in which there are illustrated preferred embodiments of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
Advantages of the present systems and methods will be apparent from the following detailed description of exemplary embodiments thereof, which description should be considered in conjunction with the accompanying drawings, in which:
Figure 1 illustrates a diagram of an exemplary configuration of one embodiment of the system utilized on a ski mountain;
Figure 2 illustrates a diagram of another exemplary configuration of one embodiment of the system on a ski mountain;
Figure 3A illustrates a perspective diagram of an individual user of the system entering a target area for one embodiment of data capture;
Figure 3B illustrates a perspective diagram of an individual user of the system exiting a target area for one embodiment of data capture;
Figure 3C illustrates a diagram of an exemplary configuration of one embodiment of the system utilized in an amusement park;
Figure 4A illustrates a block diagram of one embodiment of the hardware architecture utilized to capture, track and/or process a plurality of data of the individual user in a target area;
Figure 4B illustrates a flow diagram of a method to collect and process the captured data of an individual user of the system while in the target area;
Figure 5 illustrates a flow diagram of one embodiment of a method to process the data from an individual user of the system that has been collected while in the target area;
Figure 6 illustrates a block diagram of one embodiment of a tracking beacon to be utilized to determine the location of an individual user of the system;
Figure 7 illustrates a flow diagram of one embodiment of a method of smoothing a signal received from the tracking beacon;
Figure 8 illustrates a block diagram of an exemplary computing system to track and capture data of an individual user in a target area;
Figure 9A illustrates a diagram of a signal reception from a tracking beacon when the individual user enters a target area;
Figure 9B illustrates a diagram of a signal reception from a tracking beacon during peak signal strength when the individual is in the target area;
Figure 9C illustrates a diagram of a signal reception from a tracking beacon when the individual user exits the target area;
Figure 10 illustrates a diagram of one embodiment of a user controlled recording system to track and capture data;
Figure 11 illustrates a block diagram of the hardware architecture where an individual user of the system acts as a receiver to monitor for a tracking beacon in the vicinity of the individual user; and,
Figure 12 illustrates a block diagram of the hardware architecture where an individual user of the system is utilized as an active beacon for data communication with nearby receivers.
DETAILED DESCRIPTION OF THE SEVERAL EMBODIMENTS
For the following defined terms, these definitions shall be applied, unless a different definition is given in the claims or elsewhere in this specification. All numeric values are herein assumed to be modified by the term "about", whether or not explicitly indicated. The term "about" generally refers to a range of numbers that one of skill in the art would consider equivalent to the recited value (i.e., having the same function or result). In many instances, the term "about" may include numbers that are rounded to the nearest significant figure.
As used in this specification and the appended claims, the singular forms "a", "an", and "the" include plural referents unless the content clearly dictates otherwise. As used in this specification and the appended claims, the term "or" is generally employed in its sense including "and/or" unless the content clearly dictates otherwise.
The following description should be read with reference to the drawings wherein like reference numerals indicate like elements throughout the several views. The drawings, which are not necessarily to scale, depict illustrative embodiments of the claimed invention. In one embodiment, the present systems and methods relate to interactive data capture of an individual user. Preferably the system captures images of users who are partaking in a memorable activity. The system is particularly useful when the user is partaking in an activity that would make it difficult or impossible for them to capture data of themselves performing or engaged in the activity.
Target Area
FIG. 1A illustrates a diagram of one environment of a sample target area that may be utilized to deploy the present system. As shown in this embodiment, the target area is a ski mountain; however, in other embodiments the system may be utilized in environments including, but not limited to, amusement parks, recreational facilities such as golf courses and skate parks, music festivals, tourist attractions or any other suitable environment where individuals desire to obtain images, videos and/or any other data capture while engaged in observing and/or performing an activity in any number of environments such as the exemplary embodiments described above. In this embodiment, a target area 120 includes multiple trails along the side of a mountain. The trails in this embodiment include multiple ski lifts 124 supported by upright members 126 such as columns or poles. In this embodiment, there may be a variety of trails in the target area 120. These trails may feature obstacles such as moguls or jumps to heighten the excitement and challenge for skiers. Any or all of these trails may be skied by skiers as long as the skier possesses the requisite skill. As one with ordinary skill in the art would understand, alternative ski areas are not limited to the particular arrangement illustrated in this embodiment. Alternative ski areas may feature any of a variety of ski lifts, trails, and obstacles or other features, or may even span multiple ski areas. As such, in these areas it may be exceedingly difficult to capture content like video or photo recordings of an individual. Numerous variables such as participant speed, improvised paths, and obstacles make the task of capturing content even more challenging.
From time-to-time the present invention will be described in terms of the example environments set forth. These environments are described to provide context for the features in the embodiments described. After reading this description it will be clear to anyone skilled in the art how to implement the present invention in other environments.
The specification is directed toward a system and method for tracking a participant or activity and capturing that activity as well as the routes used during that activity. In one embodiment, the system is enabled to detect when a participant of the system has entered a target area, and then subsequently track that participant through the target area with a predetermined action such as taking photographs or video footage. Preferably the system will be able to track the participant over a large area to ensure that the entire event is captured.
FIG. 2 illustrates an embodiment of the system wherein a plurality of recording units 132 may be located within a given environment, in this embodiment along a ski trail, in order to capture an individual user' s activity while traversing the target area 120. In one embodiment, each recording unit 132 may be comprised of a plurality of cameras 203 (see Fig. 4A) and a plurality of corresponding beacons 601 (see FIG. 3A-3B) each corresponding pair of which are housed in the same recording unit 132. As discussed below however, additional embodiments may feature separate cameras 203 and beacons 601.
This embodiment is merely described as a rudimentary example designed to illustrate how a plurality of recording units 132 may be located in the environment to provide coverage and define the target area 120. After reading this description, one of ordinary skill in the art would be able to discern how alternative recording unit 132 configurations with similar or different environmental layouts and target areas 120 may be accomplished consistent with the principles of the systems described in the specification.
In the illustrative embodiment the plurality of cameras 203 and the plurality of beacons 601 may operate using technology including, but not limited to, Bluetooth® Low Energy (BTLE), Global Positioning Systems (GPS), and Radio Frequency Identification (RFID) or other similar tracking technology. In this embodiment the recording units 132 are both attached to the ski lift columns 126 and mounted onto other structures; however, they may be placed in other areas or be free standing. The recording units 132 may be arranged to suit the needs of the environment that they are operating in to provide suitable coverage of the target area 120; the dashed lines indicate the field of view of each recording unit 132. Each camera 203 may be arranged facing across the trail or towards the trail; in this environment, facing a camera 203 towards the trail may be the most beneficial positioning for the purposes of capturing photography or video of an individual user of the system. Conversely, facing a camera 203 perpendicular to the trail may make it more difficult to take photography due to the shorter opportunity timeframe.
Each recording unit 132 (or the beacon 601 and camera 203, if they are separate) may be mounted in a fixed position or with the ability to pan or otherwise move based on a predetermined algorithm. In this embodiment, it may be more important for the recording units 132 that are fixed to be positioned so that their field of view is open to the greatest area of the trail. Each camera 203 utilized in the system is not limited to any particular type of camera. For instance, different resolutions, lenses or other accessories may be attributed to each camera 203 to customize the user's experience to the area. The placement of the recording units 132 is not limited to a stationary position; for instance, recording units 132 may be placed on mobile platforms such as ski lifts, vehicles, or even other skiers.
Target Area Entry and Exit
Fig. 3A illustrates a diagram wherein a user 502 enters a target area 508 with a corresponding tracking device 330. The target area 508 may be defined by a range 504 of a beacon 601 and by a first edge 506 and second edge 507 caused by the beacon's shielding. In this embodiment a directional beacon 601 is utilized, but other embodiments of beacons 601 may be utilized depending on the activity being captured and the target area 508. In this embodiment the user 502 may be holding, wearing or carrying the tracking device 330 in front of them; in other embodiments the user 502 may have the tracking device 330 in their pocket or somewhere else on their person. In this embodiment the tracking device 330 is being utilized by a user 502; however, in other embodiments the device 330 may be attached to another object to capture the user 502 while they are in the target area 508.
As the user' s 502 tracking device 330 passes through the target area 508 the tracking device 330 receives a signal from the beacon 601. The system preferably will have a predetermined threshold level, which is the strength of the signal from the beacon 601 at which the camera 203 in the target area 508 of the user 502 may begin recording based on the signal strength data. Near the first edge 506 of the target area 508 the signal picked up by the tracking device 330 is near the threshold level to start a new event. The system may record the threshold time as well as the threshold level and the beacon identification number at this time. The system records the highest signal level from the beacon 601 as well as the time it is collected, which is referred to as the peak time (see below description) .
FIG. 3B illustrates a diagram of one embodiment of a user 502 and the tracking device 330 leaving the target area 508. The signal broadcast by the beacon 601 approaches the threshold level a second time at the second edge 507 of the target area 508. In this embodiment, when the signal received by the tracking device 330 falls below the threshold level this second time, the system closes the event; however, in other embodiments the event may be closed by other factors such as a predetermined ending time. When the event closes, the second threshold signal level and the time it is received are recorded, and all of the data collected is pushed to the queue; the data in the queue is then pushed to an off-site server 204.
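As a rough illustration of the event record that might be pushed to the queue when the event closes, the following sketch shows one possible shape for that data. The field names and the JSON serialization are assumptions made for this example, not the actual payload used by the system.

    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class BeaconEvent:
        beacon_id: str
        threshold_level: float   # RSSI at which the event opened
        threshold_time: float    # time the opening threshold was met
        peak_level: float        # highest RSSI observed in the target area
        peak_time: float
        exit_level: float        # RSSI at which the event closed
        exit_time: float

    def push_to_queue(queue: list, event: BeaconEvent) -> None:
        # The queued record is later forwarded to the off-site server.
        queue.append(json.dumps(asdict(event)))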
Figure 3C illustrates an embodiment of the system on a roller coaster. As the user's tracking device 330 enters the target area 504 it receives a signal from a beacon 601. The tracking device 330 then records the times that it is in the target area 504. A camera 203 records whenever a car passes through the area. Once the tracking device 330 no longer receives a signal from a beacon 601 it closes the event. The times that the tracking device 330 was in the target area 504 are then transmitted to an offsite server where the relevant video data is compiled and sent to the user 502.
BTLE Beacon
BTLE transmissions use a short burst transmission signal in a circular pattern, up to 300 feet in diameter, to uniquely connect with other BTLE devices. Bluetooth® operates between 2.4 and 2.485 GHz using spread spectrum frequency hopping. A connection is maintained as both the device and the transmitter use a pseudo-random code to "hop" between the same frequencies together, in sync, for pseudo-random amounts of time. To ensure that the device in this application is only connected when it is within a certain area, rather than anywhere in a three hundred sixty degree radius of the beacon 601, this invention initiates communication when the device is directly in front of the beacon 601.
Image Processing
FIG. 4A illustrates a block diagram of an embodiment of a system 200 for processing data captured from the plurality of cameras 203. In this embodiment the plurality of cameras 203 are always recording data of the target area 508; however, in other embodiments the plurality of cameras 203 may only be active when the user 502 is in the target area. In this embodiment the data recorded by the plurality of cameras 203 is preferably stored in an on-site storage module 204, however in other embodiments the place of storage may vary.
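Because the cameras record continuously, the on-site storage module would hold time-stamped segments that can later be matched against an individual's entry and exit times. A minimal sketch of such a rolling segment store follows; the file layout, segment length, and retention window are assumptions made purely for illustration.

    import os

    SEGMENT_SECONDS = 10            # assumed length of each stored clip
    RETENTION_SECONDS = 24 * 3600   # assumed on-site retention window

    def segment_path(storage_root, camera_id, start_epoch):
        # Segments are keyed by camera and by their start time in epoch seconds.
        return os.path.join(storage_root, camera_id, f"{int(start_epoch)}.mp4")

    def prune_old_segments(storage_root, camera_id, now_epoch):
        """Drop segments older than the retention window so the on-site
        storage rolls over instead of growing without bound."""
        cam_dir = os.path.join(storage_root, camera_id)
        if not os.path.isdir(cam_dir):
            return
        for name in os.listdir(cam_dir):
            stem = os.path.splitext(name)[0]
            if stem.isdigit() and now_epoch - int(stem) > RETENTION_SECONDS:
                os.remove(os.path.join(cam_dir, name))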
When the user's tracking device 330 receives an ID signal from the beacon 601 that is above the Received Signal Strength Indicator (RSSI) threshold, followed by a device signal 201 that is below the threshold level, the beacon 601 sends a signal to a data processing server 210. The signal sent to the data processing server 210 contains the identification number of the beacon 601 that was read by the user's 502 tracking device 330 as well as the time the threshold level was met and the time the signal fell below the minimum threshold level. The system 200 pulls the captured data from the on-site storage module 204 for the timeframes that the user 502 is within the threshold levels. In this embodiment the system 200 processes the raw data using processing code 202 to make it readable on the user's 502 tracking device 330; however, the data may be processed in a variety of ways depending on the computation environment. The relevant video data is then pushed to a user's computing device 208; in one embodiment the user's computing device 208 includes, but is not limited to, a smartphone, tablet, laptop computer, desktop computer, PDA or any other like device.
Image Collection
FIG. 4A also illustrates a system for processing, compiling and sending the data collected by the plurality of cameras 203 to a data processing module 210. The data collected by the plurality of cameras 203 is sent to the data processing module 210. In this embodiment the data processing environment 210 contains a data storage module 204 and a data processing and assembly module 202. The data processing environment 210 also includes a data storage module for assembled sequences 206 and a communications interface 208 to transmit the assembled data. Although they are shown as separate modules, one of ordinary skill in the art recognizes that the data storage module 204 and the data storage module for assembled sequences 206 can be combined into a single storage area. One of ordinary skill in the art would also recognize that other physical or logical partitioning of storage can be applied.
FIG. 4B illustrates a flow diagram of one embodiment for image collection of the user 502 while in the target area 508. Initially, at step 402 the system 200 determines whether the user's tracking device 330 is receiving a device signal 201 from the beacon 601. If the system 200 receives a device signal 201, then at step 404 the system 200 determines whether the signal 201 received is above the predetermined threshold limit. When the signal received is above the threshold level the system moves to step 406, where it records the time that the threshold level was reached and the level of the signal that was above the threshold level. At step 408 the highest signal strength received, as well as the time that it is received, is recorded. At step 410 the system records the time and the strength of the beacon signal when the event ends. At step 412 the event has ended and the time data as well as the signal strength data is sent to an off-site server 301. At step 414 video data is received on the user's computing device 208 from an off-site server 301.
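The following sketch condenses steps 402 through 412 into a single loop, purely as an illustration. The scan() source, the threshold value, and the send_to_server callable are hypothetical placeholders; step 414 (delivery of the video to the user's computing device 208) would occur later on the server side.

    import time

    THRESHOLD_DBM = -70   # assumed minimum RSSI needed to open an event

    def run_capture_flow(scan, send_to_server):
        event = None
        for beacon_id, rssi in scan():              # step 402: device signal received?
            now = time.time()
            if event is None:
                if rssi >= THRESHOLD_DBM:           # steps 404/406: open the event
                    event = {"beacon_id": beacon_id,
                             "start_time": now, "start_rssi": rssi,
                             "peak_time": now, "peak_rssi": rssi}
            else:
                if rssi > event["peak_rssi"]:       # step 408: track the peak
                    event["peak_rssi"], event["peak_time"] = rssi, now
                if rssi < THRESHOLD_DBM:            # step 410: close the event
                    event["end_time"], event["end_rssi"] = now, rssi
                    send_to_server(event)           # step 412: push time and signal data
                    event = None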
High-level Data Gathering System
FIG. 5A illustrates an embodiment of an exemplary hardware architecture for an off site server module 300 for use in gathering data of a user 502 while they are in a target area 508. Preferably, the user's tracking device 330 passes through multiple security procedures, including a Crypto API encryption protocol 316 and a secure authorization 304, before communicating with the off site server 301; however, in other embodiments the system may have a plurality of security protocols or none. In this embodiment, the off site server 301 receives a plurality of data from the user's tracking device 330 which may include, but is not limited to, password locations, password configurations 302, BTLE events, social pushing and future processing 314. A local server 324 collects data from the target area 508 and the recording units 132. Information from the local server 324, which is in data communication with the off site server 301, is transmitted once the recording units 132 have captured the user 502 entering and exiting a target area 508.
Further, the server 301 may send a plurality of data, including history data, content IDs, videos, payment history, account history 312, content streams, images, secure content, and ID based streams 312 to the tracking device 330. The server 301 may also upload data to social media either automatically or at the request of the user 502. The back end server 301 may also communicate with a variety of payment systems 318, such as PayPal, through an internal system accounting program 320. In this embodiment, the system determines whether the accounting is up to date 332 and whether a secure authorization has been established 304. Once both have been confirmed, information collected during the activity is pushed to the queue 306. From the queue, data is pushed to social media platforms 336, coupons and data are pushed to events 306, and videos and images are pulled from the server 301 and processed 332. Raw video and images are pushed 334 to the user's tracking device 330. Events 310 are also transmitted to the user's tracking device 330.
FIG. 5B illustrates one embodiment of a high level process flow of the data gathering system 300. In this embodiment of the system 302, the beacon 601 communicates with the user's tracking device 330 preferably using Bluetooth® signals. This embodiment features an encryption system to ensure that the user 502 is authorized to use the system 302 via secure authentication 304 using crypto-pass and privileges 316. Encryption protocols 318 are also used to ensure that payments are safely processed and delivered to the accounting system 320. Once the user 502 has passed the encryption protocols 318 they may access the events 310 that the system 302 pushes to the user's device 330. In this embodiment the user interface features access to the user's account history, videos, photos, social media links, and BTLE Event controls 308. Preferably, when a tracking device 330 enters a target area 508 it will receive a signal from one of the beacons 601 in the recording units 132. Initially at step 332, when the accounting system 320 is up to date, the system 302 will determine at step 304 whether secure authorization between the user 502 and the system 302 has been established. Once the authorization has been established, the system 302 may push information, coupons and events to the user's tracking device 330 at steps 306 and 308. Additionally, the local server 324 may deliver the captured data of the user 502 in the target area 508 to the tracking device 330. BTLE events 310 are also monitored and received from the mobile device 114 and sent to the queue 106. In one embodiment, the captured data of the user 502, which may consist of photos and videos 338, is uploaded to the server 324 from the cameras 203; the server 324 may be accessed from the tracking device 330, where videos and images may be pulled and processed, converted to RAW video content 334, and streamed to the consumer's tracking device 330.
FIG. 6 illustrates a diagram of one embodiment of a direction-shielded Bluetooth® Low Energy ("BTLE") Beacon 601. An exterior shielded container 600 may be made of a material, such as tin, that provides more than a -10 dB differential, with a single opening at one end 608 to surround the beacon 601. A BTLE emitter 606 is housed in an inner enclosure with its own power source 604. The inside of the housing may be made of an insulating and reflective material to prevent any energy from the BTLE emitter 606 from escaping in any direction other than the opening at the end of the housing 608.
Computing Module
FIG. 7 illustrates an example of a computing module 700, which may be, for example, computing or processing capabilities found in desktop, laptop, and notebook computers; handheld computing devices; mainframes, supercomputers, workstations or servers; or any other type of special or general purpose computing devices as may be desirable or appropriate for a given application or environment. Computing module 700 may also represent computing capabilities embedded within or otherwise available to a given device. For example, a computing module might be found in other electronic devices such as, for example, digital cameras, navigation systems, cellular telephones, modems, routers, WAPs, and any other electronic device that might include some form of processing capability.
Computing module 700 may include one or more processors or processing devices, such as a processor 704. Processor 704 might be implemented using a general-purpose or special purpose processing engine such as, for example, a microprocessor, controller, or other control logic. In the example illustrated in FIG. 7, processor 704 is connected to a bus or other communication medium to facilitate interaction with other components of computing module 700.
Computing module 700 may also include one or more memory modules, referred to as main memory 708. Main memory 708, preferably random access memory ("RAM") or other dynamic memory, might be used for storing information and instructions to be executed by processor 704. Main memory 708 may also be used for storing temporary variables or other intermediate information during execution of instructions to be executed by the processor 704. Computing module 700 might likewise include a read only memory ("ROM") or other static storage device coupled to bus 702 for storing static information and instructions for the processor 704.
The computing module 700 may also include one or more various forms of information storage mechanism 710, which may include but is not limited to a media drive 712 and a storage unit interface 720. The media drive 712 may include a drive or other mechanism to support fixed or removable storage media 714, including but not limited to a hard disk drive, a floppy disk drive, a magnetic tape drive, an optical disk drive, a CD or DVD drive (R or RW) or other removable or fixed media drive. Accordingly, storage media 714 might include a hard disk, a floppy disk, a magnetic tape, an optical disk, a CD or DVD, or other removable or fixed media that is read by, written to, or accessed by the media drive 712. As these examples illustrate, the storage media 714 can include a computer usable storage medium having stored therein particular computer software or data.
In alternative embodiments, information storage mechanism 710 may include other similar instrumentalities for allowing computer programs or other instructions or data to be loaded into computing module 700. Such instrumentalities may include, but are not limited to, a fixed storage unit 722 and an interface 720. Examples of such storage units 722 and interfaces 720 may include, but are not limited to, a program cartridge and cartridge interface, a removable memory (such as a flash memory or other memory module) and memory slot, a PCMCIA slot and card, and other fixed or removable storage units 722 and interfaces 720 that allow software and data to be transferred from the storage unit 722 to computing module 700.
Computing module 700 might also include a communications interface 724. Communications interface 724 might be used to allow software and data to be transferred between computing module 700 and external devices, including but not limited to a modem or soft modem, a network interface (such as an Ethernet, network interface card, WiMedia, 802.xx or other interface), a communications port (for instance a USB port, IR port, RS232 port, Bluetooth® interface, or other port), or other communications interface. Software and data transferred via communications interface 724 may typically be carried on signals, which may be, but are not limited to, electronic, electromagnetic, optical or other signals capable of being exchanged by a given communications interface 724. These signals might be provided to communications interface 724 via a channel 728. This channel 728 may carry signals and might be implemented using a wired or wireless medium including but not limited to a phone line, a cellular phone link, an RF link, an optical link, a network link, a local or wide area network, and other wired or wireless communications channels.
Smoothing System
FIG. 8 illustrates a flow diagram for one embodiment of a method for a data capture smoothing system used by this embodiment to ensure that there is no misreading of any possible faulty data and to provide correct feedback. The system is a feedback program designed to note the RSSI and compare it to more RSSI data as it is collected. The system constantly monitors for a beacon signal 902. Each beacon 601 has a unique beacon identification number that it transmits. Once the system detects a beacon 601 signal it determines if the device is already in an active event for the beacon identification number received 904. The system then posts the RSSI from the beacon 601 and the beacon identification number to the smoothing accumulator 928. The smoothing accumulator 928 determines a "Flicker" value, which is used to determine if the beacon's RSSI over time is the appropriate value to start an event. The smoothing accumulator 928 also generates a "Linger" value, which is used to determine if an event should be closed based on the minimum threshold RSSI value over time.
If the system is not already in an active event the system determines whether the RSSI from the beacon 601 is above the threshold limit 918. If the RSSI value is below the threshold limit the system returns to monitoring for a beacon 902. If the RSSI is above the threshold the system determines whether the predetermined "squelch" time has passed 920. The smoothed value of the RSSI (sRSSI) is compared to the Flicker value, which is pulled from the Smoothing Accumulator 928. If the sRSSI value is below the Flicker value the system returns to the monitoring for beacon signals 902 mode. When the sRSSI is above the Flicker value a new event activates and the system records the time, the sRSSI, and the beacon identification number 924. The system then continues to monitor for beacon signals 902.
The system continues to determine whether the device is in an active event for any beacon identification number it receives 904. If the system is in an active event the peak sRSSI value is recorded as well as the time the system has reached that value 906. The system compares the sRSSI value to the Linger value determined by the Smoothing Accumulator 928. If the sRSSI value is above the Linger value the system checks to see if a predetermined time limit has been reached 910. If the time limit has not been reached the system continues monitoring for beacon signals 902. If either the sRSSI value is below the Linger value or the event time limit has expired the event closes and the closing time is recorded 912.
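One way the smoothing accumulator 928 might derive the sRSSI and its Flicker/Linger comparisons is sketched below, purely as an illustration. The exponential moving average and the example threshold values are assumptions, not the formula prescribed by the specification.

    class SmoothingAccumulator:
        def __init__(self, alpha=0.3, flicker=-68.0, linger=-75.0):
            self.alpha = alpha        # weight given to each new RSSI sample
            self.flicker = flicker    # sRSSI must exceed this to open an event
            self.linger = linger      # event closes once sRSSI drops below this
            self.srssi = None

        def update(self, rssi):
            """Fold a new RSSI sample into the smoothed value (sRSSI)."""
            if self.srssi is None:
                self.srssi = rssi
            else:
                self.srssi = self.alpha * rssi + (1 - self.alpha) * self.srssi
            return self.srssi

        def should_open(self):
            return self.srssi is not None and self.srssi > self.flicker

        def should_close(self):
            return self.srssi is not None and self.srssi < self.linger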
When an event has been closed the data collected is written to the queue 914. Data in the queue is pushed to a server 916. In this embodiment the push to the server is retried once per time period; however, the push may be retried more or less frequently.
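A minimal sketch of this queue-and-retry behavior follows, assuming a scheduler that calls the flush routine once per time period. The post_to_server callable and the retry policy details are assumptions for illustration only.

    def flush_queue(queue, post_to_server):
        """One flush attempt; any record that fails to send stays queued
        for the next period, matching the retry-once-per-period behavior."""
        remaining = []
        for record in queue:
            try:
                post_to_server(record)
            except Exception:
                remaining.append(record)
        queue[:] = remaining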
FIG. 9A illustrates an example of a graph of data that may be recorded when a user' s tracking device 330 enters a target area 508. In this example, 806 is the strength of the sRSSI received by the user's tracking device 330. The value of the entrance threshold limit 802 is recorded when the sRSSI reaches that value. The time 804 that the threshold limit 802 is reached is recorded when that value is met.
FIG. 9B illustrates an example of a graph of data that may be recorded when a user device 330 enters the peak sRSSI value 808. The time 810 is recorded when the peak sRSSI value 808 has been reached.
FIG. 9C illustrates a graph of data that may be recorded when a user device 330 approaches the exit threshold level 812. The time 814, as well as the sRSSI value, when the exit threshold level 812 is reached is recorded by the system and the event closes. Once the event has closed the data collected is pushed to the queue and then pushed to the backend server.
User Controlled Recording
FIG. 10 illustrates another embodiment of the invention. The illustrative example allows the user to control whether they are recorded. In this embodiment a directional recording unit 132 is used, but other embodiments of beacons 601 may be used depending on the activity being captured and the target area 508. In this embodiment the user 502 is holding their tracking device 330 in front of them, but in other embodiments the user may have the device in their pocket or somewhere else on their person.
Thus, when the user's tracking device 330 is in the target area 508, the tracking device 330 notifies the user 502 with a pop up notification 900. In this embodiment the notification is an image with the text "You Are in a Recording Area!"; however, in other embodiments the notification 900 may vary, for instance using a variety of images, text, or video. When the user's tracking device 330 is in the target area 508, the tracking device 330 offers a button 902 for the user to push if they would like to be recorded.
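A simple sketch of this opt-in gate follows. The notify, ask_to_record, and start_event callables are hypothetical placeholders standing in for the pop up notification 900, the button 902, and the event logic described above.

    def on_enter_target_area(notify, ask_to_record, start_event):
        notify("You Are in a Recording Area!")   # pop up notification 900
        if ask_to_record():                      # button 902: user opts in
            start_event()
        # If the user declines, no event is opened for this individual.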
FIG. 11 illustrates one mode of the BTLE/Accelerometer architecture where the person is the receiver of the signal. The BTLE Movement Detection system 1010 has a Bluetooth® Low Energy Transmitter 1012 which emits a transmission that is duly received by the BTLE receiver 1014. The transmission received by the receiver is fed into the System I/O Bus 1020 that connects the sharing and caching component 1026 and the movement detection system 1010. The System I/O Bus 1020 is also connected to the RAM 1016 and Custom Software 1018 combination, the accelerometer 1022, and the CPU 1024.
These additional components allow the software to perform its functionality on what is received from the BTLE Receiver 1014, the accelerometer's 1022 measure of the detected acceleration, and the CPU 1024 to ensure the proper output is shared with the sharing and caching component 1026. Ultimately, the device allows a person to monitor for a beacon nearby and send the data via the Internet to the backend server 1030 for storage.
FIG. 12 is quite similar to FIG. 11 in that a person wears both; however, it differs in that the person is now the transmitter instead of the receiver. A transmission is emitted from the Bluetooth® Low Energy Transmitter 1034 and received by the BTLE Transmitter and Receiver 1036. The System I/O Bus 1042 connects the RAM 1038 and Custom Software 1040 combination with the Accelerometer 1044 and the CPU 1046, performing essentially the same functions as in FIG. 11.
The System I/O Bus transfers information from the Movement Detection System 1032 components RAM 1038, Custom Software 1040, Accelerometer 1044, and CPU 1046 to the Sharing and Caching System 1048 and back to the BTLE Transmitter and Receiver 1036. The BTLE Transmitter and Receiver 1036 then transmits the output to the BTLE Receiver 1050 in the sharing and caching system, which stores the information to the backend server 1054 via an internet connection 1052. In short, this device turns a person into an active beacon. They transmit a low energy signal to nearby receivers and provide movement data. They may also include a receiver to look for nearby events to expedite processing.
In conclusion, herein is presented an interactive system and method for data capture. The invention is illustrated by example in the flow diagrams and figures, and throughout the written description. It should be understood that numerous variations are possible, while adhering to the inventive concept. Such variations are contemplated as being a part of the present invention.

Claims

CLAIMS
What is claimed is:
1. A data capture system comprising:
a plurality of recording units located within a target area configured for data capture of an individual;
a tracking device, wherein the tracking device is configured to communicate with each recording unit located in the target area to detect when the individual enters the target area for data capture ;
an on-site storage module configured to store the captured data of the individual in the target area by the plurality of recording units;
a data processing module configured to pull the captured data from the on-site storage module to process the captured data, wherein the data processing module further comprises:
a time processing module configured to identify an entry time and an exit time of the individual in the target area; and
an assembly module configured to assemble the captured data into a plurality of video data to be viewed on a user' s computing device.
2. The data capture system of claim 1, wherein each recording unit comprises a camera and a beacon housed within the recording unit.
3. The data capture system of claim 2, wherein each beacon communicates with the tracking device through GPS, Bluetooth, or a combination thereof.
4. The data capture system of claim 1, wherein the user's computing device is selected from the group consisting of: a smartphone, a tablet, a laptop computer, a desktop computer, a PDA or any other like device.
5. The data capture system of claim 1, wherein the tracking device is located within the user's computing device.
6. The data capture system of claim 1, wherein the data processing module includes a computer program product for creating and sending an assembled video to the individual's computing device from the captured data of the individual in the target area, the computer program comprising a non-transitory computer usable medium including a computer readable program, wherein the computer readable program causes the data processing module to perform the functions of:
capturing data of an individual in the target area from the recording devices; receiving a first signal above a minimum threshold and a second signal below the minimum threshold from the tracking device in the target area;
transmitting the first and second signals to the data processing module to determine a timeframe when the individual is within the threshold levels in the target area;
pulling the captured data from the on-site storage module to a time processing module in the data processing module for the timeframe the individual is within the threshold levels in the target area;
assembling the captured data into a video by an assembly module in the data processing module; and
transmitting the assembled video to the user' s computing device .
7. The data capture system of claim 1, wherein the data processing module includes a computer program product for capturing data of the individual while in the target area, the computer program comprising a non-transitory computer usable medium including a computer readable program, wherein the computer readable program causes the data processing module to perform the functions of:
determining whether the tracking device is receiving a device signal from a beacon; determining whether the device signal is above a minimum threshold for signal strength;
recording a time when the device signal received is above the threshold limit;
recording a time when the device signal received is at a peak; recording a time when the device signal received is below the minimum threshold for signal strength;
transmitting the time data and signal strength data to an off-site server module.
8. The data capture system of claim 1 further comprising:
a local server, wherein the local server is configured to collect data from the target area and the recording units; and an off site server, wherein the off site server is in data communication with the local server and configured to receive the captured data of the individual entering and exiting the target area .
9. A computer program for creating a video sequence product, wherein the program includes a computer usable medium having computer program code recorded thereon and configured to cause a processing device to perform the functions of: capturing data from a plurality of recording units, wherein each recording unit covers a target area defined by a range of a beacon in the recording unit;
receiving time data of an individual entering and exiting the target area;
retrieving the captured data corresponding to the time data of the individual in the target area; and
assembling the captured data into a video sequence.
10. The computer program product of claim 9, wherein receiving time data comprises:
determining an entry time into a target area by the individual once the signal strength of a tracking device reaches a minimum threshold; and
determining an exit time out of the target area by the individual once the signal strength of the tracking device falls below the minimum threshold.
11. The computer program product of claim 9, wherein capturing data comprises:
determining a location of the individual based on the tracking device and comparing the determined location with the coverage areas of the plurality of recording units to determine which recording units captured data from the determined target area.
12. The computer program product of claim 9, wherein assembling comprises :
selecting captured data from the determined recording units for assembly into a video.
13. The computer program of claim 9, wherein storing captured data from a plurality of recording units positioned within a target area comprises:
storing the captured data as data segments each having an associated target area and timing data;
and receiving timing information associated with the tracking device .
14. A method of capturing an individual in a target area comprising the steps of:
capturing a plurality of data of an individual in a target area by a plurality of recording units located within the target area;
storing the captured data of the individual in the target area by an on-site storage module;
pulling the captured data from the on-site storage module to process the captured data, to a data processing module; identifying an entry time and an exit time of the individual in the target area by a time processing module;
assembling the captured data of the individual between the entry and exit time in the target area into a video data by an assembly module; and
transmitting the assembled video to the individual's computer device for viewing by the data processing module.
15. The method of capturing an individual in a target area of claim 14 further comprising the steps of:
receiving a first signal above a minimum threshold and a second signal below the minimum threshold from the tracking device in the target area;
transmitting the first and second signals to the data processing module to determine a timeframe when the individual is within the threshold levels in the target area;
pulling the captured data from the on-site storage module to a time processing module in the data processing module for the timeframe the individual is within the threshold levels in the target area;
assembling the captured data into a video by an assembly module in the data processing module; and
transmitting the assembled video to the user's computing device.
16. The method of capturing an individual in a target area of claim 14, wherein each recording unit comprises a camera and a beacon housed within the recording unit.
17. The method of capturing an individual in a target area of claim 16, wherein each beacon communicates with the tracking device through GPS, Bluetooth, or a combination thereof.
18. The method of capturing an individual in a target area of claim 14, wherein the user's computing device is selected from the group consisting of: a smartphone, a tablet, a laptop computer, a desktop computer, a PDA or any other like device.
PCT/US2015/025548 2014-04-11 2015-04-13 Interactive systems and methods for data capture WO2015157757A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201461978474P 2014-04-11 2014-04-11
US61/978,474 2014-04-11

Publications (2)

Publication Number Publication Date
WO2015157757A2 true WO2015157757A2 (en) 2015-10-15
WO2015157757A3 WO2015157757A3 (en) 2015-11-26

Family

ID=54288564

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/025548 WO2015157757A2 (en) 2014-04-11 2015-04-13 Interactive systems and methods for data capture

Country Status (1)

Country Link
WO (1) WO2015157757A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019133755A1 (en) * 2017-12-27 2019-07-04 General Electric Company Automated scope limiting for video analytics

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5075670A (en) * 1990-08-01 1991-12-24 Digital Products Corporation Personnel monitoring tag with tamper detection and secure reset
US20050177859A1 (en) * 2004-02-09 2005-08-11 Valentino Henry Iii Video surveillance system and methods of use and doing business
US8275767B2 (en) * 2009-08-24 2012-09-25 Xerox Corporation Kiosk-based automatic update of online social networking sites
US8757477B2 (en) * 2011-08-26 2014-06-24 Qualcomm Incorporated Identifier generation for visual beacon

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019133755A1 (en) * 2017-12-27 2019-07-04 General Electric Company Automated scope limiting for video analytics
US11153474B2 (en) 2017-12-27 2021-10-19 Ubicquia Iq Llc Automated scope limiting for video analytics
US11917325B2 (en) 2017-12-27 2024-02-27 Ubicquia Iq Llc Automated scope limiting for video analytics

Also Published As

Publication number Publication date
WO2015157757A3 (en) 2015-11-26

Similar Documents

Publication Publication Date Title
US11847589B2 (en) Virtual queuing system and method
US9307351B2 (en) Near field communication system, and method of operating same
US11553126B2 (en) Systems and methods to control camera operations
US10453067B2 (en) Short range wireless translation methods and systems for hands-free fare validation
CN106067045B (en) For collecting method, the equipment, wireless computer device of information in facility
EP2905953A1 (en) Content acquisition device, portable device, server, information processing device and storage medium
US10887553B2 (en) Monitoring system and monitoring method
CN105264827A (en) Portable platform for networked computing
US20150189176A1 (en) Domain aware camera system
JP6755480B2 (en) Matching method, intellectual interactive experience system and intellectual interaction system
US20150161449A1 (en) System and method for the use of multiple cameras for video surveillance
EP2786549A2 (en) Gymnastic machine with data exchange by means of a short range communication channel and training system using such machine
CN106488316A (en) Media playing method and device, electronic equipment
JP6650936B2 (en) Camera control and image streaming
Nguyen et al. IdentityLink: user-device linking through visual and RF-signal cues
US11132880B2 (en) System for tracking the location of people
CN103634547A (en) Client device, server, and storage medium
JP2009152733A (en) Person specifying system, person specifying device, person specifying method, and person specifying program
WO2015157757A2 (en) Interactive systems and methods for data capture
KR20170011755A (en) Apparatus and Method for video surveillance based on Beacon
US20180121737A1 (en) System and method for associating an identifier of a mobile communication terminal with a person-of-interest, using video tracking
CN106203234A (en) Object identifying and searching system and method
KR102161815B1 (en) Disc golf basket combined with cctv and intelligent system for photographing disc golf image using thereof
CN109960995A (en) A kind of exercise data determines system, method and device
CH718437A2 (en) Method and system for controlling visitors in an access zone inside an amusement park.

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15777031

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15777031

Country of ref document: EP

Kind code of ref document: A2