WO2023182901A1 - Targeting aid system, device, and/or method - Google Patents

Targeting aid system, device, and/or method

Info

Publication number
WO2023182901A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
sensor
target
firearm
targeting aid
Application number
PCT/RU2022/000090
Other languages
French (fr)
Inventor
Alexander Vladimirovich VALENCIA
Vadim Borisovich LINETSKIY
Vitaliy Vladimirovich SHIRYAEV
Alexander Fink
Original Assignee
Ai Smart Kinematics Ltd
Application filed by Ai Smart Kinematics Ltd
Priority to PCT/RU2022/000090
Publication of WO2023182901A1


Classifications

    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 - WEAPONS
    • F41A - FUNCTIONAL FEATURES OR DETAILS COMMON TO BOTH SMALLARMS AND ORDNANCE, e.g. CANNONS; MOUNTINGS FOR SMALLARMS OR ORDNANCE
    • F41A 33/00 - Adaptations for training; Gun simulators
    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 - WEAPONS
    • F41C - SMALLARMS, e.g. PISTOLS, RIFLES; ACCESSORIES THEREFOR
    • F41C 27/00 - Accessories; Details or attachments not otherwise provided for
    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 - WEAPONS
    • F41G - WEAPON SIGHTS; AIMING
    • F41G 3/00 - Aiming or laying means
    • F41G 3/26 - Teaching or practice apparatus for gun-aiming or gun-laying
    • F41G 3/2605 - Teaching or practice apparatus for gun-aiming or gun-laying using a view recording device cosighted with the gun

Definitions

  • the present disclosure relates generally to systems, devices, and/or methods that may operate to provide training to a user of a firearm so as to improve the user’s performance with respect to hitting targets in flight via projectiles fired or discharged from the firearm.
  • a user may wish to train with an intention of improving shooting performance.
  • improving a user’s shooting performance may be complicated by several factors.
  • success in hitting a target in flight with one or more projectiles discharged from a firearm may depend, at least in part, on multiple events and/or factors that may occur over the course of a relatively very short period of time. Also, for example, analyzing and/or reviewing such events may be difficult post-event. Factors that may contribute to the success, or lack thereof, of hitting a target in flight with one or more projectiles discharged from a firearm may include, for example, where the user is aiming at particular points in time, how the user manipulates the firearm, direction and/or speed of the target, weather conditions, user’s condition (e.g., a pulse rate, etc.), or the like.
  • Because a target in flight and one or more projectiles fired or discharged from a firearm may each travel at relatively higher speeds, it may be difficult for an instructor, for example, to provide real-time and/or near real-time instructions to a user as to how to improve a probability of hitting the target.
  • Post-event analysis and/or instruction may also be problematic and/or less effective and/or less efficient, as mentioned. Accordingly, training aids directed to improving a user’s performance with respect to hitting targets in flight using a firearm continue to be an active area of investigation and/or development.
  • FIG. 1A is a diagram illustrating potential use of an example targeting aid system and/or device, according to an embodiment
  • FIG. 1B is a diagram illustrating potential use of an example targeting aid system and/or device in a hunting context, according to an embodiment
  • FIG. 2 is a schematic diagram illustrating an example communications infrastructure including an example targeting aid system and/or device, in accordance with an embodiment
  • FIG. 3 is a schematic block diagram depicting an example targeting aid device, according to an embodiment
  • FIG. 4A is an illustration depicting a perspective view of an example targeting aid device, according to an embodiment
  • FIG. 4B is an illustration depicting a front (i.e., user’s perspective) view of an example targeting aid device, according to an embodiment
  • FIG. 5 is a flow diagram depicting an example process, at a targeting aid system and/or device, for providing feedback to a user based at least in part on signals and/or signal packets obtained from at least one sensor of the targeting aid system and/or device, according to an embodiment
  • FIG. 6 is a block diagram depicting an example processing pipeline of an example targeting aid system and/or device, according to an embodiment
  • FIG. 7 is a flow diagram depicting an example process, at a targeting aid system and/or device, for processing sensor content to provide feedback to a user, according to an embodiment
  • FIG. 8 is a flow diagram depicting an example process, at a targeting aid system and/or device, for processing image sensor content, according to an embodiment
  • FIG. 9 is a flow diagram depicting an example process, at a targeting aid system and/or device, for processing inertial motion unit content, according to an embodiment
  • FIG. 10 is a block diagram depicting an example state diagram for an example targeting aid system and/or device, according to an embodiment
  • FIG. 11 is a flow diagram depicting an example process for determining barrel movement, according to an embodiment
  • FIG. 12 is a schematic diagram illustrating an implementation of an example computing environment associated with processes to facilitate multi-party and/or delegated computing according to an embodiment.
  • references throughout this specification to one implementation, an implementation, one embodiment, an embodiment, and/or the like means that a particular feature, structure, characteristic, and/or the like described in relation to a particular implementation and/or embodiment is included in at least one implementation and/or embodiment of claimed subject matter.
  • appearances of such phrases in various places throughout this specification are not necessarily intended to refer to the same implementation and/or embodiment or to any one particular implementation and/or embodiment.
  • particular features, structures, characteristics, and/or the like described are capable of being combined in various ways in one or more implementations and/or embodiments and, therefore, are within intended claim scope.
  • these and other issues have a potential to vary in a particular context of usage.
  • particular context of description and/or usage provides guidance regarding reasonable inferences to be drawn; however, likewise, the term “in this context” in general without further qualification refers at least to the context of the present patent application.
  • a user of a firearm such as a shotgun, rifle, handgun, or the like, may fire one or more projectiles (e.g., lead shot, bullet, buck shot, etc.) in the direction of a target in flight in an effort to hit the target with the one or more projectiles. Failure to hit the target in flight may lead to frustration in the user.
  • example embodiments described herein may include a targeting aid system, device, and/or method that may process signals and/or signal packets obtained from at least one sensor (e.g., image sensor) to provide feedback to a user in real-time and/or in near real-time during shooting activities (e.g., trap shooting, skeet shooting, fowl hunting, etc.).
  • Feedback and/or analysis, including a visual display, may also be provided to a user post-event based at least in part on sensor content obtained during shooting activities.
  • “Real-time,” “near real-time” and/or the like in this context refers to providing feedback in a timely enough fashion to allow a user to act upon the feedback during a current shooting activity (e.g., while tracking and/or engaging a target). More generally, “real-time” and/or “near real-time” refers to the approximate actual time during which a process takes place and/or an event or activity occurs. “Post-event” and/or the like in this context refers to providing feedback and/or analysis of one or more shooting activities after the one or more shooting activities have completed.
  • Various example embodiments and/or implementations are described below. Although particular example embodiments and/or implementations are described herein, subject matter is not limited in scope in these respects.
  • a targeting aid system may provide training assistance so as to provide feedback to a user which may increase the user’s capability to hit a target in flight (e.g., a clay pigeon, bird, etc.) with one or more projectiles (e.g., lead shot) ejected from a firearm (e.g., shotgun).
  • a targeting aid device may be mounted to a barrel of a firearm.
  • Such a targeting aid device may not merely comprise an image sensor (e.g., camera) but rather may also perform calculations and/or otherwise process sensor content, including image (e.g., video) content, for example, to provide feedback (e.g., real-time, near real-time and/or post-event) to a user of the firearm.
  • a targeting aid system may, for example, facilitate feedback, such as audio cues, visual cues, haptic cues, and so forth, which may indicate to a user adjustments that may be made with respect to firearm positioning and/or movement and/or which may indicate to the user an appropriate time at which the user may articulate a triggering mechanism so as to fire one or more projectiles towards the target in flight.
  • a targeting aid system may determine whether a barrel portion of a firearm should be reoriented in an upward or downward direction (e.g., along a pitch axis), whether the barrel portion should be reoriented in a side-to-side direction (e.g., along a yaw axis), and/or whether the barrel portion should be reoriented about a roll axis. Also, in particular implementations, audible, visual and/or haptic feedback cues may be provided to the user to indicate a particular determined reorientation.
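  • By way of a minimal illustrative sketch (not taken from the disclosure; the function names, deadband threshold and cue identifiers below are assumptions), such a pitch/yaw reorientation determination might be mapped to feedback cues roughly as follows:

```python
# Minimal sketch: map an aim error (degrees) to directional feedback cues.
# Thresholds, cue names and the dataclass are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class AimError:
    pitch_deg: float  # positive means the barrel should move up
    yaw_deg: float    # positive means the barrel should move right

def reorientation_cues(err: AimError, deadband_deg: float = 0.5) -> list[str]:
    """Return zero or more cue identifiers for the user interface layer."""
    cues = []
    if err.pitch_deg > deadband_deg:
        cues.append("cue_raise_barrel")
    elif err.pitch_deg < -deadband_deg:
        cues.append("cue_lower_barrel")
    if err.yaw_deg > deadband_deg:
        cues.append("cue_swing_right")
    elif err.yaw_deg < -deadband_deg:
        cues.append("cue_swing_left")
    return cues

if __name__ == "__main__":
    print(reorientation_cues(AimError(pitch_deg=1.8, yaw_deg=-0.2)))
    # ['cue_raise_barrel']  (the yaw error is inside the deadband)
```

  • In such a sketch, the deadband simply keeps the device from issuing cues for negligible aim errors; an actual implementation could tune this per user and per feedback channel.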
  • a targeting aid system may provide post-event processing to allow the user and/or other individual to review, analyze, and/or evaluate a particular event which may, for example, permit the user to improve the user’s shooting technique. Performance improvements gained in this manner may allow a user to hit targets in flight (e.g., launched clay targets, fowl, etc.) with increasing probability.
  • feedback may be actionable (e.g., by a user) and/or non-actionable.
  • actionable feedback may include indications of adjustments that may be made with respect to firearm positioning and/or movement and/or of an appropriate time at which the user may articulate a triggering mechanism.
  • Non-actionable feedback may include feedback that may not directly contribute to a user’s shooting performance and/or that may be merely informational, in an implementation.
  • non-actionable feedback may include tracking performance over time, sharing information with one’s teammates or coach, and/or sharing information through social media, to name a few non-limiting examples.
  • FIG. 1A depicts a diagram 100 illustrating potential use of an embodiment 300 of an example targeting aid device.
  • targeting aid device 300 may be mounted to a barrel of a firearm, such as firearm 107.
  • Firearm 107 may be operated by a user, such as user 105.
  • a launcher 110 may launch one or more targets 150, such as a clay pigeon or other expendable target, into the field of view of user 105.
  • launcher 110 may launch one or more targets 150 responsive to user 105 uttering and/or otherwise indicating a “pull” command and/or the like.
  • launcher 110 may launch one or more targets 150 responsive to an electronic actuator under control of user 105.
  • launcher 110 may be actuated, for example, by way of user 105 actuating a foot pedal.
  • user 105 may align the barrel of firearm 107 so as to track the motion of launched target 150.
  • user 105 may initiate tracking of target 150 in flight at a “pickup point” 115.
  • Pickup point refers to a point at which a user initially visually acquires a target in flight.
  • user 105 may continue tracking of target 150 through a “hold point” 120.
  • Hold point refers to a point at which user 105 initially visually inserts the barrel(s) of a firearm into the path of a target in flight.
  • “Break point” refers to a point along a trajectory of a target at which a user may activate a trigger mechanism of a firearm to hit the target with one or more projectiles to be discharged from the firearm. “Break point” may also sometimes be referred to as a “kill point” and/or “shoot point.” Additionally, “aim point” refers to wherever the barrel of a firearm is pointing at any given point in time. At break point 125, user 105 may actuate a trigger mechanism of firearm 107 which may effect a discharge of one or more projectiles 130 from the barrel of firearm 107.
  • one or more high-speed projectiles 130 may coincide with launched target 150. Responsive to one or more high-speed projectiles 130 coinciding with launched target 150, target 150 may disintegrate or fragment, for example.
  • aiming action and/or the like refers to a movement and/or repositioning of a firearm that is meant to change where a projectile or projectile cloud would fly and/or hit if a trigger of the firearm was to be pulled.
  • handling action and/or the like in this context refers to an action or movement that modifies the state of a firearm.
  • a handling action may include an action break, reload, pick up from a resting position, lay down to a resting position, trigger pull, etc., to name a few non-limiting examples.
  • targeting aid device 300 may be mounted to a barrel of firearm 107, for example. As discussed more fully below, in implementations, targeting aid device 300 may record video events via an integrated image sensor. In particular implementations, targeting aid device 300 may transmit wireless signals and/or signal packets representative of image content to one or more external devices such as a smartphone, tablet device and/or computing device. See, for example, external device 250 of FIG. 2. In addition to transmitting wireless signals and/or signal packets representative of image content, targeting aid device 300 may record movements of the barrel(s) of firearm 107. In particular implementations, targeting aid device 300 may record movements in a coordinate plane, such as in a pitch axis 109 as depicted in diagram 100.
  • targeting aid device 300 may record movements in additional axes, such as a yaw axis and/or a roll axis.
  • sensor types, as well as signals and/or signal packets that may be obtained from various sensor types, are discussed more fully below.
  • Various example processes that may be performed on sensor signals and/or signal packets are also described more fully below.
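  • As one hedged illustration of how a recorded barrel-movement sample might be represented as a compact signal packet for wireless transmission to an external device (the field layout and binary record format below are assumptions, not a disclosed protocol):

```python
# Minimal sketch: pack a timestamped barrel-attitude sample into a small binary
# record suitable for a wireless signal packet. The layout (uint64 timestamp in
# nanoseconds plus float32 pitch/yaw/roll in degrees) is an illustrative assumption.
import struct

RECORD_FMT = "<Qfff"   # little-endian: uint64 timestamp_ns, float32 pitch, yaw, roll

def pack_attitude(t_ns: int, pitch: float, yaw: float, roll: float) -> bytes:
    return struct.pack(RECORD_FMT, t_ns, pitch, yaw, roll)

def unpack_attitude(packet: bytes):
    return struct.unpack(RECORD_FMT, packet)

pkt = pack_attitude(1_250_000_000, pitch=4.2, yaw=-1.3, roll=0.1)
print(len(pkt), unpack_attitude(pkt))
# 20 bytes; float32 values come back as close approximations of 4.2, -1.3, 0.1
```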
  • FIG. 1B depicts a diagram 101 illustrating potential use of targeting aid device 300 in a hunting context. Similar in many respects to the example depicted in FIG. 1A, targeting aid device 300 may be utilized to train users with respect to improving shooting techniques in the context of hunting wildlife, including, but not limited to, birds.
  • user 105 may respond to a target 160 (e.g., duck, pheasant, etc.) entering the user’s field of view.
  • User 105 may initiate tracking of target 160 at pickup point 115 and may continue tracking of target 160 through hold point 120.
  • user 105 may actuate the trigger mechanism of firearm 107, resulting in one or more projectiles 130 coinciding with target 160.
  • a targeting aid system, such as targeting aid device 300, may facilitate feedback, such as audio cues, visual cues, haptic cues, and so forth, which may indicate to a user, such as user 105, adjustments that may be made with respect to positioning and/or movement of a firearm, such as firearm 107, and/or which may indicate to the user an appropriate time at which the user may articulate a triggering mechanism of the firearm, as discussed more fully below.
  • FIG. 2 is a schematic diagram illustrating an embodiment 200 of an example communications infrastructure including targeting aid device 300.
  • targeting aid device 300 may facilitate wireless communications with external systems and/or devices.
  • such wireless communication may include real-time or near-real-time transmission of wireless signals and/or signal packets representative of live-action video content.
  • Video content may also be made available for post-event review and/or analysis, for example.
  • targeting aid device 300 may include a wireless communications interface that may operate via a communications infrastructure which may facilitate, at least in part, a capability for a user and/or other individuals to review, analyze, and/or evaluate the user’s shooting performance and/or technique at a remote location and/or at a later time.
  • targeting aid device 300 may transmit and/or receive wireless signals and/or signal packets, such as via a wireless communications network.
  • targeting aid device 300 may communicate with an external device, such as a smartphone, tablet device and/or computing device 250, via a point-to-point WiFi interconnect.
  • targeting aid device 300 may comprise an access point and external device 250 may comprise a wireless-capable device.
  • targeting aid device 300 and external device 250 may communicate via a wireless local-area network (WLAN).
  • targeting aid device 300 and external device 250 may connect to an access point, such as local transceiver 215.
  • targeting aid device 300 may communicate with a cellular communications network by transmitting wireless signals to and/or receiving wireless signals from one or more cellular transceivers 210 which may comprise a wireless base transceiver subsystem, a Node B and/or an evolved NodeB (eNodeB), for example, over wireless communication link 223.
  • targeting aid device 300 may transmit wireless signals to and/or may receive wireless signals from one or more local transceivers 215 over wireless communication link 225.
  • Local transceiver 215 may comprise an access point (AP), femtocell, Home Base Station, small cell base station, Home Node B (HNB) or Home eNodeB (HeNB), for example, and/or may provide access to a WLAN (e.g., IEEE 802.11 network), for example.
  • targeting aid device 300 may communicate with one or more external devices, such as external device 250.
  • targeting aid device 300 may communicate with one or more external devices such as, for example, smartphone, tablet device and/or computing device 250 via a wireless personal area network (WPAN, e.g., Bluetooth® network), WLAN, and/or a cellular network (e.g. an LTE network or other wireless wide area network, such as those discussed herein), for example.
  • cellular transceiver 210, local transceiver 215 and/or satellite 214 may represent touchpoints which may permit targeting aid device 300 to interact with a network 222.
  • Examples of network technologies that may support wireless communication link 223 may include GSM, Code Division Multiple Access (CDMA), Wideband CDMA (WCDMA), Long Term Evolution (LTE), High Rate Packet Data (HRPD), etc.
  • GSM, WCDMA and/or LTE may comprise technologies defined by the 3rd Generation Partnership Project (3GPP).
  • CDMA and/or HRPD may comprise technologies defined by the 3rd Generation Partnership Project 2 (3GPP2).
  • WCDMA may also be part of the Universal Mobile Telecommunications System (UMTS) and/or may be supported by an HNB.
  • Cellular transceivers 210 may comprise deployments of equipment providing subscriber access to a wireless telecommunication network for a service (e.g., under a service contract), for example.
  • cellular transceivers 210 may perform functions of a cellular base station in servicing subscriber devices within a cell determined based, at least in part, on a range at which the cellular transceivers 210 are capable of providing access service.
  • Examples of radio technologies that may support wireless communication link 225 may include IEEE 802.11, BT, LTE, etc.
  • cellular transceivers 210 and/or local transceivers 215 may communicate with one or more external devices such as, for example, server 240, for example by way of network 222 via communication links 245.
  • network 222 may comprise any combination of wired or wireless links and may include cellular transceiver 210 and/or local transceiver 215 and/or server 240, for example.
  • network 222 may comprise Internet Protocol (IP) and/or other infrastructure capable of facilitating communication between targeting aid device 300 and server 240 through local transceiver 215 or cellular transceiver 210.
  • network 222 may comprise a cellular communication network infrastructure such as, for example, a base station controller or packet-based or circuit-based switching center (not shown) to facilitate mobile cellular communication with targeting aid device 300.
  • network 222 may comprise local area network (LAN) elements such as WiFi APs, routers and/or bridges and/or may, in some implementations, comprise links to gateway elements that may provide access to wide area networks such as the Internet.
  • network 222 may comprise multiple networks (e.g., one or more wireless networks and/or the Internet).
  • network 222 may include one or more serving gateways and/or Packet Data Network gateways.
  • one or more servers 240 may comprise an E-SMLC, a Secure User Plane Location (SUPL) Location Platform (SLP), a SUPL Location Center (SLC), a SUPL Positioning Center (SPC), a Position Determining Entity (PDE) and/or a gateway mobile location center (GMLC), each of which may connect to one or more location retrieval functions (LRFs) and/or mobility management entities (MMEs) of network 240, for example.
  • Wireless signals and/or signal packets may be modulated to convey messages utilizing one or more techniques such as amplitude modulation, frequency modulation, binary phase shift keying (BPSK), quaternary phase shift keying (QPSK) and/or any of numerous other modulation techniques, and subject matter is not limited in scope in this respect.
  • targeting aid device 300 may comprise circuitry and/or processing resources capable of obtaining location related measurements (e.g. for signals received from GPS or other Satellite Positioning System (SPS) satellites 214), cellular transceiver 210 and/or local transceiver 215 and/or possibly computing a position fix or estimated location of targeting aid device 300 based at least in part on these location related measurements.
  • location related measurements obtained by targeting aid device 300 may be transferred to a location server such as an enhanced serving mobile location center (E-SMLC) or SUPL location platform (SLP) (e.g. which may comprise a server, such as server 240) after which the location server may estimate or determine an estimated location for targeting aid device 300 based at least in part on the measurements.
  • location related measurements obtained by targeting aid device 300 may include measurements of signals 224 received from satellites belonging to an SPS or Global Navigation Satellite System (GNSS) such as GPS, GLONASS, Galileo or Beidou and/or may include measurements of signals (such as 223 and/or 225) received from terrestrial transmitters fixed at known locations (e.g., such as cellular transceiver 210).
  • Targeting aid device 300 and/or a separate location server may obtain a location estimate for targeting aid device 300 based at least in part on location related measurements using, for example, GNSS, Assisted GNSS (A-GNSS), Advanced Forward Link Trilateration (AFLT), Observed Time Difference Of Arrival (OTDOA) and/or Enhanced Cell ID (E-CID), or combinations thereof.
  • targeting aid device 300 may comprise an Internet-of-Things (IoT) type device.
  • IoT-type device and/or the like refers to one or more electronic and/or computing devices capable of leveraging existing Internet and/or like infrastructure as part of the so-called “Internet of Things” or IoT, such as via a variety of applicable protocols, domains, applications, etc.
  • the IoT may typically comprise a system of interconnected and/or internetworked physical devices in which computing may be embedded into hardware so as to facilitate and/or support devices’ ability to acquire, collect, and/or communicate content over one or more communications networks, for example, at times, without human participation and/or interaction.
  • FIG. 3 is a schematic block diagram depicting an example implementation of targeting aid device 300.
  • communications interface 320 may enable wireless communications between targeting aid device 300 and one or more other external systems and/or devices.
  • communications interface 320 may facilitate, at least in part, wireless communications between targeting aid device 300 and a smartphone, tablet device, laptop computing device, etc.
  • wireless communications may occur substantially in accordance with any of a wide range of communication standards and/or protocols, such as those mentioned herein, for example.
  • Targeting aid device 300 may also include one or more batteries, such as battery 370.
  • targeting aid device 300 may include a memory, such as memory 330.
  • memory 330 may comprise a non-volatile memory, for example.
  • memory 330 may have stored therein executable instructions, such as for one or more operating systems, communications protocols, and/or applications, for example.
  • Memory 330 may further store particular instructions, such as software and/or firmware code 332, for example.
  • software and/or firmware code 332 may be updated via wired and/or wireless communication with one or more external devices and/or systems, for example.
  • targeting aid device 300 may include an image sensor (e.g., camera) 350 and/or may further include one or more other sensors 360.
  • sensor and/or the like refers to a device and/or component that may respond to physical stimulus, such as, for example, heat, light, sound pressure, magnetism, particular motions, etc., and/or that may generate one or more signals and/or signal packets in response to physical stimulus.
  • Example sensors may include, but are not limited to, accelerometers, gyroscopes, thermometers, magnetometers, barometers, light sensors, proximity sensors, heart-rate monitors, perspiration sensors, hydration sensors, breath sensors and/or other biometric sensors, microphones, etc., and/or any combination thereof.
  • Targeting aid device 300 may comprise an inertial motion unit (IMU).
  • an IMU may comprise one or more sensors such as, for example, one or more accelerometers, gyroscopes, magnetometers, etc.
  • targeting aid device 300 may comprise a user interface, such as user interface 340.
  • user interface 340 may comprise a video and/or graphics display, although other implementations may not include a video and/or graphics display.
  • User interface 340 may also include a touch-screen to receive user inputs in some implementations.
  • user interface 340 may include one or more push-buttons and/or switches and/or the like by which a user, such as user 105, may provide inputs to initiate and/or exit various device states and/or modes of operations, for example.
  • User inputs may further include parameters indicative of one or more characteristics of ammunition being used and/or indicative of one or more characteristics of the particular firearm being used.
  • a user may provide inputs via an external device, such as smartphone, tablet device and/or computing device 250, for example.
  • User inputs may also be provided by way of voice commands, in an implementation.
  • user interface 340 may include one or more user feedback mechanisms to provide at least haptic, audible and/or visual feedback to a user, such as user 105.
  • real-time, near real-time and/or post-event user feedback may be provided to user 105 to improve the user’s shooting technique.
  • feedback provided to user 105 may be directed to prompting the user to adjust the user’s technique to be more in line with specified and/or common shooting practices.
  • user feedback may be provided to prompt the user to make adjustments with respect to gun mount technique, swing and/or target connection quality, shooting method and/or timing of shot routine stages and/or position of particular shot stages (e.g., hold point, break point, etc.) to name a few non-limiting examples.
  • feedback may be provided to user 105 via one or more hardware interfaces including, for example, a speaker implemented as part of targeting aid device 300 and/or implemented on an external device (e.g., smartphone, tablet device, laptop computing device, etc.), a wired and/or wireless headset (e.g., Bluetooth-enabled earphone(s)) to be worn by user 105, a graphics and/or video screen implemented as part of targeting aid device 300 and/or implemented as part of an external system or device, a haptic motor implemented as part of targeting aid device 300, etc.
  • feedback may be provided to a user by way of video and/or still images displayed on targeting aid device 300 and/or on an external device, such as smartphone, tablet device and/or computing device 250.
  • for example, trajectories of a target and/or of a projectile or projectile cloud (e.g., lead shot) may be depicted in such video and/or still images.
  • user interface 340 may include one or more indicator lights (e.g., light emitting diodes). See, for example, indicator lights 430 depicted in FIG. 4B.
  • targeting aid device 300 may further include one or more timers and/or counters and/or like circuits, for example.
  • one or more timers and/or counters and/or the like may track one or more aspects of device performance and/or operation.
  • processor 310 may synchronize image sensor content and/or other sensor content, such as IMU content.
  • processor 310 may timestamp image sensor content and/or other sensor content, such as IMU content, based at least in part on one or more timers and/or counters.
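  • A minimal sketch of such timestamping and synchronization, assuming a single monotonic device clock and a nearest-timestamp pairing rule (both assumptions made only for illustration):

```python
# Minimal sketch: timestamp heterogeneous sensor samples against one monotonic
# clock so image frames and IMU readings can later be aligned.
# The class/function names and the pairing rule are illustrative assumptions.
import time
from collections import deque

class SensorTimestamper:
    def __init__(self):
        self._t0 = time.monotonic_ns()

    def stamp(self, payload):
        """Attach a device-relative timestamp (nanoseconds) to a sample."""
        return {"t_ns": time.monotonic_ns() - self._t0, "payload": payload}

def nearest_imu_sample(frame, imu_queue: deque):
    """Pick the buffered IMU sample closest in time to a video frame."""
    return min(imu_queue, key=lambda s: abs(s["t_ns"] - frame["t_ns"]))

ts = SensorTimestamper()
imu_queue = deque(ts.stamp({"gyro": (0.0, 0.1, 0.0)}) for _ in range(5))
frame = ts.stamp({"frame_id": 42})
print(nearest_imu_sample(frame, imu_queue)["t_ns"] <= frame["t_ns"])  # True
```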
  • although FIG. 3 depicts a particular example implementation of a targeting aid device, such as targeting aid device 300, including an image sensor, a processor, a memory and/or a user interface, other implementations may partition components and/or operations differently.
  • a targeting aid device may include an image sensor to obtain image content and a communications interface to transmit the image content to an external device such as, for example, a smartphone, a tablet device and/or a laptop computing device and/or the like.
  • Image content and/or other sensor content may be processed by an external device to determine what feedback to provide to the user and one or more feedback parameters may be transmitted from the external device to the targeting aid device.
  • various operations related to gathering image and/or other sensor content, processing image and/or other sensor content, determining what feedback to provide to a user, and/or providing feedback to the user may be partitioned between a targeting aid device and one or more external devices in a variety of ways.
  • feedback parameter refers to a physical signal, signal packet and/or physical state specifying at least one characteristic of particular feedback to be provided to a user.
  • a feedback parameter determined within a targeting aid device and/or determined by an external device may be communicated to the targeting aid device and the targeting aid device may, responsive to the received feedback parameter, provide the specified feedback to the user.
  • a feedback parameter may comprise one or more signals and/or signal packets specifying a particular audible signal to be provided to a user to prompt the user to move the user’s aim in a particular direction.
  • a feedback parameter may comprise one or more signals and/or signal packets representative of an image to be displayed to the user, wherein the image may depict an actual aim point vs a desired aim point at a particular point in time, for example.
  • a feedback parameter may comprise one or more signals and/or signal packets representative of at least one video frame to be displayed to the user, wherein the at least one video frame may include one or more overlays depicting an actual aim trajectory vs a desired aim trajectory.
  • these are merely examples of feedback parameters, and subject matter is not limited in scope in these respects.
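  • As a hedged sketch (the field names and the JSON encoding are illustrative assumptions, not a disclosed format), a feedback parameter exchanged between an external device and a targeting aid device might be represented as a small signal packet along these lines:

```python
# Minimal sketch: one way a feedback parameter could be encoded as a small
# signal packet sent between an external device and the targeting aid device.
# Field names and the JSON encoding are illustrative assumptions.
import json
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class FeedbackParameter:
    kind: str                        # "audible", "visual" or "haptic"
    cue: str                         # e.g. "cue_raise_barrel"
    intensity: float = 1.0           # 0.0 .. 1.0
    frame_ref: Optional[int] = None  # video frame to overlay, if visual

def encode(param: FeedbackParameter) -> bytes:
    return json.dumps(asdict(param)).encode("utf-8")

def decode(packet: bytes) -> FeedbackParameter:
    return FeedbackParameter(**json.loads(packet.decode("utf-8")))

pkt = encode(FeedbackParameter(kind="haptic", cue="cue_swing_left", intensity=0.6))
print(decode(pkt))
```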
  • FIG. 4A is an illustration depicting a perspective view of an example embodiment of a targeting aid device, such as targeting aid device 300.
  • FIG. 4B depicts a rear face (i.e., user’s perspective) view.
  • targeting aid device 300 may be mounted to a barrel of a firearm, such as firearm 107.
  • targeting aid device 300 may comprise a barrel mount 410 that may allow targeting aid device 300 to be fixedly and/or removably mounted to a barrel of a firearm 107.
  • a housing 420 may contain image sensor 350 and/or communications interface 320. Housing 420 may also contain processor 310, memory 330, user interface 340 and/or sensors 360 in some implementations, for example.
  • housing 420 may be slidably attachable to barrel mount 410.
  • targeting aid device 300 may include one or more indicator lights, such as indicator lights 430.
  • indicator lights 430 may comprise light-emitting diodes (LED) that may be positioned on a user-facing surface of housing 420, for example.
  • Indicator lights 430 may provide feedback and/or other information to a user such as battery level and/or modes of operation (e.g., sleep, standby, record, view, etc.), for example.
  • indicator lights 430 may be placed at the top left portion of housing 420 so that they may be visible when the user is shooting.
  • indicator lights 430 may be placed at the top right portion of housing 420, for example.
  • FIG. 5 is a flow diagram depicting an embodiment 500 of a process performed at a targeting aid system and/or device, such as targeting aid device 300, for providing feedback to a user based at least in part on signals and/or signal packets obtained from at least one sensor of the targeting aid system and/or device.
  • Embodiments in accordance with claimed subject matter may include all of blocks 510-540, fewer than blocks 510-540, and/or more than blocks 510-540.
  • content acquired or produced, such as, for example, input signals, output signals, operations, results, etc. associated with example process 500 may be represented via one or more analog and/or digital signals and/or signal packets.
  • signals and/or signal packets may be obtained from at least one sensor of a targeting aid system and/or device, such as targeting aid device 300, as depicted at block 510.
  • sensors of targeting aid device 300 may include one or more image sensors, accelerometers, gyroscopes, thermometers, magnetometers, barometers, light sensors, proximity sensors, biometric sensors, microphones, IMU, etc., and/or any combination thereof.
  • a direction and speed of a target may be determined, in an implementation.
  • a target may comprise a launched target, such as a clay pigeon, for example.
  • a target may also comprise a bird, for example.
  • processor 310 of targeting aid device 300 may perform operations to determine the direction and speed of a target. In another implementation, such operations may be performed by an external device, such as smartphone, tablet device and/or computing device 250, for example.
  • At least one feedback parameter may be determined, as depicted at block 530.
  • the at least one user action relative to the direction and speed of the target may include barrel movement and/or trigger activation, for example.
  • Operations to determine the at least one feedback parameter may be performed by processor 310 of targeting aid device 300, in an implementation.
  • operations to determine the at least one feedback parameter may be performed by an external device, such as a smartphone, tablet device and/or computing device 250, for example.
  • haptic, audible and/or visual feedback may be provided to a user, such as user 105, at least in part in accordance with the at least one feedback parameter, for example.
  • feedback may be provided to user 105 via one or more hardware interfaces including, for example, a speaker implemented as part of targeting aid device 300 and/or implemented on an external device, a wired and/or wireless headset to be worn by the user, a graphics and/or video screen implemented as part of targeting aid device 300 and/or implemented as part of an external system or device, a haptic motor implemented as part of targeting aid device 300, etc.
  • these are merely examples of the types of feedback mechanisms that may be implemented and subject matter is not limited in scope in these respects.
  • feedback provided to user 105 may include real-time, near real-time and/or post-event feedback.
  • Feedback may be directed to improving the user’s shooting technique.
  • user feedback may be provided to prompt the user to make adjustments with respect to gun mount technique, swing and/or target connection quality, shooting method and/or timing of shot routine stages and/or position of particular shot stages (e.g., hold point, break point, etc.) to name a few non-limiting examples.
  • real-time and/or near real-time feedback may indicate to the user whether a barrel of a firearm should be reoriented in an upward or downward direction (e.g., along a pitch axis), whether the barrel portion should be reoriented in a side-to-side direction (e.g., along a yaw axis), and/or whether the barrel portion should be reoriented about a roll axis, for example.
  • a targeting aid system may facilitate post-event processing to allow a user and/or other individual to review, analyze, and/or evaluate a particular event which may, for example, permit the user to improve the user’s technique.
  • a targeting aid system may comprise a targeting aid device, such as targeting aid device 300.
  • a targeting aid system may include a targeting aid device, such as targeting aid device 300, and may also include one or more external devices, such as smartphone, tablet device and/or computing device 250.
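  • A minimal sketch of how blocks 510-540 of example process 500 might fit together is shown below; every function is a stub with an assumed name, and, as noted above, the actual partitioning between the targeting aid device and any external device may vary:

```python
# Minimal sketch of the four blocks of example process 500. The sensor read,
# target-state estimate, feedback decision and feedback delivery are stubbed;
# all function names and the simple yaw comparison are illustrative assumptions.
def obtain_sensor_content():           # block 510: signals from image sensor, IMU, etc.
    return {"frame": ..., "imu": ...}

def estimate_target_state(content):    # block 520: direction and speed of the target
    return {"direction_deg": 35.0, "speed_mps": 20.0}

def determine_feedback(target_state, barrel_state):  # block 530: feedback parameter
    if target_state["direction_deg"] > barrel_state["yaw_deg"]:
        return {"kind": "audible", "cue": "cue_swing_right"}
    return None

def deliver_feedback(param):            # block 540: haptic, audible and/or visual cue
    if param is not None:
        print(f"feedback -> {param['kind']}: {param['cue']}")

barrel_state = {"yaw_deg": 30.0}
content = obtain_sensor_content()
deliver_feedback(determine_feedback(estimate_target_state(content), barrel_state))
```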
  • FIG. 6 is a block diagram depicting an embodiment 600 of an example processing pipeline of an example targeting aid system.
  • a targeting aid system may comprise a targeting aid device, such as targeting aid device 300, or may include a targeting aid device, such as targeting aid device 300, and one or more external devices, such as smartphone, tablet device and/or computing device 250.
  • the various circuits, operations, blocks, etc. depicted in example 600 may be partitioned in various ways between a targeting aid device and one or more external devices.
  • embodiments in accordance with claimed subject matter may include all of the circuits, operations and/or blocks of example pipeline 600, may include fewer than the circuits, operations and/or blocks of example pipeline 600, and/or include more than the circuits, operations and/or blocks of example pipeline 600.
  • content acquired or produced such as, for example, input signals, output signals, operations, results, etc. associated with example embodiment 600 may be represented via one or more analog and/or digital signals and/or signal packets. It should also be appreciated that even though one or more operations are illustrated or described concurrently or with respect to a certain sequence, other sequences or concurrent operations may be employed.
  • implementations may be directed at determining: 1) where a shooter was aiming and what the shooter was doing at various and/or key points in time; 2) where a target was flying (e.g., speed and direction of target); 3) what technique elements the shooter should follow (e.g., according at least in part to techniques taught by experienced and/or knowledgeable instructors, and/or as described in handbooks from various shooting organizations) and/or how the shooter’s actions differed from the specified technique elements; and/or 4) what actionable feedback to provide to the shooter to help the shooter more closely align with the specified technique elements.
  • a targeting aid system may comprise an image sensor 602, an audio encoder/decoder (audio codec) 604 and/or one or more other sensors 606.
  • image sensor 602, audio codec 604 and other sensors 606 may provide signals and/or signal packets to buffers 610, 612 and 614, respectively.
  • Image sensor buffer 610 may provide stored image sensor content to a video encoder 630, for example.
  • audio codec buffer 612 may provide audio content (e.g., obtained by audio codec 604 from a microphone) to an audio encoder 632.
  • Other sensor content may be provided by buffer 614 to a content serializer 634, for example.
  • a multiplexer 636 may provide signals and/or signal packets from video encoder 630, audio encoder 632 and/or content serializer 634 to a writer block 638.
  • writer block 638 may comprise circuitry to write signals and/or signal packets obtained from video encoder 630, audio encoder 632 and/or content serializer 634 to a storage device, such as a secure digital (SD) card and/or a solid state drive (SSD), for example.
  • an analytics queue 620 may store image content from buffer 610, audio content from buffer 612 and/or other sensor content from buffer 614. Additionally, analytics queue 620 may provide image content, audio content and/or other sensor content to a pre-processing block 640.
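  • The buffering side of example pipeline 600 might be sketched as follows; the queue sizes, record formats and single-threaded drain loop are illustrative assumptions rather than the disclosed implementation:

```python
# Minimal sketch of the buffering side of pipeline 600: sensor producers push
# records into per-sensor buffers feeding the encoder/multiplexer/writer path,
# and the same records are mirrored into an analytics queue for pre-processing.
# Queue sizes and record fields are illustrative assumptions.
import queue

image_buf, audio_buf, sensor_buf = queue.Queue(64), queue.Queue(64), queue.Queue(256)
analytics_queue = queue.Queue(512)

def push(buf: queue.Queue, record: dict) -> None:
    buf.put(record)                 # toward encoder -> multiplexer -> writer
    analytics_queue.put(record)     # toward pre-processing / fusion

def multiplex_once() -> list:
    """Drain whatever is currently buffered into one time-ordered batch."""
    batch = []
    for buf in (image_buf, audio_buf, sensor_buf):
        while not buf.empty():
            batch.append(buf.get_nowait())
    return sorted(batch, key=lambda r: r["t_ns"])

push(image_buf, {"t_ns": 2, "type": "frame"})
push(sensor_buf, {"t_ns": 1, "type": "imu"})
print([r["type"] for r in multiplex_once()])   # ['imu', 'frame']
```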
  • Pre-processing and/or the like in this context may refer to analysis of sensor content to translate raw sensor content into content having semantic meaning.
  • IMU content (e.g., signals and/or signal packets representative of readings from an accelerometer, gyroscope, magnetometer, etc.) may be pre-processed to determine barrel rotation, horizon leveling and/or barrel pointing up/down.
  • IMU content may also be pre-processed to detect an action break (e.g., to reload), for example.
  • image sensor content (e.g., video content) may be pre-processed to detect moving objects. For example, a static background may be subtracted from image content to reveal and/or emphasize a moving object (e.g., target in flight).
  • pre-processing of image content may include classification and/or filtering of objects to detect a target and/or to detect a projectile or projectile cloud. Example pre-processing operations are discussed more fully below.
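  • One possible (assumed) realization of the background-subtraction pre-processing step uses a stock background subtractor, such as OpenCV's MOG2; the parameters and the blob-area filter below are illustrative only and are not taken from the disclosure:

```python
# Minimal sketch of "subtract a static background to reveal a moving target",
# using OpenCV's MOG2 background subtractor as one possible implementation.
import cv2
import numpy as np

subtractor = cv2.createBackgroundSubtractorMOG2(history=120, detectShadows=False)

def detect_moving_objects(frame: np.ndarray, min_area: int = 30):
    """Return bounding boxes of foreground blobs large enough to be a target."""
    mask = subtractor.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]

# Synthetic example: a bright "clay" moving across an otherwise static scene.
for x in range(0, 200, 20):
    scene = np.zeros((240, 320, 3), dtype=np.uint8)
    cv2.circle(scene, (x + 20, 120), 6, (255, 255, 255), -1)
    boxes = detect_moving_objects(scene)
print(boxes)   # bounding box(es) found on the final frame
```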
  • Example pipeline 600 may also include fusion blocks 650.
  • Fusion in this context may refer to analysis and/or processing of the output of one or more preprocessing blocks to derive higher-order semantic meanings.
  • target speed and/or direction, weather parameters and/or user-provided content regarding firearm and/or projectile ballistics may be “fused” to calculate a projected trajectory of a target (e.g., clay pigeon, skeet, bird, etc.).
  • barrel position, weather parameters and/or user-provided content regarding firearm and/or projectile ballistics may be fused to calculate a projected trajectory of a projectile and/or projectile cloud (e.g., lead shot).
  • a shooter’s technique may be improved via adherence to particular specified principles at particular points during a target’s trajectory.
  • the pickup point, hold point and/or break point may be key aspects of a shooting period.
  • These points may be calculated by fusing barrel positioning content (e.g., derived from IMU signals and/or signal packets) with target trajectory and/or projectile or projectile cloud trajectory. Example fusion operations are discussed more fully below.
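  • As a hedged example of one such fusion computation, a projected meeting ("break") point could be solved for by stepping a simple constant-velocity target model against a nominal shot speed; drag, wind and shot-string spread are ignored, and all names and thresholds are assumptions:

```python
# Minimal sketch: find the time of flight at which a projectile cloud fired now
# could meet the target, given an estimated target state and a nominal shot
# speed. The crude target model and 0.5 m tolerance are illustrative assumptions.
import numpy as np

def predicted_break_point(target_pos, target_vel, shot_speed, g=9.81, dt=0.001):
    """Return (time_s, point) where a shot fired now would meet the target,
    or None if no meeting is found within one second."""
    target_pos = np.asarray(target_pos, float)
    target_vel = np.asarray(target_vel, float)
    for t in np.arange(dt, 1.0, dt):
        future = target_pos + target_vel * t
        future[2] -= 0.5 * g * t**2                 # crude drop of the target
        if abs(np.linalg.norm(future) - shot_speed * t) < 0.5:
            return t, future
    return None

hit = predicted_break_point(target_pos=(30.0, 5.0, 8.0),
                            target_vel=(-10.0, 0.0, 2.0),
                            shot_speed=400.0)
print(hit)   # meeting time of roughly 0.08 s and the corresponding point
```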
  • fusion blocks 650 may provide content to a user feedback generator 660 and/or to a content serializer 662 (e.g., for storage in an SD/SSD), for example.
  • a storage medium such as memory 330, SD card and/or SSD, for example, may receive content from a processor, such as processor 310, from writer block 638 and/or from content serializer block 662.
  • such content may comprise a merged stream of video content and a text track and/or may comprise a video file and a separate metadata text file, for example.
  • video and/or text content may be timestamped.
  • Example pipeline 600 also depicts an optional streaming option whereby image content may be provided in real-time or near real-time to a live streaming application that may be executed on an external device, such as a smartphone and/or tablet device, for example.
  • FIG. 7 is a flow diagram depicting an embodiment 700 of an example process logically partitioned into a plurality of layers, wherein operations associated with the various layers may be performed at a targeting aid system and/or device, such as targeting aid device 300, and/or at a combination of a targeting aid device and one or more external devices, such as smartphone, tablet device and/or computing device 250.
  • Embodiments in accordance with claimed subject matter may include all of blocks 710-770, fewer than blocks 710-770, and/or more than blocks 710-770.
  • content acquired or produced such as, for example, input signals, output signals, operations, results, etc. associated with example embodiment 700 may be represented via one or more analog and/or digital signals and/or signal packets.
  • It should also be appreciated that even though one or more operations are illustrated or described concurrently or with respect to a certain sequence, other sequences or concurrent operations may be employed.
  • In addition, although the description below references particular aspects and/or features illustrated in certain other figures, one or more operations may be performed with other aspects and/or features.
  • an example “sensors layer” may include collecting inputs from an image sensor, an IMU and/or other sensors, as depicted at block 710.
  • a sensors layer may also include obtaining inputs from a user, for example.
  • a user may provide inputs with respect to projectile (e.g., ammunition) characteristics and/or firearm characteristics, for example.
  • an example “drivers layer” may include timestamping and/or synchronizing inputs (e.g., sensor content) and/or may also include placing inputs in a first-in, first-out queue, for example.
  • an example “pre-processing layer” may include pre-processing inputs and/or translating signals and/or signal packets obtained from one or more sensors, including an image sensor, into higher-order constructs (e.g., camera roll, barrel rotation, action break, etc.), as depicted at block 730.
  • an example “fusion layer” may include fusing together higher-order constructs into system-wide constructs (e.g., target trajectory, projectile trajectory, key shooting points, etc.) as depicted at block 740.
  • an example “decision layer” may include making decisions based on system-wide constructs, as depicted at block 750. For example, decisions may be made with respect to determining appropriate feedback to provide to a user and/or to which state to transition the device and/or system.
  • an example “user interface (UI) layer” may include providing feedback to a user with respect to ways to modify the user’s activity for improved shooting performance, for example.
  • feedback types may include, by way of non-limiting examples, visual cues, audio cues and/or commands, haptic vibration, etc.
  • an example “applications/storage layer” may include storing video content and/or corresponding metadata content in a storage medium, as depicted at block 770. Applications/storage layer may also include transmitting video content and/or corresponding metadata content to an external device, such as smartphone, tablet device and/or computing device 250, for later retrieval and/or analysis, for example.
  • Table 1 provides additional example detail for particular implementations related to the various logical operational layers discussed above in connection with example process 700, for example.
  • Column A of table 1 indicates a particular layer.
  • Column B of table 1 specifies for particular layers example hardware (HW) circuits and/or components such as may be implemented in a targeting aid device, such as targeting aid device 300, and/or in an external device, such as smartphone, tablet device and/or computing device 250.
  • Column B additionally specifies particular example software/firmware agents and/or particular example operations for particular layers.
  • Column C of table 1 specifies the origins of inputs to be provided for elements specified in column B for particular layers.
  • FIG. 8 is a flow diagram depicting an embodiment 800 for processing image sensor content.
  • Example 800 may comprise a sub-routine with respect to example process 700, in an implementation.
  • example process 800 may be logically partitioned into a plurality of layers, wherein operations associated with the various layers may be performed at a targeting aid system and/or device, such as targeting aid device 300, and/or at a combination of a targeting aid device and one or more external devices, such as smartphone, tablet device and/or computing device 250.
  • Embodiments in accordance with claimed subject matter may include all of blocks 810-860, fewer than blocks 810-860, and/or more than blocks 810-860.
  • content acquired or produced such as, for example, input signals, output signals, operations, results, etc. associated with example embodiment 800 may be represented via one or more analog and/or digital signals and/or signal packets.
  • a sensors layer may include collecting inputs, such as signals and/or signal packets, from an image sensor, for example.
  • a drivers layer, as depicted at block 820, may comprise timestamping and/or synchronizing inputs from one or more sensors, such as an image sensor, in an implementation.
  • Drivers layer may also include placing signals and/or signal packets representative of inputs in a first-in, first-out queue, for example.
  • a pre-processing layer depicted at block 830 may include pre-processing inputs, such as image content obtained from an image sensor, to subtract a target from static background image content.
  • preprocessing layer may include calculating a predicted trajectory of a target in 3D space (e.g., relative to the image sensor’s position).
  • a fusion layer may include fusing a target trajectory with a projectile or projectile cloud trajectory to predict whether the two trajectories will meet if a trigger mechanism of a firearm is activated at a particular moment.
  • a decision layer, as depicted at block 850, may include determining what feedback to provide to the user, and a user interface layer, depicted at block 860, may include providing feedback to the user.
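  • For the trajectory-prediction portion of block 830, one assumed approach is a least-squares linear fit over a short history of timestamped 3-D target fixes; the camera calibration and range estimation such a fit would depend on are simply taken as given here, and the function names are illustrative:

```python
# Minimal sketch: estimate a target's direction and speed from a short history
# of timestamped 3-D position fixes via a least-squares fit of p(t) = p0 + v*t.
import numpy as np

def fit_target_track(times, positions):
    """Return (speed_mps, unit_direction, velocity_vector)."""
    t = np.asarray(times, float)
    p = np.asarray(positions, float)            # shape (N, 3)
    A = np.column_stack([np.ones_like(t), t])   # design matrix [1, t]
    coeffs, *_ = np.linalg.lstsq(A, p, rcond=None)
    v = coeffs[1]                               # velocity vector, m/s
    speed = float(np.linalg.norm(v))
    return speed, v / speed, v

times = [0.00, 0.05, 0.10, 0.15]
positions = [(30.0, 5.0, 8.0), (29.5, 5.0, 8.1), (29.0, 5.0, 8.2), (28.5, 5.0, 8.3)]
speed, direction, _ = fit_target_track(times, positions)
print(round(speed, 2), direction)   # about 10.2 m/s, mostly along -x with a small +z
```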
  • FIG. 9 is a flow diagram depicting an embodiment 900 for processing IMU content.
  • Process 900 may comprise a sub-routine with respect to example process 700, for example.
  • example process 900 may be logically partitioned into a plurality of layers, wherein operations associated with the various layers may be performed at a targeting aid system and/or device, such as targeting aid device 300, and/or at a combination of a targeting aid device and one or more external devices, such as smartphone, tablet device and/or computing device 250.
  • Embodiments in accordance with claimed subject matter may include all of blocks 910-960, fewer than blocks 910-960, and/or more than blocks 910-960.
  • Content acquired or produced, such as, for example, input signals, output signals, operations, results, etc. associated with example embodiment 900 may be represented via one or more analog and/or digital signals and/or signal packets. It should also be appreciated that even though one or more operations are illustrated or described concurrently or with respect to a certain sequence, other sequences or concurrent operations may be employed. In addition, although the description below references particular aspects and/or features illustrated in certain other figures, one or more operations may be performed with other aspects and/or features.
  • a sensors layer may include collecting inputs, such as signals and/or signal packets, from an inertial motion unit (IMU), for example.
  • a drivers layer may comprise timestamping and/or synchronizing inputs from one or more sensors, such as an IMU, in an implementation.
  • Drivers layer may also include placing signals and/or signal packets representative of inputs in a first-in, first-out queue.
  • a pre-processing layer may comprise pre-processing inputs, such as IMU content, to estimate barrel orientation (e.g., derived at least in part from gyroscope reading), rotation (e.g., derived at least in part from accelerometer reading) and/or position (e.g., derived at least in part from magnetometer reading).
  • Block 940 depicts a fusion layer that may include, for example, fusing barrel orientation, rotation and/or position with projectile speed to calculate a projectile or projectile cloud trajectory and/or to predict whether and/or where the target trajectory and projectile trajectory will meet; a numerical sketch of this fusion follows the description of the remaining layers below.
  • a decision layer depicted at block 950 may include determining what feedback to provide the user and user interface layer 960 may include providing feedback to the user.
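The following is a minimal, illustrative sketch of the fusion and decision steps described for blocks 940 and 950: an assumed barrel direction and muzzle speed give a drag-free projectile trajectory, which is compared against a constant-velocity target track to ask whether the two would meet within an assumed effective shot-cloud radius. All numeric values, names and the simplified ballistic model are assumptions for this sketch, not details taken from the disclosure.

import numpy as np

G = np.array([0.0, 0.0, -9.81])              # gravity, m/s^2 (z is "up")

def projectile_position(barrel_dir, muzzle_speed, t):
    """Drag-free projectile position (metres) at time t after firing."""
    d = np.asarray(barrel_dir, dtype=float)
    d = d / np.linalg.norm(d)
    return muzzle_speed * d * t + 0.5 * G * t**2

def target_position(p0, v, t):
    """Constant-velocity target position (metres) at time t."""
    return np.asarray(p0, dtype=float) + np.asarray(v, dtype=float) * t

def would_hit(barrel_dir, muzzle_speed, target_p0, target_v,
              cloud_radius=1.0, horizon=1.5, steps=300):
    """Return (hit?, miss distance, time of closest approach) by sampling
    both trajectories over a short time horizon."""
    ts = np.linspace(0.0, horizon, steps)
    best_d, best_t = float("inf"), 0.0
    for t in ts:
        d = np.linalg.norm(projectile_position(barrel_dir, muzzle_speed, t)
                           - target_position(target_p0, target_v, t))
        if d < best_d:
            best_d, best_t = d, t
    return best_d <= cloud_radius, best_d, best_t

# Example: a target crossing left-to-right about 30 m down range, with the
# barrel pointed slightly left of straight ahead (a small forward lead to
# where the crossing target will be when the shot arrives).
hit, miss, t_star = would_hit(
    barrel_dir=[-0.07, 1.0, 0.07],           # mostly down range (+y)
    muzzle_speed=400.0,                      # m/s, assumed shot velocity
    target_p0=[-3.0, 30.0, 2.0],             # m, left of the line of fire
    target_v=[12.0, 0.0, 0.0])               # m/s, crossing to the right
print(f"hit={hit} miss_distance={miss:.2f} m at t={t_star:.3f} s")

A decision layer could, for example, turn the returned miss distance and closest-approach time into the feedback ultimately presented by the user interface layer; the disclosure leaves the specific form of that feedback open.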
  • FIG. 10 is a block diagram depicting an example state diagram 1000 for an example targeting aid system and/or device, such as targeting aid device 300.
  • targeting aid device 300 may have several distinct states. Transitions between states may be initiated by particular events and/or conditions. Characteristics of particular states may be considered in conjunction with example pipeline 600 depicted in FIG. 6, discussed above, for example.
  • a transition from an initial “off” state 1010 to a “deep sleep” state 1020 may occur responsive, at least in part, to a user input such as a button press and/or voice command, for example.
  • a transition from deep sleep state 1020 to a “view” state 1030 may occur responsive at least in part to a closing of an action of a firearm (e.g., shotgun).
  • an opening of the firearm’s action may initiate a transition from view state 1030 back to deep sleep state 1020.
  • view state 1030 may include prebuffering of video frames and/or pre-buffering signals and/or signal packets obtained from one or more sensors such as, for example, an IMU, microphone, etc. “Prebuffering” in this context refers to storing signals and/or signal packets prior to issuance of a “pull” command (or some other command and/or input that may be interpreted to be similar to “pull”) from a user, for example.
  • view state 1030 may include buffering content from image sensor 602, audio codec 604 and/or other sensors 606 in buffers 610, 612 and/or 614, respectively, for example.
  • buffers 610, 612 and/or 614 may store a specified amount of sensor and/or image content.
  • buffers 610, 612 and/or 614 may store “n” seconds worth of image content, audio content and/or other sensor content.
  • buffers 610, 612 and/or 614 may comprise first-in, first-out buffers wherein oldest content may be replaced as the buffers become full, for example.
  • view state 1030 may include maintaining n seconds of image and/or sensor content, in an implementation.
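As one hedged illustration of the prebuffering described for view state 1030, a fixed-length first-in, first-out queue can hold roughly the last n seconds of timestamped content, discarding the oldest items as new ones arrive. The class name, assumed frame rate and tuple layout below are illustrative assumptions, not details of the disclosure.

from collections import deque

class PreBuffer:
    """Keeps roughly the last n_seconds of timestamped samples; once full,
    the oldest sample is dropped as each new one arrives (FIFO)."""
    def __init__(self, n_seconds, sample_rate_hz):
        self._buf = deque(maxlen=int(n_seconds * sample_rate_hz))

    def push(self, timestamp, sample):
        self._buf.append((timestamp, sample))

    def snapshot(self):
        """Return the buffered content, oldest first, e.g. on a 'pull' command."""
        return list(self._buf)

# Example usage: a 2-second video prebuffer at an assumed 30 frames per second.
video_prebuffer = PreBuffer(n_seconds=2, sample_rate_hz=30)
for i in range(200):                          # more frames than the buffer holds
    video_prebuffer.push(timestamp=i / 30.0, sample=f"frame-{i}")
frames = video_prebuffer.snapshot()
print(len(frames), frames[0][1], frames[-1][1])   # 60 frame-140 frame-199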
  • a transition from view state 1030 to a “record” state 1040 may occur responsive at least in part to a “pull” command and/or the like uttered and/or otherwise indicated by a user.
  • targeting aid device 300 may transition to a “process” state 1050, discussed below, responsive at least in part to a shot being fired. Because n seconds of content may be stored prior to a pull command, recording of video content, audio content and/or other sensor content during record state 1040 may begin n seconds prior to the pull command, in an implementation.
  • video content may be streamed to an external device, such as smartphone, tablet device and/or computing device 250, for example. Communications between targeting aid device 300 and external device 250 are discussed above, such as in connection with FIG. 2.
  • Pipeline 600 depicted in FIG. 6 also depicts streaming image content via a live streaming app 670 to a smartphone and/or tablet device, for example.
  • image content, audio content and/or other sensor content such as may be processed by video encoder 630, audio encoder 632 and/or content serializer 634, for example, may be written to a storage medium such as an SD card and/or SSD, for example, referring again to example pipeline 600 depicted in FIG. 6.
  • image content, audio content and/or other sensor content may be stored in analytics queue 620.
  • content stored in analytics queue 620 and/or buffers 610, 612 and/or 614 may be provided to pre-processing blocks 640, for example.
  • image content, audio content and/or other sensor content going back n seconds prior to a pull command and/or the like may be available for processing, in an implementation.
  • a transition from record state 1040 back to view state 1030 may occur responsive at least in part to a timeout (e.g., a shot is not detected within a specified period of time). Further, for example, a transition from record state 1040 to deep sleep state 1020 may occur responsive at least in part to an opening of the firearm action (e.g., indicating an end to a current shooting session).
  • process state 1050 may involve pre-processing blocks 640, fusion blocks 650 and/or user feedback generator 660, for example.
  • image content, audio content and/or other sensor content may continue to be written by writer block 638 to a storage medium, such as an SD card and/or an SSD, for example.
  • streaming of image content may continue via live stream app 670, for example.
  • a transition from process state 1050 to post-process state 1060 may occur responsive at least in part to a specified amount of time having passed since a most recently-detected shot.
  • recording of image content, audio content and/or other sensor content may cease.
  • post-process state 1060 may involve buffers 610, 612 and/or 614, analytics queue 620, pre-processing blocks 640, fusion blocks 650, user feedback generator block 660 and/or content serializer 662, for example.
  • a transition from post-process state 1060 to view state 1030 may occur responsive at least in part to completion of post-processing operations, for example.
  • post-processing operations may include processing (e.g., via pre-processing blocks 640, fusion blocks 650 and/or user feedback generator block 660) content stored in analytics queue 620.
  • post-process state 1060 may complete and the targeting aid device may automatically transition to view state 1030.
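The transitions described above for example state diagram 1000 can be summarized, purely for illustration, as a small transition table; the event names used below are assumptions, and the diagram itself governs the behavior described in the disclosure.

OFF, DEEP_SLEEP, VIEW, RECORD, PROCESS, POST_PROCESS = (
    "off", "deep_sleep", "view", "record", "process", "post_process")

TRANSITIONS = {
    (OFF, "power_button"): DEEP_SLEEP,        # user input, e.g. button press
    (DEEP_SLEEP, "action_closed"): VIEW,      # firearm action closed
    (VIEW, "action_opened"): DEEP_SLEEP,
    (VIEW, "pull_command"): RECORD,           # "pull" uttered or indicated
    (RECORD, "shot_detected"): PROCESS,
    (RECORD, "timeout"): VIEW,                # no shot within a set period
    (RECORD, "action_opened"): DEEP_SLEEP,    # end of the shooting session
    (PROCESS, "quiet_period_elapsed"): POST_PROCESS,
    (POST_PROCESS, "post_processing_done"): VIEW,
}

def next_state(state, event):
    """Return the state after `event`, or stay put if the event is ignored."""
    return TRANSITIONS.get((state, event), state)

# Example: a single simulated clay-target presentation.
state = OFF
for event in ("power_button", "action_closed", "pull_command",
              "shot_detected", "quiet_period_elapsed", "post_processing_done"):
    state = next_state(state, event)
    print(f"{event:>22} -> {state}")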
  • FIG. 11 is a flow diagram depicting an embodiment 1100 of an example process for determining barrel movement. Operations associated with example process 1100 may be performed at a targeting aid system and/or device, such as targeting aid device 300, and/or at one or more external devices, such as smartphone, tablet device and/or computing device 250.
  • Embodiments in accordance with claimed subject matter may include all of blocks 1110-1150, fewer than blocks 1110-1150, and/or more than blocks 1110-1150.
  • content acquired or produced such as, for example, input signals, output signals, operations, results, etc. associated with example embodiment 1100 may be represented via one or more analog and/or digital signals and/or signal packets.
  • even though one or more operations are illustrated or described concurrently or with respect to a certain sequence, other sequences or concurrent operations may be employed.
  • although the description below references particular aspects and/or features illustrated in certain other figures, one or more operations may be performed with other aspects and/or features.
  • inputs in the form of signals and/or signal packets may be obtained from an image sensor of a targeting aid device, such as targeting aid device 300, as indicated at block 1110. Further, as depicted at block 1120, a determination may be made as to whether permanent and/or static background features (e.g., trees) can be discerned in one or more video frames obtained via the image sensor inputs.
  • barrel movement may be determined based at least in part on the image sensor inputs, as indicated at block 1130. For example, when processing two consecutive frames of video, if a particular permanent and/or static background object, such as a tree, is noted to shift position from one frame to the next, it may be determined that the image sensor is moving. In an implementation, because targeting aid device 300 may be mounted to the barrel of the firearm, movement of the image sensor may correlate with movement of the barrel. At least in part by detecting how the permanent and/or static objects shift position from one frame to a subsequent frame, a particular movement of the barrel of the firearm may be calculated, for example.
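As an illustrative sketch of the image-based approach of block 1130, a frame-to-frame translation of static background features can be estimated by phase correlation and converted, under a small-angle assumption, into an angular movement of the barrel. The focal-length value, sign convention and function names are assumptions for this sketch, not details of the disclosed process.

import numpy as np

def frame_shift(prev_frame, next_frame):
    """Estimate the (row, col) translation of next_frame relative to
    prev_frame using phase correlation; positive values mean the scene
    appears to have moved down/right between the two frames."""
    f1 = np.fft.fft2(prev_frame.astype(float))
    f2 = np.fft.fft2(next_frame.astype(float))
    cross = np.conj(f1) * f2
    cross /= np.abs(cross) + 1e-9
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shifts = np.array(peak, dtype=float)
    # Wrap large indices to negative shifts (the FFT is circular).
    for axis, size in enumerate(corr.shape):
        if shifts[axis] > size // 2:
            shifts[axis] -= size
    return shifts                             # (d_row, d_col) in pixels

def pixels_to_angle(pixels, focal_length_px=600.0):
    """Small-angle conversion from a pixel shift to radians of barrel motion."""
    return np.arctan2(pixels, focal_length_px)

# Example: a static "tree" appears 3 pixels further left in the next frame,
# i.e. the barrel swung to the right between the two frames.
prev_frame = np.zeros((64, 64)); prev_frame[20:40, 30:33] = 1.0
next_frame = np.zeros((64, 64)); next_frame[20:40, 27:30] = 1.0
d_row, d_col = frame_shift(prev_frame, next_frame)
print("pixel shift:", d_row, d_col, "yaw (rad):", pixels_to_angle(d_col))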
  • barrel movement may be determined based at least in part on inputs obtained from an IMU of targeting aid device 300.
  • an IMU may provide readings related to linear acceleration and/or angular speed, for example.
  • barrel movement may be determined based at least in part on the linear acceleration and/or angular speed readings obtained from the IMU, for example.
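As a hedged sketch of the IMU-based determination described above, gyroscope angular-rate readings can be integrated over a short interval to approximate the accumulated barrel rotation and the mean swing rate. The sample rate, axis conventions and noise values below are assumptions for illustration only, and the complementary use of accelerometer and magnetometer readings is omitted.

import numpy as np

def integrate_gyro(angular_rates, dt):
    """Sum gyroscope readings (rad/s, shape (N, 3)) over time step dt to get
    an approximate accumulated rotation about each axis (rad); valid for the
    small angles covered during a short portion of a gun mount and swing."""
    return np.asarray(angular_rates, dtype=float).sum(axis=0) * dt

def swing_rate(angular_rates, dt):
    """Mean angular speed (rad/s) of the swing over the sampled interval."""
    total = integrate_gyro(angular_rates, dt)
    duration = len(angular_rates) * dt
    return np.linalg.norm(total) / duration

# Example usage: 100 samples at an assumed 1 kHz of a roughly constant
# 0.8 rad/s swing about the vertical (yaw) axis, with noise on the other axes.
rng = np.random.default_rng(0)
rates = np.column_stack([
    rng.normal(0.0, 0.01, 100),      # roll rate, rad/s
    rng.normal(0.0, 0.01, 100),      # pitch rate, rad/s
    rng.normal(0.8, 0.02, 100),      # yaw rate, rad/s (the swing)
])
print("accumulated rotation (rad):", integrate_gyro(rates, dt=1e-3))
print("mean swing rate (rad/s):", swing_rate(rates, dt=1e-3))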
  • The term “connection,” the term “component” and/or similar terms are intended to be physical, but are not necessarily always tangible. Whether or not these terms refer to tangible subject matter, thus, may vary in a particular context of usage.
  • a tangible connection and/or tangible connection path may be made, such as by a tangible, electrical connection, such as an electrically conductive path comprising metal or other conductor, that is able to conduct electrical current between two tangible components.
  • a tangible connection path may be at least partially affected and/or controlled, such that, as is typical, a tangible connection path may be open or closed, at times resulting from influence of one or more externally derived signals, such as external currents and/or voltages, such as for an electrical switch.
  • Non-limiting illustrations of an electrical switch include a transistor, a diode, etc.
  • a “connection” and/or “component,” in a particular context of usage likewise, although physical, can also be non-tangible, such as a connection between a client and a server over a network, particularly a wireless network, which generally refers to the ability for the client and server to transmit, receive, and/or exchange communications, as discussed in more detail later.
  • The terms “connected” and “coupled” are used in a manner so that they are not synonymous. Similar terms may also be used in a manner in which a similar intention is exhibited.
  • connected is used to indicate that two or more tangible components and/or the like, for example, are tangibly in direct physical contact.
  • two tangible components that are electrically connected are physically connected via a tangible electrical connection, as previously discussed.
  • coupled is used to mean that potentially two or more tangible components are tangibly in direct physical contact.
  • Coupled is also used to mean that two or more tangible components and/or the like are not necessarily tangibly in direct physical contact, but are able to co-operate, liaise, and/or interact, such as, for example, by being "optically coupled.” Likewise, the term “coupled” is also understood to mean indirectly connected. It is further noted, in the context of the present patent application, since memory, such as a memory component and/or memory states, is intended to be non-transitory, the term physical, at least if used in relation to memory, necessarily implies that such memory components and/or memory states, continuing with the example, are tangible.
  • deposition of a substance “on” a substrate refers to a deposition involving direct physical and tangible contact without an intermediary, such as an intermediary substance, between the substance deposited and the substrate in this latter example; nonetheless, deposition “over” a substrate, while understood to potentially include deposition “on” a substrate (since being “on” may also accurately be described as being “over”), is understood to include a situation in which one or more intermediaries, such as one or more intermediary substances, are present between the substance deposited and the substrate so that the substance deposited is not necessarily in direct physical and tangible contact with the substrate.
  • the term “one or more” and/or similar terms is used to describe any feature, structure, characteristic, and/or the like in the singular; “and/or” is also used to describe a plurality and/or some other combination of features, structures, characteristics, and/or the like.
  • the term “based on” and/or similar terms are understood as not necessarily intending to convey an exhaustive list of factors, but to allow for existence of additional factors not necessarily expressly described.
  • one or more measurements may respectively comprise a sum of at least two components.
  • one component may comprise a deterministic component, which in an ideal sense, may comprise a physical value (e.g., sought via one or more measurements), often in the form of one or more signals, signal samples and/or states, and one component may comprise a random component, which may have a variety of sources that may be challenging to quantify.
  • a statistical or stochastic model may be used in addition to a deterministic model as an approach to identification and/or prediction regarding one or more measurement values that may relate to claimed subject matter.
  • the terms “type” and/or “like,” if used, such as with a feature, structure, characteristic, and/or the like, using “optical” or “electrical” as simple examples, means at least partially of and/or relating to the feature, structure, characteristic, and/or the like in such a way that presence of minor variations, even variations that might otherwise not be considered fully consistent with the feature, structure, characteristic, and/or the like, do not in general prevent the feature, structure, characteristic, and/or the like from being of a “type” and/or being “like,” (such as being an “optical-type” or being “optical-like,” for example) if the minor variations are sufficiently minor so that the feature, structure, characteristic, and/or the like would still be considered to be substantially present with such variations also present.
  • optical-type and/or optical-like properties are necessarily intended to include optical properties.
  • electrical-type and/or electrical-like properties are necessarily intended to include electrical properties.
  • portions of a process such as signal processing of signal samples, for example, may be allocated among various devices, including one or more client devices and/or one or more server devices, via a computing and/or communications network, for example.
  • a network may comprise two or more devices, such as network devices and/or computing devices, and/or may couple devices, such as network devices and/or computing devices, so that signal communications, such as in the form of signal packets and/or signal frames (e.g., comprising one or more signal samples), for example, may be exchanged, such as between a server device and/or a client device, as well as other types of devices, including between wired and/or wireless devices coupled via a wired and/or wireless network, for example.
  • network device refers to any device capable of communicating via and/or as part of a network and may comprise a computing device. While network devices may be capable of communicating signals (e.g., signal packets and/or frames), such as via a wired and/or wireless network, they may also be capable of performing operations associated with a computing device, such as arithmetic and/or logic operations, processing and/or storing operations (e.g., storing signal samples), such as in memory as tangible, physical memory states, and/or may, for example, operate as a server device and/or a client device in various embodiments.
  • Network devices capable of operating as a server device, a client device and/or otherwise may include, as examples, dedicated rack-mounted servers, desktop computers, laptop computers, set top boxes, tablets, netbooks, smart phones, wearable devices, integrated devices combining two or more features of the foregoing devices, and/or the like, or any combination thereof.
  • The terms “server,” “server device,” “server computing device,” “server computing platform” and/or similar terms are used interchangeably. Similarly, the terms “client,” “client device,” “client computing device,” “client computing platform” and/or similar terms are also used interchangeably. While in some instances, for ease of description, these terms may be used in the singular, such as by referring to a “client device” or a “server device,” the description is intended to encompass one or more client devices and/or one or more server devices, as appropriate.
  • references to a “database” are understood to mean, one or more databases and/or portions thereof, as appropriate.
  • A network device (also referred to as a networking device) may be embodied and/or described in terms of a computing device and vice-versa.
  • this description should in no way be construed so that claimed subject matter is limited to one embodiment, such as only a computing device and/or only a network device, but, instead, may be embodied as a variety of devices or combinations thereof, including, for example, one or more illustrative examples.
  • a network may also include now known, and/or to be later developed arrangements, derivatives, and/or improvements, including, for example, past, present and/or future mass storage, such as network attached storage (NAS), a storage area network (SAN), and/or other forms of device readable media, for example.
  • a network may include a portion of the Internet, one or more local area networks (LANs), one or more wide area networks (WANs), wire-line type connections, wireless type connections, other connections, or any combination thereof.
  • a network may be worldwide in scope and/or extent.
  • sub-networks such as may employ differing architectures and/or may be substantially compliant and/or substantially compatible with differing protocols, such as network computing and/or communications protocols (e.g., network protocols), may interoperate within a larger network.
  • sub-network and/or similar terms if used, for example, with respect to a network, refers to the network and/or a part thereof.
  • Sub-networks may also comprise links, such as physical links, connecting and/or coupling nodes, so as to be capable to communicate signal packets and/or frames between devices of particular nodes, including via wired links, wireless links, or combinations thereof.
  • Various types of devices such as network devices and/or computing devices, may be made available so that device interoperability is enabled and/or, in at least some instances, may be transparent.
  • the term “transparent,” if used with respect to devices of a network refers to devices communicating via the network in which the devices are able to communicate via one or more intermediate devices, such as one or more intermediate nodes, but without the communicating devices necessarily specifying the one or more intermediate nodes and/or the one or more intermediate devices of the one or more intermediate nodes and/or, thus, may include within the network the devices communicating via the one or more intermediate nodes and/or the one or more intermediate devices of the one or more intermediate nodes, but may engage in signal communications as if such intermediate nodes and/or intermediate devices are not necessarily involved.
  • a router may provide a link and/or connection between otherwise separate and/or independent LANs.
  • a “private network” refers to a particular, limited set of devices, such as network devices and/or computing devices, able to communicate with other devices, such as network devices and/or computing devices, in the particular, limited set, such as via signal packet and/or signal frame communications, for example, without a need for re-routing and/or redirecting signal communications.
  • a private network may comprise a stand-alone network; however, a private network may also comprise a subset of a larger network, such as, for example, without limitation, all or a portion of the Internet.
  • a private network “in the cloud” may refer to a private network that comprises a subset of the Internet.
  • signal communications may employ intermediate devices of intermediate nodes to exchange signal packets and/or signal frames, those intermediate devices may not necessarily be included in the private network by not being a source or designated destination for one or more signal packets and/or signal frames, for example. It is understood in the context of the present patent application that a private network may direct outgoing signal communications to devices not in the private network, but devices outside the private network may not necessarily be able to direct inbound signal communications to devices included in the private network.
  • the Internet refers to a decentralized global network of interoperable networks that comply with the Internet Protocol (IP). It is noted that there are several versions of the Internet Protocol.
  • the term Internet Protocol, IP, and/or similar terms are intended to refer to any version, now known and/or to be later developed.
  • the Internet includes local area networks (LANs), wide area networks (WANs), wireless networks, and/or long haul public networks that, for example, may allow signal packets and/or frames to be communicated between LANs.
  • The terms “World Wide Web,” “WWW,” “Web” and/or similar terms may also be used, although these refer to a part of the Internet that complies with the Hypertext Transfer Protocol (HTTP).
  • network devices may engage in an HTTP session through an exchange of appropriately substantially compatible and/or substantially compliant signal packets and/or frames.
  • It is noted that there are several versions of the Hypertext Transfer Protocol.
  • the term Hypertext Transfer Protocol, HTTP, and/or similar terms are intended to refer to any version, now known and/or to be later developed.
  • substitution of the term Internet with the term World Wide Web (“Web”) may be made without a significant departure in meaning and may, therefore, also be understood in that manner if the statement would remain correct with such a substitution.
  • the Internet and/or the Web may without limitation provide a useful example of an embodiment at least for purposes of illustration.
  • the Internet and/or the Web may comprise a worldwide system of interoperable networks, including interoperable devices within those networks.
  • the Internet and/or Web has evolved to a public, self-sustaining facility accessible to potentially billions of people or more worldwide.
  • the terms “WWW” and/or “Web” refer to a part of the Internet that complies with the Hypertext Transfer Protocol.
  • the Internet and/or the Web may comprise a service that organizes stored digital content, such as, for example, text, images, video, etc., through the use of hypermedia, for example.
  • a network such as the Internet and/or Web, may be employed to store electronic files and/or electronic documents.
  • electronic file and/or the term electronic document are used throughout this document to refer to a set of stored memory states and/or a set of physical signals associated in a manner so as to thereby at least logically form a file (e.g., electronic) and/or an electronic document. That is, it is not meant to implicitly reference a particular syntax, format and/or approach used, for example, with respect to a set of associated memory states and/or a set of associated physical signals. If a particular type of file storage format and/or syntax, for example, is intended, it is referenced expressly. It is further noted an association of memory states, for example, may be in a logical sense and not necessarily in a tangible, physical sense. Thus, although signal and/or state components of a file and/or an electronic document, for example, are to be associated logically, storage thereof, for example, may reside in one or more different places in a tangible, physical memory, in an embodiment.
  • Web site and/or similar terms refer to Web pages that are associated electronically to form a particular collection thereof.
  • Web page and/or similar terms refer to an electronic file and/or an electronic document accessible via a network, including by specifying a uniform resource locator (URL) for accessibility via the Web, in an example embodiment.
  • a Web page may comprise digital content coded (e.g., via computer instructions) using one or more languages, such as, for example, markup languages, including HTML and/or XML, although claimed subject matter is not limited in scope in this respect.
  • application developers may write code (e.g., computer instructions) in the form of JavaScript (or other programming languages), for example, executable by a computing device to provide digital content to populate an electronic document and/or an electronic file in an appropriate format, such as for use in a particular application, for example.
  • Use of the term “JavaScript” and/or similar terms intended to refer to one or more particular programming languages is intended to refer to any version of the one or more programming languages identified, now known and/or to be later developed.
  • JavaScript is merely an example programming language.
  • claimed subject matter is not intended to be limited to examples and/or illustrations.
  • the terms “entry,” “electronic entry,” “document,” “electronic document,” “content,” “digital content,” “item,” and/or similar terms are meant to refer to signals and/or states in a physical format, such as a digital signal and/or digital state format, e.g., that may be perceived by a user if displayed, played, tactilely generated, etc. and/or otherwise executed by a device, such as a digital device, including, for example, a computing device, but otherwise might not necessarily be readily perceivable by humans (e.g., if in a digital format).
  • an electronic document and/or an electronic file may comprise a Web page of code (e.g., computer instructions) in a markup language executed or to be executed by a computing and/or networking device, for example.
  • an electronic document and/or electronic file may comprise a portion and/or a region of a Web page.
  • an electronic document and/or electronic file may comprise a number of components.
  • a component is physical, but is not necessarily tangible.
  • components with reference to an electronic document and/or electronic file in one or more embodiments, may comprise text, for example, in the form of physical signals and/or physical states (e.g., capable of being physically displayed).
  • memory states for example, comprise tangible components, whereas physical signals are not necessarily tangible, although signals may become (e.g., be made) tangible, such as if appearing on a tangible display, for example, as is not uncommon.
  • components with reference to an electronic document and/or electronic file may comprise a graphical object, such as, for example, an image, such as a digital image, and/or sub-objects, including attributes thereof, which, again, comprise physical signals and/or physical states (e.g., capable of being tangibly displayed).
  • digital content may comprise, for example, text, images, audio, video, and/or other types of electronic documents and/or electronic files, including portions thereof, for example.
  • parameters refer to material descriptive of a collection of signal samples, such as one or more electronic documents and/or electronic files, and exist in the form of physical signals and/or physical states, such as memory states.
  • one or more parameters such as referring to an electronic document and/or an electronic file comprising an image, may include, as examples, time of day at which an image was captured, latitude and longitude of an image capture device, such as a camera, for example, etc.
  • one or more parameters relevant to digital content such as digital content comprising a technical article, as an example, may include one or more authors, for example.
  • Claimed subject matter is intended to embrace meaningful, descriptive parameters in any format, so long as the one or more parameters comprise physical signals and/or states, which may include, as parameter examples, collection name (e.g., electronic file and/or electronic document identifier name), technique of creation, purpose of creation, time and date of creation, logical path if stored, coding formats (e.g., type of computer instructions, such as a markup language) and/or standards and/or specifications used so as to be protocol compliant (e.g., meaning substantially compliant and/or substantially compatible) for one or more uses, and so forth.
  • Signal packet communications and/or signal frame communications may be communicated between nodes of a network, where a node may comprise one or more network devices and/or one or more computing devices, for example.
  • a node may comprise one or more sites employing a local network address, such as in a local network address space.
  • a device such as a network device and/or a computing device, may be associated with that node.
  • transmission is intended as another term for a type of signal communication that may occur in any one of a variety of situations. Thus, it is not intended to imply a particular directionality of communication and/or a particular initiating end of a communication path for the “transmission” communication.
  • the mere use of the term in and of itself is not intended, in the context of the present patent application, to have particular implications with respect to the one or more signals being communicated, such as, for example, whether the signals are being communicated “to” a particular device, whether the signals are being communicated “from” a particular device, and/or regarding which end of a communication path may be initiating communication, such as, for example, in a “push type” of signal transfer or in a “pull type” of signal transfer.
  • push and/or pull type signal transfers are distinguished by which end of a communications path initiates signal transfer.
  • a signal packet and/or frame may, as an example, be communicated via a communication channel and/or a communication path, such as comprising a portion of the Internet and/or the Web, from a site via an access node coupled to the Internet or vice-versa.
  • a signal packet and/or frame may be forwarded via network nodes to a target site coupled to a local network, for example.
  • a signal packet and/or frame communicated via the Internet and/or the Web may be routed via a path, such as either being “pushed” or “pulled,” comprising one or more gateways, servers, etc.
  • One or more gateways, servers, etc. may, for example, route a signal packet and/or frame, such as, for example, substantially in accordance with a target and/or destination address and availability of a network path of network nodes to the target and/or destination address.
  • the Internet and/or the Web comprise a network of interoperable networks, not all of those interoperable networks are necessarily available and/or accessible to the public.
  • a network protocol such as for communicating between devices of a network, may be characterized, at least in part, substantially in accordance with a layered description, such as the so-called Open Systems Interconnection (OSI) seven layer type of approach and/or description.
  • a network computing and/or communications protocol (also referred to as a network protocol) refers to a set of signaling conventions, such as for communication transmissions, for example, as may take place between and/or among devices in a network.
  • the term “between” and/or similar terms are understood to include “among” if appropriate for the particular usage and vice-versa.
  • the terms “compatible with,” “comply with” and/or similar terms are understood to respectively include substantial compatibility and/or substantial compliance.
  • a network protocol such as protocols characterized substantially in accordance with the aforementioned OSI description, has several layers. These layers are referred to as a network stack. Various types of communications (e.g., transmissions), such as network communications, may occur across various layers.
  • a lowest level layer in a network stack such as the so-called physical layer, may characterize how symbols (e.g., bits and/or bytes) are communicated as one or more signals (and/or signal samples) via a physical medium (e.g., twisted pair copper wire, coaxial cable, fiber optic cable, wireless air interface, combinations thereof, etc.).
  • Progressing to higher-level layers in a network protocol stack, additional operations and/or features may be available via engaging in communications that are substantially compatible and/or substantially compliant with a particular network protocol at these higher-level layers.
  • higher-level layers of a network protocol may, for example, affect device permissions, user permissions, etc.
  • a network and/or sub-network may communicate via signal packets and/or signal frames, such as via participating digital devices and may be substantially compliant and/or substantially compatible with, but is not limited to, now known and/or to be developed, versions of any of the following network protocol stacks: ARCNET, AppleTalk, ATM, Bluetooth, DECnet, Ethernet, FDDI, Frame Relay, HIPPI, IEEE 1394, IEEE 802.11, IEEE-488, Internet Protocol Suite, IPX, Myrinet, OSI Protocol Suite, QsNet, RS-232, SPX, System Network Architecture, Token Ring, USB, and/or X.25.
  • a network and/or sub-network may employ, for example, a version, now known and/or later to be developed, of the following: TCP/IP, UDP, DECnet, NetBEUI, IPX, AppleTalk and/or the like.
  • Versions of the Internet Protocol (IP) may include IPv4, IPv6, and/or other later to be developed versions.
  • a wireless network may couple devices, including client devices, with the network.
  • a wireless network may employ stand-alone, ad-hoc networks, mesh networks, Wireless LAN (WLAN) networks, cellular networks, and/or the like.
  • a wireless network may further include a system of terminals, gateways, routers, and/or the like coupled by wireless radio links, and/or the like, which may move freely, randomly and/or organize themselves arbitrarily, such that network topology may change, at times even rapidly.
  • a wireless network may further employ a plurality of network access technologies, including a version of Long Term Evolution (LTE), WLAN, Wireless Router (WR) mesh, 2nd, 3rd, 4th, or 5th generation (2G, 3G, 4G, or 5G) cellular technology and/or the like, whether currently known and/or to be later developed.
  • Network access technologies may enable wide area coverage for devices, such as computing devices and/or network devices, with varying degrees of mobility, for example.
  • a network may enable radio frequency and/or other wireless type communications via a wireless network access technology and/or air interface, such as Global System for Mobile communication (GSM), Universal Mobile Telecommunications System (UMTS), General Packet Radio Services (GPRS), Enhanced Data GSM Environment (EDGE), 3GPP Long Term Evolution (LTE), LTE Advanced, Wideband Code Division Multiple Access (WCDMA), Bluetooth, ultra-wideband (UWB), 802.11b/g/n, and/or the like.
  • a system embodiment may comprise a local network (e.g., device 1204 and medium 1240) and/or another type of network, such as a computing and/or communications network.
  • FIG. 12 shows an embodiment 1200 of a system that may be employed to implement either type or both types of networks.
  • Network 1208 may comprise one or more network connections, links, processes, services, applications, and/or resources to facilitate and/or support communications, such as an exchange of communication signals, for example, between a computing device, such as 1202, and another computing device, such as 1206, which may, for example, comprise one or more client computing devices and/or one or more server computing devices.
  • network 1208 may comprise wireless and/or wired communication links, telephone and/or telecommunications systems, Wi-Fi networks, Wi-MAX networks, the Internet, a local area network (LAN), a wide area network (WAN), or any combinations thereof.
  • Example devices in FIG. 12 may comprise features, for example, of a client computing device and/or a server computing device, in an embodiment. It is further noted that the term computing device, in general, whether employed as a client and/or as a server, or otherwise, refers at least to a processor and a memory connected by a communication bus.
  • first and third devices 1202 and 1206 may be capable of rendering a graphical user interface (GUI) for a network device and/or a computing device, for example, so that a user-operator may engage in system use.
  • Device 1204 may potentially serve a similar function in this illustration.
  • computing device 1202 (‘first device’ in figure) may interface with computing device 1204 (‘second device’ in figure), which may, for example, also comprise features of a client computing device and/or a server computing device, in an embodiment.
  • Processor (e.g., processing device) 1220 and memory 1222, which may comprise primary memory 1224 and secondary memory 1226, may communicate by way of a communication bus 1215, for example.
  • computing device refers to a system and/or a device, such as a computing apparatus, that includes a capability to process (e.g., perform computations) and/or store digital content, such as electronic files, electronic documents, measurements, text, images, video, audio, sensor content, etc. in the form of signals and/or states.
  • a computing device in the context of the present patent application, may comprise hardware, software, firmware, or any combination thereof (other than software per se).
  • Computing device 1204 is merely one example, and claimed subject matter is not limited in scope to this particular example.
  • a device such as a computing device and/or networking device, may comprise, for example, any of a wide range of digital electronic devices, including, but not limited to, targeting aid devices, desktop and/or notebook computers, high-definition televisions, digital versatile disc (DVD) and/or other optical disc players and/or recorders, game consoles, satellite television receivers, cellular telephones, tablet devices, wearable devices, personal digital assistants, mobile audio and/or video playback and/or recording devices, Internet of Things (IOT) type devices, endpoint and/or sensor nodes, gateway devices, or any combination of the foregoing.
  • a process as described, such as with reference to flow diagrams and/or otherwise, may also be executed and/or affected, in whole or in part, by a computing device and/or a network device.
  • a device such as a computing device and/or network device, may vary in terms of capabilities and/or features. Claimed subject matter is intended to cover a wide range of potential variations.
  • a device may include a numeric keypad and/or other display of limited functionality, such as a monochrome liquid crystal display (LCD) for displaying text, for example.
  • a web-enabled device may include a physical and/or a virtual keyboard, mass storage, one or more accelerometers, one or more gyroscopes, global positioning system (GPS) and/or other location-identifying type capability, and/or a display with a higher degree of functionality, such as a touch-sensitive color 2D or 3D display, for example.
  • communications between a computing device and/or a network device and a wireless network may be in accordance with known and/or to be developed network protocols including, for example, global system for mobile communications (GSM), enhanced data rate for GSM evolution (EDGE), 802.11b/g/n/h, etc., and/or worldwide interoperability for microwave access (WiMAX).
  • a computing device and/or a networking device may also have a subscriber identity module (SIM) card, which, for example, may comprise a detachable or embedded smart card that is able to store subscription content of a user, and/or is also able to store a contact list.
  • A SIM card may also be electronic, meaning that it may simply be stored in a particular location in memory of the computing and/or networking device.
  • a user may own the computing device and/or network device or may otherwise be a user, such as a primary user, for example.
  • a device may be assigned an address by a wireless network operator, a wired network operator, and/or an Internet Service Provider (ISP).
  • an address may comprise a domestic or international telephone number, an Internet Protocol (IP) address, and/or one or more other identifiers.
  • a computing and/or communications network may be embodied as a wired network, wireless network, or any combinations thereof.
  • a computing and/or network device may include and/or may execute a variety of now known and/or to be developed operating systems, derivatives and/or versions thereof, including computer operating systems, such as Windows, iOS or Linux, and/or a mobile operating system, such as iOS, Android, Windows Mobile, and/or the like.
  • a computing device and/or network device may include and/or may execute a variety of possible applications, such as a client software application enabling communication with other devices.
  • one or more messages may be communicated, such as via one or more protocols, now known and/or later to be developed, suitable for communication of email, short message service (SMS), and/or multimedia message service (MMS), including via a network, such as a social network, formed at least in part by a portion of a computing and/or communications network, including, but not limited to, Facebook, LinkedIn, Twitter, and/or Flickr, to provide only a few examples.
  • a computing and/or network device may also include executable computer instructions to process and/or communicate digital content, such as, for example, textual content, digital multimedia content, sensor content, and/or the like.
  • a computing and/or network device may also include executable computer instructions to perform a variety of possible tasks, such as browsing, searching, playing various forms of digital content, including locally stored and/or streamed video, and/or games such as, but not limited to, fantasy sports leagues.
  • computing device 1202 may provide one or more sources of executable computer instructions in the form of physical states and/or signals (e.g., stored in memory states), for example.
  • Computing device 1202 may communicate with computing device 1204 by way of a network connection, such as via network 1208, for example.
  • a connection while physical, may not necessarily be tangible.
  • While computing device 1204 of FIG. 12 shows various tangible, physical components, claimed subject matter is not limited to computing devices having only these tangible components as other implementations and/or embodiments may include alternative arrangements that may comprise additional tangible components or fewer tangible components, for example, that function differently while achieving similar results. Rather, examples are provided merely as illustrations. It is not intended that claimed subject matter be limited in scope to illustrative examples.
  • Memory 1222 may comprise any non-transitory storage mechanism.
  • Memory 1222 may comprise, for example, primary memory 1224 and secondary memory 1226; additional memory circuits, mechanisms, or combinations thereof may also be used.
  • Memory 1222 may comprise, for example, random access memory, read only memory, etc., such as in the form of one or more storage devices and/or systems, such as, for example, a disk drive including an optical disc drive, a tape drive, a solid-state memory drive, etc., just to name a few examples.
  • Memory 1222 may be utilized to store a program of executable computer instructions. For example, processor 1220 may fetch executable instructions from memory and proceed to execute the fetched instructions. Memory 1222 may also comprise a memory controller for accessing device-readable medium 1240 that may carry and/or make accessible digital content, which may include code, and/or instructions, for example, executable by processor 1220 and/or some other device, such as a controller, as one example, capable of executing computer instructions, for example.
  • a non-transitory memory, such as memory cells storing physical states (e.g., memory states) comprising, for example, a program of executable computer instructions, may be executed by processor 1220 to generate signals to be communicated via a network, for example, as previously described. Generated signals may also be stored in memory, as previously suggested.
  • Memory 1222 may store electronic files and/or electronic documents, such as relating to one or more users, and may also comprise a computer-readable medium that may carry and/or make accessible content, including code and/or instructions, for example, executable by processor 1220 and/or some other device, such as a controller, as one example, capable of executing computer instructions, for example.
  • the term electronic file and/or the term electronic document are used throughout this document to refer to a set of stored memory states and/or a set of physical signals associated in a manner so as to thereby form an electronic file and/or an electronic document.
  • Algorithmic descriptions and/or symbolic representations are examples of techniques used by those of ordinary skill in the signal processing and/or related arts to convey the substance of their work to others skilled in the art.
  • An algorithm, in the context of the present patent application and generally, is considered to be a self-consistent sequence of operations and/or similar signal processing leading to a desired result.
  • operations and/or processing involve physical manipulation of physical quantities.
  • such quantities may take the form of electrical and/or magnetic signals and/or states capable of being stored, transferred, combined, compared, processed and/or otherwise manipulated, for example, as electronic signals and/or states making up components of various forms of digital content, such as signal measurements, text, images, video, audio, etc.
  • a special purpose computer and/or a similar special purpose computing and/or network device is capable of processing, manipulating and/or transforming signals and/or states, typically in the form of physical electronic and/or magnetic quantities, within memories, registers, and/or other storage devices, processing devices, and/or display devices of the special purpose computer and/or similar special purpose computing and/or network device.
  • the term “specific apparatus” therefore includes a general purpose computing and/or network device, such as a general purpose computer, once it is programmed to perform particular functions, such as pursuant to program software instructions.
  • operation of a memory device may comprise a transformation, such as a physical transformation.
  • a physical transformation may comprise a physical transformation of an article to a different state or thing.
  • a change in state may involve an accumulation and/or storage of charge or a release of stored charge.
  • a change of state may comprise a physical change, such as a transformation in magnetic orientation.
  • a physical change may comprise a transformation in molecular structure, such as from crystalline form to amorphous form or vice-versa.
  • a change in physical state may involve quantum mechanical phenomena, such as, superposition, entanglement, and/or the like, which may involve quantum bits (qubits), for example.
  • the foregoing is not intended to be an exhaustive list of all examples in which a change in state from a binary one to a binary zero or vice-versa in a memory device may comprise a transformation, such as a physical, but non-transitory, transformation. Rather, the foregoing is intended as illustrative examples.
  • processor 1220 may comprise one or more circuits, such as digital circuits, to perform at least a portion of a computing procedure and/or process.
  • processor 1220 may comprise one or more processors, such as controllers, microprocessors, microcontrollers, application specific integrated circuits, digital signal processors, programmable logic devices, field programmable gate arrays, the like, or any combination thereof.
  • processor 1220 may perform signal processing, typically substantially in accordance with fetched executable computer instructions, such as to manipulate signals and/or states, to construct signals and/or states, etc., with signals and/or states generated in such a manner to be communicated and/or stored in memory, for example.
  • FIG. 12 also illustrates device 1204 as including a component 1232 operable with input/output devices, for example, so that signals and/or states may be appropriately communicated between devices, such as device 1204 and an input device and/or device 1204 and an output device.
  • a user may make use of an input device, such as a computer mouse, stylus, track ball, keyboard, and/or any other similar device capable of receiving user actions and/or motions as input signals.
  • a user may speak to a device to generate input signals.
  • a user may make use of an output device, such as a display, a printer, etc., and/or any other device capable of providing signals and/or generating stimuli for a user, such as visual stimuli, audio stimuli and/or other similar stimuli.

Abstract

The present disclosure relates generally to systems, devices, and/or methods that may operate to provide training to a user of a firearm, such as a shotgun, rifle, handgun, or the like, so as to improve the user's performance with respect to hitting targets in flight via projectiles fired or discharged from the firearm.

Description

TARGETING AID SYSTEM, DEVICE, AND/OR METHOD
BACKGROUND
Field
The present disclosure relates generally to systems, devices, and/or methods that may operate to provide training to a user of a firearm so as to improve the user’s performance with respect to hitting targets in flight via projectiles fired or discharged from the firearm.
Information
In a sport shooting environment (e.g., in which clay target or a “pigeon” may be launched from a ground-based launcher) and/or in a hunting environment (e.g., bird hunting), for example, a user of a firearm (e.g., shotgun, etc.) may fire one or more projectiles (e.g., lead shot, bullet, buck shot, etc.) in the direction of a target in flight in an effort to hit the target with the one or more projectiles. Failure to hit the target in flight may lead to frustration in the user. A user may wish to train with an intention of improving shooting performance. However, improving a user’s shooting performance may be complicated by several factors. For example, success in hitting a target in flight with one or more projectiles discharged from a firearm may depend, at least in part, on multiple events and/or factors that may occur over the course of a relatively very short period of time. Also, for example, analyzing and/or reviewing such events may be difficult post-event. Factors that may contribute to the success, or lack thereof, of hitting a target in flight with one or more projectiles discharged from a firearm may include, for example, where the user is aiming at particular points in time, how the user manipulates the firearm, direction and/or speed of the target, weather conditions, user’s condition (e.g., a pulse rate, etc.), or the like.
Because a target in flight and one or more projectiles fired or discharged from a firearm may each travel at relatively higher speeds, it may be difficult for an instructor, for example, to provide real-time and/or near real-time instructions to a user as to how to improve a probability of hitting the target. Post-event analysis and/or instruction may also be problematic and/or less effective and/or less efficient, as mentioned. Accordingly, training aids directed to improving a user’s performance with respect to hitting targets in flight using a firearm continue to be an active area of investigation and/or development.
BRIEF DESCRIPTION OF THE DRAWINGS
Claimed subject matter is particularly pointed out and distinctly claimed in the concluding portion of the specification. However, both as to organization and/or method of operation, features, and/or advantages thereof, it may best be understood by reference to the following detailed description if read with the accompanying drawings in which:
FIG. 1A is a diagram illustrating potential use of an example targeting aid system and/or device, according to an embodiment;
FIG. 1B is a diagram illustrating potential use of an example targeting aid system and/or device in a hunting context, according to an embodiment;
FIG. 2 is a schematic diagram illustrating an example communications infrastructure including an example targeting aid system and/or device, in accordance with an embodiment;
FIG. 3 is a schematic block diagram depicting an example targeting aid device, according to an embodiment;
FIG. 4A is an illustration depicting a perspective view of an example targeting aid device, according to an embodiment;
FIG. 4B is an illustration depicting a front (i.e., user’s perspective) view of an example targeting aid device, according to an embodiment;
FIG. 5 is a flow diagram depicting an example process, at a targeting aid system and/or device, for providing feedback to a user based at least in part on signals and/or signal packets obtained from at least one sensor of the targeting aid system and/or device, according to an embodiment;
FIG. 6 is a block diagram depicting an example processing pipeline of an example targeting aid system and/or device, according to an embodiment;
FIG. 7 is a flow diagram depicting an example process, at a targeting aid system and/or device, for processing sensor content to provide feedback to a user, according to an embodiment;
FIG. 8 is a flow diagram depicting an example process, at a targeting aid system and/or device, for processing image sensor content, according to an embodiment;
FIG. 9 is a flow diagram depicting an example process, at a targeting aid system and/or device, for processing inertial motion unit content, according to an embodiment;
FIG. 10 is a block diagram depicting an example state diagram for an example targeting aid system and/or device, according to an embodiment;
FIG. 11 is a flow diagram depicting an example process for determining barrel movement, according to an embodiment; and
FIG. 12 is a schematic diagram illustrating an implementation of an example computing environment associated with processes to facilitate multi-party and/or delegated computing according to an embodiment.
Reference is made in the following detailed description to the accompanying drawings, which form a part hereof, wherein like numerals may designate like parts throughout that are corresponding and/or analogous. It may be appreciated that figures are not necessarily rendered to scale, such as for simplicity and/or clarity of illustration. For example, dimensions of some aspects may be exaggerated relative to others, one or more aspects, properties, etc. may be omitted, such as for ease of discussion, or the like. Further, it is to be understood that other embodiments may be utilized. Furthermore, structural and/or other changes may be made without departing from claimed subject matter. References throughout this specification to “claimed subject matter” refer to subject matter intended to be covered by one or more claims, or any portion thereof, and are not necessarily intended to refer to a complete claim set, to a particular combination of claim sets (e.g., method claims, apparatus claims, etc.), or to a particular claim.
DETAILED DESCRIPTION
References throughout this specification to one implementation, an implementation, one embodiment, an embodiment, and/or the like means that a particular feature, structure, characteristic, and/or the like described in relation to a particular implementation and/or embodiment is included in at least one implementation and/or embodiment of claimed subject matter. Thus, appearances of such phrases in various places throughout this specification, are not necessarily intended to refer to the same implementation and/or embodiment or to any one particular implementation and/or embodiment. Furthermore, it is to be understood that particular features, structures, characteristics, and/or the like described, are capable of being combined in various ways in one or more implementations and/or embodiments and, therefore, are within intended claim scope. In general, for the specification of a patent application, these and other issues have a potential to vary in a particular context of usage. In other words, throughout the disclosure, particular context of description and/or usage provides guidance regarding reasonable inferences to be drawn; however, likewise, the term “in this context” in general without further qualification refers at least to the context of the present patent application.
As mentioned, in a sport shooting environment (e.g., in which a clay target or “pigeon” may be launched from a ground-based launcher) and/or in a hunting environment (e.g., bird hunting), for example, a user of a firearm, such as a shotgun, rifle, handgun, or the like, may fire one or more projectiles (e.g., lead shot, bullet, buck shot, etc.) in the direction of a target in flight in an effort to hit the target with the one or more projectiles. Failure to hit the target in flight may lead to frustration in the user. A user may wish to train with an intention of improving shooting performance. However, improving a user’s shooting performance may be complicated by several factors. For example, success in hitting a target in flight with one or more projectiles discharged from a firearm may depend, at least in part, on multiple events and/or factors that may occur over the course of a relatively very short period of time. Also, for example, analyzing and/or reviewing such events may be difficult post-event. Factors that may contribute to the success, or lack thereof, of hitting a target in flight with one or more projectiles discharged from a firearm may include, for example, where the user is aiming at particular points in time, how the user manipulates the firearm, direction and speed of the target, weather conditions, user’s condition (e.g., a pulse rate, etc.), or the like.
Because a target in flight and one or more projectiles fired or discharged from a firearm may each travel at relatively higher speeds, it may be difficult for an instructor, for example, to provide real-time and/or near real-time instructions to a user as to how to improve a probability of hitting the target. Post-event analysis and/or instruction may also be problematic, as mentioned. Accordingly, training aids directed to improving a user’s performance with respect to hitting targets in flight using a firearm continue to be an active area of investigation and/or development.
To address challenges in helping a user improve performance with respect to hitting targets in flight using a firearm, example embodiments described herein may include a targeting aid system, device, and/or method that may process signals and/or signal packets obtained from at least one sensor (e.g., image sensor) to provide feedback to a user in real-time and/or in near real-time during shooting activities (e.g., trap shooting, skeet shooting, fowl hunting, etc.). Feedback and/or analysis, including a visual display, for example, may also be provided to a user post-event based at least in part on sensor content obtained during shooting activities. “Real-time,” “near real-time” and/or the like in this context refers to providing feedback in a timely enough fashion to allow a user to act upon the feedback during a current shooting activity (e.g., while tracking and/or engaging a target). More generally, “real-time” and/or “near real-time” refers to the approximate actual time during which a process takes place and/or an event or activity occurs. “Post-event” and/or the like in this context refers to providing feedback and/or analysis of one or more shooting activities after the one or more shooting activities have completed. Various example embodiments and/or implementations are described below. Although particular example embodiments and/or implementations are described herein, subject matter is not limited in scope in these respects.
In particular implementations, a targeting aid system may provide training assistance so as to provide feedback to a user which may increase the user’s capability to hit a target in flight (e.g., a clay pigeon, bird, etc.) with one or more projectiles (e.g., lead shot) ejected from a firearm (e.g., shotgun). In implementations, a targeting aid device may be mounted to a barrel of a firearm. Such a targeting aid device, in implementations, may not merely comprise an image sensor (e.g., camera) but rather may also perform calculations and/or otherwise process sensor content, including image (e.g., video) content, for example, to provide feedback (e.g., real-time, near real-time and/or post-event) to a user of the firearm.
A targeting aid system may, for example, facilitate feedback, such as audio cues, visual cues, haptic cues, and so forth, which may indicate to a user adjustments that may be made with respect to firearm positioning and/or movement and/or which may indicate to the user an appropriate time at which the user may articulate a triggering mechanism so as to fire one or more projectiles towards the target in flight. In particular implementations, a targeting aid system may determine whether a barrel portion of a firearm should be reoriented in an upward or downward direction (e.g., along a pitch axis), whether the barrel portion should be reoriented in a side-to-side direction (e.g., along a yaw axis), and/or whether the barrel portion should be reoriented along a roll axis. Also, in particular implementations, audible, visual and/or haptic feedback cues may be provided to the user to indicate a particular determined reorientation. Further, in particular implementations, a targeting aid system may provide post-event processing to allow the user and/or other individual to review, analyze, and/or evaluate a particular event which may, for example, permit the user to improve the user’s shooting technique. Performance improvements gained in this manner may allow a user to hit targets in flight (e.g., launched clay targets, fowl, etc.) with increasing probability.
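By way of a non-limiting illustration only, the following Python sketch shows one way a determined reorientation might be mapped to a coarse feedback cue. The function name, deadband threshold and cue labels are assumptions made for the sketch and are not taken from the disclosed embodiments.

```python
# Hypothetical sketch: map an aim-point error to a simple feedback cue.
# Thresholds, cue names and the select_cue() interface are illustrative only.
from dataclasses import dataclass

@dataclass
class AimError:
    pitch_deg: float  # positive means the barrel should move up
    yaw_deg: float    # positive means the barrel should move right

def select_cue(err: AimError, deadband_deg: float = 0.5) -> str:
    """Return a coarse cue indicating how the user might reorient the barrel."""
    if abs(err.pitch_deg) <= deadband_deg and abs(err.yaw_deg) <= deadband_deg:
        return "on-target"          # e.g., steady tone or green indicator light
    if abs(err.pitch_deg) >= abs(err.yaw_deg):
        return "raise-barrel" if err.pitch_deg > 0 else "lower-barrel"
    return "swing-right" if err.yaw_deg > 0 else "swing-left"

print(select_cue(AimError(pitch_deg=1.8, yaw_deg=-0.3)))  # -> "raise-barrel"
```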
Further, in implementations, feedback may be actionable (e.g., by a user) and/or non-actionable. For example, as mentioned, actionable feedback may include indications of adjustments that may be made with respect to firearm positioning and/or movement and/or of an appropriate time at which the user may articulate a triggering mechanism. Non-actionable feedback may include feedback that may not directly contribute to a user’s shooting performance and/or that may be merely informational, in an implementation. For example, non-actionable feedback may include tracking performance over time, sharing information with one’s teammates or coach, and/or sharing information through social media, to name a few non-limiting examples.
FIG. 1A depicts a diagram 100 illustrating potential use of an embodiment 300 of an example targeting aid device. In an implementation, targeting aid device 300 may be mounted to a barrel of a firearm, such as firearm 107. Firearm 107 may be operated by a user, such as user 105. In implementations, a launcher 110 may launch one or more targets 150, such as a clay pigeon or other expendable target, into the field of view of user 105. In particular implementations, launcher 110 may launch one or more targets 150 responsive to user 105 uttering and/or otherwise indicating a “pull” command and/or the like. In other implementations, launcher 110 may launch one or more targets 150 responsive to an electronic actuator under control of user 105. In some implementations, launcher 110 may be actuated, for example, by way of user 105 actuating a foot pedal.
Responsive to a launching of target 150 via launcher 110, user 105 may align the barrel of firearm 107 so as to track the motion of launched target 150. Thus, in an embodiment, user 105 may initiate tracking of target 150 in flight at a “pickup point” 115. “Pickup point” refers to a point at which a user initially visually acquires a target in flight. In an embodiment, user 105 may continue tracking of target 150 through a “hold point” 120. “Hold point” refers to a point at which user 105 initially visually inserts the barrel(s) of a firearm into the path of a target in flight. “Break point” refers to a point along a trajectory of a target at which a user may activate a trigger mechanism of a firearm to hit the target with one or more projectiles to be discharged from the firearm. “Break point” may also sometimes be referred to as a “kill point” and/or “shoot point.” Additionally, “aim point” refers to wherever the barrel of a firearm is pointing at any given point in time. At break point 125, user 105 may actuate a trigger mechanism of firearm 107 which may effect a discharge of one or more projectiles 130 from the barrel of firearm 107. Responsive to a discharge of firearm 107, which may occur, for example, at break point 125, one or more high-speed projectiles 130 may coincide with launched target 150. Responsive to one or more high-speed projectiles 130 coinciding with launched target 150, target 150 may disintegrate or fragment, for example.
Further, as utilized herein, “aiming action” and/or the like refers to a movement and/or repositioning of a firearm that is meant to change where a projectile or projectile cloud would fly and/or hit if a trigger of the firearm was to be pulled. Also, “handling action” and/or the like in this context refers to an action or movement that modifies the state of a firearm. For example, a handling action may include an action break, reload, pick up from a resting position, lay down to a resting position, trigger pull, etc., to name a few non-limiting examples.
As mentioned, targeting aid device 300 may be mounted to a barrel of firearm 107, for example. As discussed more fully below, in implementations, targeting aid device 300 may record video events via an integrated image sensor. In particular implementations, targeting aid device 300 may transmit wireless signals and/or signal packets representative of image content to one or more external devices such as a smartphone, tablet device and/or computing device. See, for example, external device 250 of FIG. 2. In addition to transmitting wireless signals and/or signal packets representative of image content, targeting aid device 300 may record movements of the barrel(s) of firearm 107. In particular implementations, targeting aid device 300 may record movements in a coordinate plane, such as in a pitch axis 109 as depicted in diagram 100. Further, targeting aid device 300 may record movements in additional axes, such as a yaw axis and/or a roll axis. Various example sensor types, as well as signals and/or signal packets that may be obtained from various sensor types, are discussed more fully below. Various example processes that may be performed on sensor signals and/or signal packets are also described more fully below.
FIG. 1B depicts a diagram 101 illustrating potential use of targeting aid device 300 in a hunting context. Similar in many respects to the example depicted in FIG. 1A, targeting aid device 300 may be utilized to train users with respect to improving shooting techniques in the context of hunting wildlife, including, but not limited to, birds. For example, user 105 may respond to a target 160 (e.g., duck, pheasant, etc.) entering the user’s field of view. User 105 may initiate tracking of target 160 at pickup point 115 and may continue tracking of target 160 through hold point 120. At break point 125, user 105 may actuate the trigger mechanism of firearm 107, resulting in one or more projectiles 130 coinciding with target 160.
Whether in a sport shooting context, such as depicted in FIG. 1A, or in a hunting context, such as depicted in FIG. 1B, a targeting aid system, such as targeting aid device 300, may facilitate feedback, such as audio cues, visual cues, haptic cues, and so forth, which may indicate to a user, such as user 105, adjustments that may be made with respect to positioning and/or movement of a firearm, such as firearm 107, and/or which may indicate to the user an appropriate time at which the user may articulate a triggering mechanism of the firearm, as discussed more fully below.
FIG. 2 is a schematic diagram illustrating an embodiment 200 of an example communications infrastructure including targeting aid device 300. As mentioned, in particular implementations, targeting aid device 300 may facilitate wireless communications with external systems and/or devices. In an implementation, such wireless communication may include real-time or near-real-time transmission of wireless signals and/or signal packets representative of live-action video content. Video content may also be made available for post-event review and/or analysis, for example. As discussed more fully below, such as in connection with FIG. 3, targeting aid device 300 may include a wireless communications interface that may operate via a communications infrastructure which may facilitate, at least in part, a capability for a user and/or other individuals to review, analyze, and/or evaluate the user’s shooting performance and/or technique at a remote location and/or at a later time.
In an implementation, targeting aid device 300 may transmit and/or receive wireless signals and/or signal packets, such as via a wireless communications network. In particular implementations, targeting aid device 300 may communicate with an external device, such as a smartphone, tablet device and/or computing device 250, via a point-to-point WiFi interconnect. For example, targeting aid device 300 may comprise an access point and external device 250 may comprise a wireless-capable device. In other implementations, targeting aid device 300 and external device 250 may communicate via a wireless local-area network (WLAN). For example, targeting aid device 300 and external device 250 may connect to an access point, such as local transceiver 215.
Also, for example, targeting aid device 300 may communicate with a cellular communications network by transmitting wireless signals to and/or receiving wireless signals from one or more cellular transceivers 210 which may comprise a wireless base transceiver subsystem, a Node B and/or an evolved NodeB (eNodeB), for example, over wireless communication link 223. Similarly, targeting aid device 300 may transmit wireless signals to and/or may receive wireless signals from one or more local transceivers 215 over wireless communication link 225. Local transceiver 215 may comprise an access point (AP), femtocell, Home Base Station, small cell base station, Home Node B (HNB) or Home eNodeB (HeNB), for example, and/or may provide access to a WLAN (e.g., IEEE 802.11 network), for example.
As mentioned, targeting aid device 300 may communicate with one or more external devices, such as external device 250. For example, targeting aid device 300 may communicate with one or more external devices such as, for example, smartphone, tablet device and/or computing device 250 via a wireless personal area network (WPAN, e.g., Bluetooth® network), WLAN, and/or a cellular network (e.g., an LTE network or other wireless wide area network, such as those discussed herein), for example. Of course, it should be understood that these are merely examples of devices and/or networks that may communicate with a targeting aid device, such as targeting aid device 300, over a wireless link, and subject matter is not limited in scope in these respects. In particular implementations, cellular transceiver 210, local transceiver 215 and/or satellite 214 may represent touchpoints which may permit targeting aid device 300 to interact with a network 222.
Examples of network technologies that may support wireless communication link 223 may include GSM, Code Division Multiple Access (CDMA), Wideband CDMA (WCDMA), Long Term Evolution (LTE), High Rate Packet Data (HRPD), etc. GSM, WCDMA and/or LTE may comprise technologies defined by 3rd Generation Partnership Project (3GPP). CDMA and/or HRPD may comprise technologies defined by the 3rd Generation Partnership Project 2 (3GPP2). WCDMA may also be part of the Universal Mobile Telecommunications System (UMTS) and/or may be supported by an HNB. Cellular transceivers 210 may comprise deployments of equipment providing subscriber access to a wireless telecommunication network for a service (e.g., under a service contract), for example. In example implementation 200 depicted in FIG. 2, cellular transceivers 210 may perform functions of a cellular base station in servicing subscriber devices within a cell determined based, at least in part, on a range at which the cellular transceivers 210 are capable of providing access service. Examples of radio technologies that may support wireless communication link 225 may include IEEE 802.11, BT, LTE, etc.
In a particular implementation, cellular transceivers 210 and/or local transceivers 215 may communicate with one or more external devices such as, for example, server 240, such as by way of network 222 via communication links 245. Here, network 222 may comprise any combination of wired or wireless links and may include cellular transceiver 210 and/or local transceiver 215 and/or server 240, for example. In a particular implementation, network 222 may comprise Internet Protocol (IP) and/or other infrastructure capable of facilitating communication between targeting aid device 300 and server 240 through local transceiver 215 or cellular transceiver 210. In another implementation, network 222 may comprise a cellular communication network infrastructure such as, for example, a base station controller or packet-based or circuit-based switching center (not shown) to facilitate mobile cellular communication with targeting aid device 300. In a particular implementation, network 222 may comprise local area network (LAN) elements such as WiFi APs, routers and/or bridges and/or may, in some implementations, comprise links to gateway elements that may provide access to wide area networks such as the Internet. In some implementations, network 222 may comprise multiple networks (e.g., one or more wireless networks and/or the Internet). In an implementation, network 222 may include one or more serving gateways and/or Packet Data Network gateways. In addition, one or more servers 240 may comprise an E-SMLC, a Secure User Plane Location (SUPL) Location Platform (SLP), a SUPL Location Center (SLC), a SUPL Positioning Center (SPC), a Position Determining Entity (PDE) and/or a gateway mobile location center (GMLC), each of which may connect to one or more location retrieval functions (LRFs) and/or mobility management entities (MMEs) of network 222, for example.
As mentioned, communications between targeting aid device 300 and cellular transceiver 210, satellite 214, local transceiver 215, smartphone, tablet device and/or computing device 250, etc. may occur utilizing signals communicated across wireless communications channels. Wireless signals and/or signal packets may be modulated to convey messages utilizing one or more techniques such as amplitude modulation, frequency modulation, binary phase shift keying (BPSK), quaternary phase shift keying (QPSK) and/or any of numerous other modulation techniques, and subject matter is not limited in scope in this respect.
In particular implementations, and as discussed below, targeting aid device 300 may comprise circuitry and/or processing resources capable of obtaining location related measurements (e.g., for signals received from GPS or other Satellite Positioning System (SPS) satellites 214), cellular transceiver 210 and/or local transceiver 215 and/or possibly computing a position fix or estimated location of targeting aid device 300 based at least in part on these location related measurements. In some implementations, location related measurements obtained by targeting aid device 300 may be transferred to a location server such as an enhanced serving mobile location center (E-SMLC) or SUPL location platform (SLP) (e.g., which may comprise a server, such as server 240) after which the location server may estimate or determine an estimated location for targeting aid device 300 based at least in part on the measurements. In the presently illustrated example, location related measurements obtained by targeting aid device 300 may include measurements of signals 224 received from satellites belonging to an SPS or Global Navigation Satellite System (GNSS) such as GPS, GLONASS, Galileo or Beidou and/or may include measurements of signals (such as 223 and/or 225) received from terrestrial transmitters fixed at known locations (e.g., such as cellular transceiver 210). Targeting aid device 300 and/or a separate location server may obtain a location estimate for targeting aid device 300 based at least in part on location related measurements using, for example, GNSS, Assisted GNSS (A-GNSS), Advanced Forward Link Trilateration (AFLT), Observed Time Difference Of Arrival (OTDOA) and/or Enhanced Cell ID (E-CID), or combinations thereof. In an implementation, targeting aid device 300 may comprise an Internet-of-Things (IoT) type device. “IoT-type device” and/or the like refers to one or more electronic and/or computing devices capable of leveraging existing Internet and/or like infrastructure as part of the so-called “Internet of Things” or IoT, such as via a variety of applicable protocols, domains, applications, etc. The IoT may typically comprise a system of interconnected and/or internetworked physical devices in which computing may be embedded into hardware so as to facilitate and/or support devices’ ability to acquire, collect, and/or communicate content over one or more communications networks, for example, at times, without human participation and/or interaction.
FIG. 3 is a schematic block diagram depicting an example implementation of targeting aid device 300. Of course, subject matter is not limited in scope to the particular configurations and/or arrangements of components depicted and/or described for example devices mentioned herein. In an implementation, a targeting aid device, such as targeting aid device 300, may comprise one or more processors, such as processor 310, and/or may comprise one or more communications interfaces, such as communications interface 320. In an implementation, communications interface 320 may enable wireless communications between targeting aid device 300 and one or more other external systems and/or devices. For example, communications interface 320 may facilitate, at least in part, wireless communications between targeting aid device 300 and a smartphone, tablet device, laptop computing device, etc. In implementations, wireless communications may occur substantially in accordance with any of a wide range of communication standards and/or protocols, such as those mentioned herein, for example. Targeting aid device 300 may also include one or more batteries, such as battery 370.
In a particular implementation, targeting aid device 300 may include a memory, such as memory 330. In a particular implementation, memory 330 may comprise a non-volatile memory, for example. Further, in a particular implementation, memory 330 may have stored therein executable instructions, such as for one or more operating systems, communications protocols, and/or applications, for example. Memory 330 may further store particular instructions, such as software and/or firmware code 332, for example. In an implementation, software and/or firmware code 332 may be updated via wired and/or wireless communication with one or more external devices and/or systems, for example.
In implementations, targeting aid device 300 may include an image sensor (e.g., camera) 350 and/or may further include one or more other sensors 360. As utilized herein, “sensor” and/or the like refers to a device and/or component that may respond to physical stimulus, such as, for example, heat, light, sound pressure, magnetism, particular motions, etc., and/or that may generate one or more signals and/or signal packets in response to physical stimulus. Example sensors may include, but are not limited to, accelerometers, gyroscopes, thermometers, magnetometers, barometers, light sensors, proximity sensors, heart-rate monitors, perspiration sensors, hydration sensors, breath sensors and/or other biometric sensors, microphones, etc., and/or any combination thereof. Targeting aid device 300, for example, may comprise an inertial motion unit (IMU). In implementations, an IMU may comprise one or more sensors such as, for example, one or more accelerometers, gyroscopes, magnetometers, etc.
Further, in a particular implementation, targeting aid device 300 may comprise a user interface, such as user interface 340. In particular implementations, user interface 340 may comprise a video and/or graphics display, although other implementations may not include a video and/or graphics display. User interface 340 may also include a touch-screen to receive user inputs in some implementations. In particular implementations, user interface 340 may include one or more push-buttons and/or switches and/or the like by which a user, such as user 105, may provide inputs to initiate and/or exit various device states and/or modes of operations, for example. User inputs may further include parameters indicative of one or more characteristics of ammunition being used and/or indicative of one or more characteristics of the particular firearm being used. Additionally, a user may provide inputs via an external device, such as smartphone, tablet device and/or computing device 250, for example. User inputs may also be provided by way of voice commands, in an implementation.
In implementations, user interface 340 may include one or more user feedback mechanisms to provide at least haptic, audible and/or visual feedback to a user, such as user 105. In implementations, real-time, near real-time and/or post-event user feedback may be provided to user 105 to improve the user’s shooting technique. For example, feedback provided to user 105 may be directed to prompting the user to adjust the user’s technique to be more in line with specified and/or common shooting practices. In implementations, user feedback may be provided to prompt the user to make adjustments with respect to gun mount technique, swing and/or target connection quality, shooting method and/or timing of shot routine stages and/or position of particular shot stages (e.g., hold point, break point, etc.) to name a few non-limiting examples. In implementations, feedback may be provided to user 105 via one or more hardware interfaces including, for example, a speaker implemented as part of targeting aid device 300 and/or implemented on an external device (e.g., smartphone, tablet device, laptop computing device, etc.), a wired and/or wireless headset (e.g., Bluetooth-enabled earphone(s)) to be worn by user 105, a graphics and/or video screen implemented as part of targeting aid device 300 and/or implemented as part of an external system or device, a haptic motor implemented as part of targeting aid device 300, etc.
In an implementation, feedback may be provided to a user by way of video and/or still images displayed on targeting aid device 300 and/or on an external device, such as smartphone, tablet device and/or computing device 250. For example, trajectories of a target and/or of a projectile or projectile cloud (e.g., lead shot) may be drawn on a display as overlays. Further, in an implementation, user interface 340 may include one or more indicator lights (e.g., light emitting diodes). See, for example, indicator lights 430 depicted in FIG. 4B.
In particular implementations, targeting aid device 300 may further include one or more timers and/or counters and/or like circuits, for example. In an implementation, one or more timers and/or counters and/or the like may track one or more aspects of device performance and/or operation. Also, in an implementation, processor 310 may synchronize image sensor content and/or other sensor content, such as IMU content. For example, processor 310 may timestamp image sensor content and/or other sensor content, such as IMU content, based at least in part on one or more timers and/or counters.
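As a hedged illustration of the timestamping and synchronization described above, the sketch below tags image frames and IMU samples with a shared monotonic timestamp so they can later be aligned on a common timeline. The clock source, queue structures, callback names and window size are assumptions made for the sketch and are not taken from the disclosure.

```python
# Hypothetical sketch: tag image frames and IMU samples with a shared
# monotonic timestamp so that sensor content can later be aligned in time.
import time
from collections import deque

def now_us() -> int:
    # A software monotonic clock stands in for the device's hardware timer/counter.
    return time.monotonic_ns() // 1000

imu_queue = deque()     # holds (timestamp_us, imu_sample) tuples
frame_queue = deque()   # holds (timestamp_us, frame) tuples

def on_imu_sample(sample):
    imu_queue.append((now_us(), sample))

def on_frame(frame):
    frame_queue.append((now_us(), frame))

def imu_samples_for_frame(frame_ts_us, window_us=10_000):
    """Return IMU samples whose timestamps fall within a window around a frame."""
    return [s for ts, s in imu_queue if abs(ts - frame_ts_us) <= window_us]
```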
Although FIG. 3 depicts a particular example implementation of a targeting aid device, such as targeting aid device 300, including an image sensor, a processor, a memory and/or a user interface, other implementations of a targeting aid device may include a reduced set of capabilities and/or components. For example, in an implementation, a targeting aid device may include an image sensor to obtain image content and a communications interface to transmit the image content to an external device such as, for example, a smartphone, a tablet device and/or a laptop computing device and/or the like. Image content and/or other sensor content may be processed by an external device to determine what feedback to provide to the user and one or more feedback parameters may be transmitted from the external device to the targeting aid device. As discussed more fully below, various operations related to gathering image and/or other sensor content, processing image and/or other sensor content, determining what feedback to provide to a user, and/or providing feedback to the user may be partitioned between a targeting aid device and one or more external devices in a variety of ways.
As utilized herein, “feedback parameter” and/or the like refers to a physical signal, signal packet and/or physical state specifying at least one characteristic of particular feedback to be provided to a user. For example, a feedback parameter determined within a targeting aid device and/or determined by an external device may be communicated to the targeting aid device and the targeting aid device may, responsive to the received feedback parameter, provide the specified feedback to the user. For example, in an implementation, a feedback parameter may comprise one or more signals and/or signal packets specifying a particular audible signal to be provided to a user to prompt the user to move the user’s aim in a particular direction. Also, in an implementation, a feedback parameter may comprise one or more signals and/or signal packets representative of an image to be displayed to the user, wherein the image may depict an actual aim point vs a desired aim point at a particular point in time, for example. Further, for example, a feedback parameter may comprise one or more signals and/or signal packets representative of at least one video frame to be displayed to the user, wherein the at least one video frame may include one or more overlays depicting an actual aim trajectory vs a desired aim trajectory. Of course, these are merely examples of feedback parameters, and subject matter is not limited in scope in these respects.
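Purely as an illustrative sketch, a feedback parameter might be represented and exchanged as a small serializable structure along the following lines; the field names and the JSON encoding are assumptions and are not specified by the disclosure.

```python
# Hypothetical sketch of a feedback parameter as a small signal packet.
# The field names and the JSON wire format are illustrative assumptions.
import json
from dataclasses import dataclass, asdict

@dataclass
class FeedbackParameter:
    kind: str          # "audio", "haptic" or "visual"
    cue: str           # e.g. "swing-left", "fire-now"
    timestamp_us: int  # when the cue should be presented
    duration_ms: int = 100

def encode(param: FeedbackParameter) -> bytes:
    return json.dumps(asdict(param)).encode("utf-8")

def decode(packet: bytes) -> FeedbackParameter:
    return FeedbackParameter(**json.loads(packet.decode("utf-8")))

pkt = encode(FeedbackParameter("audio", "swing-left", 1_234_567, 150))
print(decode(pkt))
```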
FIG. 4A is an illustration depicting a perspective view of an example embodiment of a targeting aid device, such as targeting aid device 300. FIG. 4B depicts a rear face (i.e., user’s perspective) view. As mentioned, a targeting aid device, such as targeting aid device 300, may be mounted to a firearm. In implementations, targeting aid device 300 may be mounted to a barrel of a firearm, such as firearm 107. For example, targeting aid device 300 may comprise a barrel mount 410 that may allow targeting aid device 300 to be fixedly and/or removably mounted to a barrel of a firearm 107. In an implementation, a housing 420 may contain image sensor 350 and/or communications interface 320. Housing 420 may also contain processor 310, memory 330, user interface 340 and/or sensors 360 in some implementations, for example. In an implementation, housing 420 may be slidably attachable to barrel mount 410.
Also, in an implementation, targeting aid device 300 may include one or more indicator lights, such as indicator lights 430. In an implementation, indicator lights 430 may comprise light-emitting diodes (LEDs) that may be positioned on a user-facing surface of housing 420, for example. Indicator lights 430 may provide feedback and/or other information to a user such as battery level and/or modes of operation (e.g., sleep, standby, record, view, etc.), for example. In an implementation, for a right-handed user, indicator lights 430 may be placed at the top left portion of housing 420 so that they may be visible when the user is shooting. Similarly, for a left-handed shooter, indicator lights 430 may be placed at the top right portion of housing 420, for example.
FIG. 5 is a flow diagram depicting an embodiment 500 of a process performed at a targeting aid system and/or device, such as targeting aid device 300, for providing feedback to a user based at least in part on signals and/or signal packets obtained from at least one sensor of the targeting aid system and/or device. Embodiments in accordance with claimed subject matter may include all of blocks 510-540, fewer than blocks 510-540, and/or more than blocks 510-540. Likewise, it should be noted that content acquired or produced, such as, for example, input signals, output signals, operations, results, etc. associated with example process 500 may be represented via one or more analog and/or digital signals and/or signal packets. It should also be appreciated that even though one or more operations are illustrated or described concurrently or with respect to a certain sequence, other sequences or concurrent operations may be employed. In addition, although the description below references particular aspects and/or features illustrated in certain other figures, one or more operations may be performed with other aspects and/or features.
In an implementation, signals and/or signal packets may be obtained from at least one sensor of a targeting aid system and/or device, such as targeting aid device 300, as depicted at block 510. As mentioned, sensors of targeting aid device 300 may include one or more image sensors, accelerometers, gyroscopes, thermometers, magnetometers, barometers, light sensors, proximity sensors, biometric sensors, microphones, IMU, etc., and/or any combination thereof.
As further indicated at block 520, based at least in part on the signals and/or signal packets obtained from the at least one sensor, a direction and speed of a target may be determined, in an implementation. As mentioned, a target may comprise a launched target such as clay pigeon, for example. A target may also comprise a bird, for example. In an implementation, processor 310 of targeting aid device 300 may perform operations to determine the direction and speed of a target. In another implementation, such operations may be performed by an external device, such as smartphone, tablet device and/or computing device 250, for example.
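One simplified way such a determination might be sketched, under the assumption that the target has been detected as a centroid in two timestamped frames and that the image sensor's field of view and resolution are known, is to difference the detections and convert pixel displacement to an angular rate. The function name, field-of-view values and resolution below are illustrative assumptions only.

```python
# Hypothetical sketch: estimate target angular speed and direction from two
# timestamped centroid detections in the image plane.
import math

def angular_velocity(p0, p1, t0_s, t1_s, fov_deg=(60.0, 45.0), res=(1920, 1080)):
    """p0, p1: (x, y) pixel centroids; returns angular rates (deg/s) in x and y."""
    dt = t1_s - t0_s
    if dt <= 0:
        raise ValueError("timestamps must be increasing")
    deg_per_px_x = fov_deg[0] / res[0]
    deg_per_px_y = fov_deg[1] / res[1]
    vx = (p1[0] - p0[0]) * deg_per_px_x / dt
    vy = (p1[1] - p0[1]) * deg_per_px_y / dt
    return vx, vy

vx, vy = angular_velocity((900, 600), (980, 560), 0.000, 0.033)
speed = math.hypot(vx, vy)
# Negate vy so the direction is reported with "up" positive (image y points down).
direction = math.degrees(math.atan2(-vy, vx))
print(f"speed: {speed:.1f} deg/s, direction: {direction:.0f} deg")
```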
Based at least in part on at least one user action relative to the direction and speed of the target, at least one feedback parameter may be determined, as depicted at block 530. In an implementation, the at least one user action relative to the direction and speed of the target may include barrel movement and/or trigger activation, for example. Operations to determine the at least one feedback parameter may be performed by processor 310 of targeting aid device 300, in an implementation. In another implementation, operations to determine the at least one feedback parameter may be performed by an external device, such as a smartphone, tablet device and/or computing device 250, for example.
As additionally indicated at block 540, haptic, audible and/or visual feedback may be provided to a user, such as user 105, at least in part in accordance with the at least one feedback parameter, for example. As mentioned, feedback may be provided to user 105 via one or more hardware interfaces including, for example, a speaker implemented as part of targeting aid device 300 and/or implemented on an external device, a wired and/or wireless headset to be worn by the user, a graphics and/or video screen implemented as part of targeting aid device 300 and/or implemented as part of an external system or device, a haptic motor implemented as part of targeting aid device 300, etc. Of course, these are merely examples of the types of feedback mechanisms that may be implemented and subject matter is not limited in scope in these respects.
As mentioned, feedback provided to user 105, for example, may include real-time, near real-time and/or post-event feedback. Feedback may be directed to improving the user’s shooting technique. In implementations, user feedback may be provided to prompt the user to make adjustments with respect to gun mount technique, swing and/or target connection quality, shooting method and/or timing of shot routine stages and/or position of particular shot stages (e.g., hold point, break point, etc.) to name a few non-limiting examples. In particular implementations, real-time and/or near real-time feedback may indicate to the user whether a barrel of a firearm should be reoriented in an upward or downward direction (e.g., along a pitch axis), whether the barrel portion should be reoriented in a side-to-side direction (e.g., along a yaw axis), and/or whether the barrel portion should be reoriented along a roll axis, for example. Further, in particular implementations, a targeting aid system may facilitate post-event processing to allow a user and/or other individual to review, analyze, and/or evaluate a particular event which may, for example, permit the user to improve the user’s technique. In an implementation, a targeting aid system may comprise a targeting aid device, such as targeting aid device 300. In other implementations, a targeting aid system may include a targeting aid device, such as targeting aid device 300, and may also include one or more external devices, such as smartphone, tablet device and/or computing device 250.
FIG. 6 is a block diagram depicting an embodiment 600 of an example processing pipeline of an example targeting aid system. As mentioned, a targeting aid system may comprise a targeting aid device, such as targeting aid device 300, or may include a targeting aid device, such as targeting aid device 300, and one or more external devices, such as smartphone, tablet device and/or computing device 250. In implementations, the various circuits, operations, blocks, etc. depicted in example 600 may be partitioned in various ways between a targeting aid device and one or more external devices. Also, embodiments in accordance with claimed subject matter may include all of the circuits, operations and/or blocks of example pipeline 600, may include fewer than the circuits, operations and/or blocks of example pipeline 600, and/or may include more than the circuits, operations and/or blocks of example pipeline 600. Likewise, it should be noted that content acquired or produced, such as, for example, input signals, output signals, operations, results, etc. associated with example embodiment 600 may be represented via one or more analog and/or digital signals and/or signal packets. It should also be appreciated that even though one or more operations are illustrated or described concurrently or with respect to a certain sequence, other sequences or concurrent operations may be employed. In addition, although the description below references particular aspects and/or features illustrated in certain other figures, one or more operations may be performed with other aspects and/or features. It may also be noted that circuits, operations and/or blocks of example pipeline 600 may be further understood in the context of discussions below in connection with FIGS. 7-10 and/or table 1, for example.
Generally, with respect to example pipeline 600 and/or with respect to circuits, operations and/or blocks discussed below in connection with FIGS. 7-10 and/or table 1, implementations may be directed at determining: 1) where a shooter was aiming and what the shooter was doing at various and/or key points in time; 2) where a target was flying (e.g., speed and direction of target); 3) what technique elements the shooter should follow (e.g., according at least in part to techniques taught by experienced and/or knowledgeable instructors, and/or as described in handbooks from various shooting organizations) and/or how the shooter’s actions differed from the specified technique elements; and/or 4) what actionable feedback to provide to the shooter to help the shooter more closely align with the specified technique elements.
In an implementation, a targeting aid system may comprise an image sensor 602, an audio encoder/decoder (audio codec) 604 and/or one or more other sensors 606. Also, in an implementation, image sensor 602, audio codec 604 and other sensors 606 may provide signals and/or signal packets to buffers 610, 612 and 614, respectively. Image sensor buffer 610 may provide stored image sensor content to a video encoder 630, for example. Also, in an implementation, audio codec buffer 612 may provide audio content (e.g., obtained by audio codec 604 from a microphone) to an audio encoder 632. Other sensor content may be provided by buffer 614 to a content serializer 634, for example.
In an implementation, a multiplexer 636 may provide signals and/or signal packets from video encoder 630, audio encoder 632 and/or content serializer 634 to a writer block 638. In an implementation, writer block 638 may comprise circuitry to write signals and/or signal packets obtained from video encoder 630, audio encoder 632 and/or content serializer 634 to a storage device, such as a secure digital (SD) card and/or a solid state drive (SSD), for example.
Further, in an implementation, an analytics queue 620 may store image content from buffer 610, audio content from buffer 612 and/or other sensor content from buffer 614. Additionally, analytics queue 620 may provide image content, audio content and/or other sensor content to a pre-processing block 640. “Pre-processing” and/or the like in this context may refer to analysis of sensor content to translate raw sensor content into content having semantic meaning. For example, IMU content (e.g., signals and/or signal packets representative of readings from an accelerometer, gyroscope, magnetometer, etc.) may be pre-processed to determine barrel rotation, horizon leveling and/or barrel pointing up/down. IMU content may also be pre-processed to detect an action break (e.g., to reload), for example. Also, for example, image sensor content (e.g., video content), such as in raw, compressed, monochrome and/or color format, may be pre-processed to detect moving objects. For example, a static background may be subtracted from image content to reveal and/or emphasize a moving object (e.g., target in flight). Further, for example, pre-processing of image content may include classification and/or filtering of objects to detect a target and/or to detect a projectile or projectile cloud. Example pre-processing operations are discussed more fully below.
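For illustration only, a minimal background-subtraction sketch is shown below. It assumes the OpenCV library, which the disclosure does not name, and the area thresholds and parameter values are arbitrary placeholders rather than disclosed values.

```python
# Hypothetical sketch of the background-subtraction pre-processing step,
# assuming OpenCV is available; the disclosure does not specify a library.
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2(history=100, varThreshold=32)

def detect_moving_objects(frame, min_area_px=20, max_area_px=2000):
    """Return bounding boxes of small moving blobs (candidate targets)."""
    mask = subtractor.apply(frame)                       # foreground mask
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # suppress speckle noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = []
    for c in contours:
        area = cv2.contourArea(c)
        if min_area_px <= area <= max_area_px:           # filter by plausible target size
            boxes.append(cv2.boundingRect(c))
    return boxes
```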
Example pipeline 600 may also include fusion blocks 650. “Fusion” in this context may refer to analysis and/or processing of the output of one or more preprocessing blocks to derive higher-order semantic meanings. For example, target speed and/or direction, weather parameters and/or user-provided content regarding firearm and/or projectile ballistics may be “fused” to calculate a projected trajectory of a target (e.g., clay pigeon, skeet, bird, etc.). Also, for example, barrel position, weather parameters and/or user-provided content regarding firearm and/or projectile ballistics may be fused to calculate a projected trajectory of a projectile and/or projectile cloud (e.g., lead shot). Also, as mentioned, a shooter’s technique may be improved via adherence to particular specified principles at particular points during a target’s trajectory. For example, the pickup point, hold point and/or break point may be key aspects of a shooting period. These points, for example, may be calculated by fusing barrel positioning content (e.g., derived from IMU signals and/or signal packets) with target trajectory and/or projectile or projectile cloud trajectory. Example fusion operations are discussed more fully below.
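A hedged sketch of one possible fusion calculation follows. It assumes a constant-velocity target track and a single average shot-cloud speed, and it ignores drag, gravity and wind; the disclosure does not prescribe this model, and all names and numbers are illustrative.

```python
# Hypothetical fusion sketch: given a constant-velocity target track and an
# average shot-cloud speed, solve |p + v*t| = s*t for the time of flight t at
# which the shot could meet the target, giving the lead (aim-ahead) point.
import math
import numpy as np

def predict_intercept(target_pos, target_vel, shot_speed):
    """target_pos (m) and target_vel (m/s) are 3-vectors with the muzzle at the
    origin. Returns (time_of_flight_s, intercept_point) or None if no solution."""
    p = np.asarray(target_pos, float)
    v = np.asarray(target_vel, float)
    # Quadratic in t: (v.v - s^2) t^2 + 2 (p.v) t + p.p = 0
    a = v.dot(v) - shot_speed ** 2
    b = 2.0 * p.dot(v)
    c = p.dot(p)
    if abs(a) < 1e-9:
        return None
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None
    roots = [(-b - math.sqrt(disc)) / (2 * a), (-b + math.sqrt(disc)) / (2 * a)]
    positive = [t for t in roots if t > 0]
    if not positive:
        return None
    tof = min(positive)
    return tof, p + v * tof

result = predict_intercept(target_pos=[30.0, 0.0, 5.0],
                           target_vel=[0.0, 15.0, 2.0],
                           shot_speed=350.0)
if result:
    tof, point = result
    print(f"shot and target could meet after {tof * 1000:.0f} ms at {point.round(1)} m")
```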
In an implementation, fusion blocks 650 may provide content to a user feedback generator 660 and/or to a content serializer 662 (e.g., for storage in an SD/SSD), for example. A storage medium, such as memory 330, SD card and/or SSD, for example, may receive content from a processor, such as processor 310, from writer block 638 and/or from content serializer block 662. In an implementation, such content may comprise a merged stream of video content and a text track and/or may comprise a video file and a separate metadata text file, for example. In an implementation, video and/or text content may be timestamped.
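As an illustrative sketch only, timestamped metadata might be written as a sidecar text file in a line-oriented JSON format such as the following; the file naming, record fields and event labels are assumptions rather than details taken from the disclosure.

```python
# Hypothetical sketch: write fused, timestamped metadata as a sidecar text
# file (JSON Lines) next to the recorded video.
import json

def write_metadata(path, records):
    """records: iterable of dicts, e.g.
    {"ts_us": 1234567, "barrel_pitch_deg": 3.2, "event": "break_point"}"""
    with open(path, "w", encoding="utf-8") as f:
        for rec in records:
            f.write(json.dumps(rec) + "\n")

write_metadata("shot_0001.meta.jsonl", [
    {"ts_us": 1_000_000, "event": "pickup_point"},
    {"ts_us": 1_420_000, "event": "hold_point", "barrel_pitch_deg": 12.5},
    {"ts_us": 1_910_000, "event": "break_point", "trigger": True},
])
```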
Example pipeline 600 also depicts an optional streaming option whereby image content may be provided in real-time or near real-time to a live streaming application that may be executed on an external device, such as a smartphone and/or tablet device, for example.
In particular implementations, operations discussed above in connection with example process 500 and/or example pipeline 600 may be described in terms of a number of logical hardware and/or software layers. For example, FIG. 7 is a flow diagram depicting an embodiment 700 of an example process logically partitioned into a plurality of layers, wherein operations associated with the various layers may be performed at a targeting aid system and/or device, such as targeting aid device 300, and/or at a combination of a targeting aid device and one or more external devices, such as smartphone, tablet device and/or computing device 250. Embodiments in accordance with claimed subject matter may include all of blocks 710-770, fewer than blocks 710-770, and/or more than blocks 710-770. Likewise, it should be noted that content acquired or produced, such as, for example, input signals, output signals, operations, results, etc. associated with example embodiment 700 may be represented via one or more analog and/or digital signals and/or signal packets. It should also be appreciated that even though one or more operations are illustrated or described concurrently or with respect to a certain sequence, other sequences or concurrent operations may be employed. In addition, although the description below references particular aspects and/or features illustrated in certain other figures, one or more operations may be performed with other aspects and/or features.
In an implementation, an example “sensors layer” may include collecting inputs from an image sensor, an IMU and/or other sensors, as depicted at block 710. A sensors layer may also include obtaining inputs from a user, for example. In an implementation, a user may provide inputs with respect to projectile (e.g., ammunition) characteristics and/or firearm characteristics, for example. Also, in an implementation, an example “drivers layer,” as depicted at block 720, may include timestamping and/or synchronizing inputs (e.g., sensor content) and/or may also include placing inputs in a first-in, first-out queue, for example.
Further, in an implementation, an example “pre-processing layer” may include pre-processing inputs and/or translating signals and/or signal packets obtained from one or more sensors, including an image sensor, into higher-order constructs (e.g., camera roll, barrel rotation, action break, etc.), as depicted at block 730. In an implementation, an example “fusion layer” may include fusing together higher-order constructs into system-wide constructs (e.g., target trajectory, projectile trajectory, key shooting points, etc.) as depicted at block 740.
In an implementation, an example “decision layer” may include making decisions based on system-wide constructs, as depicted at block 750. For example, decisions may be made with respect to determining appropriate feedback to provide to a user and/or to which state to transition the device and/or system. Also, as depicted at block 760, an example “user interface (UI) layer” may include providing feedback to a user with respect to ways to modify the user’s activity for improved shooting performance, for example. In an implementation, feedback types may include, by way of non-limiting examples, visual cues, audio cues and/or commands, haptic vibration, etc.
Also, in an implementation, an example “applications/storage layer” may include storing video content and/or corresponding metadata content in a storage medium, as depicted at block 770. Applications/storage layer may also include transmitting video content and/or corresponding metadata content to an external device, such as smartphone, tablet device and/or computing device 250, for later retrieval and/or analysis, for example.
Table 1, provided below, provides additional example detail for particular implementations related to the various logical operational layers discussed above in connection with example process 700, for example. Column A of table 1 indicates a particular layer. Column B of table 1 specifies for particular layers example hardware (HW) circuits and/or components such as may be implemented in a targeting aid device, such as targeting aid device 300, and/or in an external device, such as smartphone, tablet device and/or computing device 250. Column B additionally specifies particular example software/firmware agents and/or particular example operations for particular layers. Column C of table 1 specifies the origins of inputs to be provided for elements specified in column B for particular layers. Of course, the particular hardware circuits and/or components, software and/or firmware agents, operations, etc. mentioned in table 1 are merely examples, and subject matter is not limited in scope in these respects.
Table 1
(Table 1 appears as images in the original publication.)
FIG. 8 is a flow diagram depicting an embodiment 800 for processing image sensor content. Example 800 may comprise a sub-routine with respect to example process 700, in an implementation. As with example process 700, example process 800 may be logically partitioned into a plurality of layers, wherein operations associated with the various layers may be performed at a targeting aid system and/or device, such as targeting aid device 300, and/or at a combination of a targeting aid device and one or more external devices, such as smartphone, tablet device and/or computing device 250. Embodiments in accordance with claimed subject matter may include all of blocks 810-860, fewer than blocks 810-860, and/or more than blocks 810-860. Likewise, it should be noted that content acquired or produced, such as, for example, input signals, output signals, operations, results, etc. associated with example embodiment 800 may be represented via one or more analog and/or digital signals and/or signal packets. It should also be appreciated that even though one or more operations are illustrated or described concurrently or with respect to a certain sequence, other sequences or concurrent operations may be employed. In addition, although the description below references particular aspects and/or features illustrated in certain other figures, one or more operations may be performed with other aspects and/or features.
As depicted at block 810, a sensors layer may include collecting inputs, such as signals and/or signal packets, from an image sensor, for example. A drivers layer, as depicted at block 820, may comprise timestamping and/or synchronizing inputs from one or more sensors, such as an image sensor, in an implementation. Drivers layer may also include placing signals and/or signal packets representative of inputs in a first-in, first-out queue, for example.
In an implementation, a pre-processing layer depicted at block 830 may include pre-processing inputs, such as image content obtained from an image sensor, to separate a target from static background image content. Also, for example, the pre-processing layer may include calculating a predicted trajectory of a target in 3D space (e.g., relative to the image sensor’s position). A fusion layer, as depicted at block 840, may include fusing a target trajectory with a projectile or projectile cloud trajectory to predict whether the two trajectories will meet if a trigger mechanism of a firearm is activated at a particular moment. Additionally, similar to example 700, a decision layer, as depicted at block 850, may include determining what feedback to provide to the user, and a user interface layer, as depicted at block 860, may include providing feedback to the user.
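By way of non-limiting illustration, the sketch below shows one simple approach to the pre-processing operations just described: differencing a frame against static background content to isolate a target and fitting constant-velocity motion to observed target positions to extrapolate a trajectory. For simplicity the example operates in the image plane rather than in 3D space, and the function names and threshold are assumptions rather than features of the disclosed device.

```python
# Illustrative sketch using frame differencing and a linear fit; the present
# disclosure does not prescribe a particular algorithm.
import numpy as np

def target_mask(frame, background, thresh=25.0):
    """Separate a moving target from static background content by differencing."""
    return np.abs(frame.astype(float) - background.astype(float)) > thresh

def centroid(mask):
    """Return the (x, y) centroid of target pixels, or None if none are found."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return np.array([xs.mean(), ys.mean()])

def predict_position(centroids, times, t_future):
    """Fit constant-velocity motion to observed centroids and extrapolate."""
    pts, ts = np.array(centroids), np.array(times)
    vx, x0 = np.polyfit(ts, pts[:, 0], 1)
    vy, y0 = np.polyfit(ts, pts[:, 1], 1)
    return np.array([vx * t_future + x0, vy * t_future + y0])

# Synthetic example: a small "target" moving left to right across an empty scene.
bg = np.zeros((64, 64))
f1 = bg.copy(); f1[10:14, 10:14] = 255.0   # frame at t = 0.0 s
f2 = bg.copy(); f2[10:14, 20:24] = 255.0   # frame at t = 0.1 s
c1, c2 = centroid(target_mask(f1, bg)), centroid(target_mask(f2, bg))
print(predict_position([c1, c2], [0.0, 0.1], 0.2))   # extrapolated position at t = 0.2 s
```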
FIG. 9 is a flow diagram depicting an embodiment 900 for processing IMU content. Process 900 may comprise a sub-routine with respect to example process 700, for example. As with example process 700 and/or example process 800, example process 900 may be logically partitioned into a plurality of layers, wherein operations associated with the various layers may be performed at a targeting aid system and/or device, such as targeting aid device 300, and/or at a combination of a targeting aid device and one or more external devices, such as smartphone, tablet device and/or computing device 250. Embodiments in accordance with claimed subject matter may include all of blocks 910-960, fewer than blocks 910-960, and/or more than blocks 910-960. Likewise, it should be noted that content acquired or produced, such as, for example, input signals, output signals, operations, results, etc. associated with example embodiment 900 may be represented via one or more analog and/or digital signals and/or signal packets. It should also be appreciated that even though one or more operations are illustrated or described concurrently or with respect to a certain sequence, other sequences or concurrent operations may be employed. In addition, although the description below references particular aspects and/or features illustrated in certain other figures, one or more operations may be performed with other aspects and/or features.
As depicted at block 910, a sensors layer may include collecting inputs, such as signals and/or signal packets, from an inertial motion unit (IMU), for example. A drivers layer, as depicted at block 920, may comprise timestamping and/or synchronizing inputs from one or more sensors, such as an IMU, in an implementation. Drivers layer may also include placing signals and/or signal packets representative of inputs in a first-in, first-out queue.
In an implementation, a pre-processing layer, such as depicted at block 930, may comprise pre-processing inputs, such as IMU content, to estimate barrel orientation (e.g., derived at least in part from gyroscope readings), rotation (e.g., derived at least in part from accelerometer readings) and/or position (e.g., derived at least in part from magnetometer readings). Block 940 depicts a fusion layer that may include, for example, fusing barrel orientation, rotation and/or position with projectile speed to calculate a projectile or projectile cloud trajectory and/or to predict whether and/or where the target trajectory and projectile trajectory will meet. Additionally, similar to examples 700 and 800, a decision layer depicted at block 950 may include determining what feedback to provide to the user, and a user interface layer depicted at block 960 may include providing feedback to the user.
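As a non-limiting illustration of the fusion layer described above, the following sketch fuses a barrel direction and muzzle velocity (yielding a simplified projectile trajectory) with a target position and velocity, and estimates whether and when the two constant-velocity paths come closest. Gravity, drag, and shot-cloud spread are ignored, and all values and names are assumptions made for illustration only.

```python
# A minimal sketch of the fusion step under simplifying assumptions
# (constant velocities, no gravity or drag).
import numpy as np

def closest_approach(p_proj, v_proj, p_tgt, v_tgt):
    """Return (time, distance) of closest approach of two constant-velocity paths."""
    dp = np.asarray(p_tgt, float) - np.asarray(p_proj, float)
    dv = np.asarray(v_tgt, float) - np.asarray(v_proj, float)
    denom = dv @ dv
    t = 0.0 if denom == 0 else max(0.0, -(dp @ dv) / denom)
    miss = dp + dv * t
    return t, float(np.linalg.norm(miss))

# Barrel pointing along +x with a 400 m/s muzzle velocity; target crossing at 20 m/s.
t, d = closest_approach([0, 0, 0], [400, 0, 0], [40, 2, 0], [0, 20, 0])
print(f"closest approach after {t:.3f} s at {d:.2f} m")
```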
FIG. 10 is a block diagram depicting an example state diagram 1000 for an example targeting aid system and/or device, such as targeting aid device 300. In an implementation, targeting aid device 300 may have several distinct states. Transitions between states may be initiated by particular events and/or conditions. Characteristics of particular states may be considered in conjunction with example pipeline 600 depicted in FIG. 6, discussed above, for example.
In an implementation, a transition from an initial “off” state 1010 to a “deep sleep” state 1020 may occur responsive, at least in part, to a user input such as a button press and/or voice command, for example. Further, for example, a transition from deep sleep state 1020 to a “view” state 1030 may occur responsive at least in part to a closing of an action of a firearm (e.g., shotgun). Similarly, for example, an opening of the firearm’s action may initiate a transition from view state 1030 back to deep sleep state 1020. In an implementation, view state 1030 may include pre-buffering of video frames and/or pre-buffering of signals and/or signal packets obtained from one or more sensors such as, for example, an IMU, microphone, etc. “Pre-buffering” in this context refers to storing signals and/or signal packets prior to issuance of a “pull” command (or some other command and/or input that may be interpreted to be similar to “pull”) from a user, for example.
Referring to example pipeline 600 depicted in FIG. 6, view state 1030 may include buffering content from image sensor 602, audio codec 604 and/or other sensors 606 in buffers 610, 612 and/or 614, respectively, for example. In an implementation, buffers 610, 612 and/or 614 may store a specified amount of sensor and/or image content. For example, buffers 610, 612 and/or 614 may store “n” seconds worth of image content, audio content and/or other sensor content. Also, in an implementation, buffers 610, 612 and/or 614 may comprise first-in, first-out buffers wherein oldest content may be replaced as the buffers become full, for example. As mentioned, view state 1030 may include maintaining n seconds of image and/or sensor content, in an implementation. Also, in an implementation, a transition from view state 1030 to a “record” state may occur responsive at least in part to a “pull” command and/or the like uttered and/or otherwise indicated by a user. In the event that a pull command and/or the like is not uttered and/or otherwise indicated by the user and/or in the event that targeting aid device 300 fails to detect the pull command, targeting aid device 300 may transition to a “process” state 1050, discussed below, responsive at least in part to a shot being fired. Because n seconds of content may be stored prior to a pull command, recording of video content, audio content and/or other sensor content during record state 1040 may begin n seconds prior to the pull command, in an implementation.
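By way of non-limiting illustration, the following sketch shows one way such pre-buffering might be realized: a first-in, first-out buffer that retains roughly the most recent n seconds of timestamped content, so that a snapshot taken on a “pull” command reaches back n seconds. The class and method names are assumptions made for illustration only.

```python
# A sketch of pre-buffering behavior: keep roughly the last `seconds` worth of
# timestamped items, dropping the oldest entries as the window slides forward.
import time
from collections import deque

class PreBuffer:
    """Retain approximately the most recent `seconds` of content (FIFO)."""
    def __init__(self, seconds: float):
        self.seconds = seconds
        self.items: deque = deque()

    def push(self, item) -> None:
        now = time.monotonic()
        self.items.append((now, item))
        # Drop the oldest entries once they fall outside the window.
        while self.items and now - self.items[0][0] > self.seconds:
            self.items.popleft()

    def snapshot(self) -> list:
        """Content handed to the record/process stages on a 'pull' command."""
        return [item for _, item in self.items]

buf = PreBuffer(seconds=3.0)
for frame_index in range(5):
    buf.push(f"frame-{frame_index}")
print(buf.snapshot())
```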
In an implementation, during view state 1030, video content may be streamed to an external device, such as smartphone, tablet device and/or computing device 250, for example. Communications between targeting aid device 300 and external device 250 are discussed above, such as in connection with FIG. 2. Pipeline 600 depicted in FIG. 6 also depicts streaming image content via a live streaming app 670 to a smartphone and/or tablet device, for example.
In an implementation, in record state 1040, image content, audio content and/or other sensor content, such as may be processed by video encoder 630, audio encoder 632 and/or content serializer 634, for example, may be written to a storage medium such as an SD card and/or SSD, for example, referring again to example pipeline 600 depicted in FIG. 6. Further, in record state 1040, for example, image content, audio content and/or other sensor content may be stored in analytics queue 620. Thus, upon transitioning to process state 1050 responsive at least in part to a detection of a shot fired, content stored in analytics queue 620 and/or buffers 610, 612 and/or 614 may be provided to pre-processing blocks 640, for example. In this manner, for example, image content, audio content and/or other sensor content going back n seconds prior to a pull command and/or the like may be available for processing, in an implementation.
Also, in an implementation, a transition from record state 1040 back to view state 1030 may occur responsive at least in part to a timeout (e.g., a shot is not detected within a specified period of time). Further, for example, a transition from record state 1040 to deep sleep state 1020 may occur responsive at least in part to an opening of the firearm action (e.g., indicating an end to a current shooting session).
As mentioned, a transition from record state 1040 to process state 1050 may occur responsive at least in part to a shot fired, for example. Referring again to example pipeline 600 depicted in FIG. 6, process state 1050 may involve pre-processing blocks 640, fusion blocks 650 and/or user feedback generator 660, for example. Also, during process state 1050, image content, audio content and/or other sensor content may continue to be written by writer block 638 to a storage medium, such as an SD card and/or an SSD, for example. Also, streaming of image content may continue via live stream app 670, for example.
In an implementation, a transition from process state 1050 to post-process state 1060 may occur responsive at least in part to a specified amount of time having passed since a most recently-detected shot. In post-process state 1060, recording of image content, audio content and/or other sensor content may cease. Referring again to example pipeline 600, post-process state 1060 may involve buffers 610, 612 and/or 614, analytics queue 620, pre-processing blocks 640, fusion blocks 650, user feedback generator block 660 and/or content serializer 662, for example. Also, in an implementation, a transition from post-process state 1060 to view state 1030 may occur responsive at least in part to completion of post-processing operations, for example. In an implementation, post-processing operations may include processing (e.g., via pre-processing blocks 640, fusion blocks 650 and/or user feedback generator block 660) content stored in analytics queue 620. Upon completion of such processing, post-process state 1060 may complete and the targeting aid device may automatically transition to view state 1030.
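The state transitions discussed above in connection with FIG. 10 may be summarized, for purposes of illustration only, as a simple transition table such as in the following sketch. Event and state names are paraphrased from the description above; an actual implementation would typically add further events, timeouts, and error handling.

```python
# A compact, illustrative transition table mirroring the described states and events.
TRANSITIONS = {
    ("off", "button_press"): "deep_sleep",
    ("deep_sleep", "action_closed"): "view",
    ("view", "action_opened"): "deep_sleep",
    ("view", "pull_command"): "record",
    ("view", "shot_detected"): "process",     # pull not uttered or not detected
    ("record", "shot_detected"): "process",
    ("record", "timeout"): "view",
    ("record", "action_opened"): "deep_sleep",
    ("process", "quiet_period_elapsed"): "post_process",
    ("post_process", "post_processing_done"): "view",
}

def next_state(state: str, event: str) -> str:
    """Return the next state, or remain in the current state if the event does not apply."""
    return TRANSITIONS.get((state, event), state)

state = "off"
for event in ("button_press", "action_closed", "pull_command", "shot_detected",
              "quiet_period_elapsed", "post_processing_done"):
    state = next_state(state, event)
    print(event, "->", state)
```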
As discussed previously, such as in connection with FIG. 5, for example, feedback parameters may be determined based at least in part on an action by a user. In implementations, an action by a user, such as user 105, may include positioning and/or moving a barrel of a firearm, such as firearm 107. FIG. 11 is a flow diagram depicting an embodiment 1100 of an example process for determining barrel movement. Operations associated with example process 1100 may be performed at a targeting aid system and/or device, such as targeting aid device 300, and/or at one or more external devices, such as smartphone, tablet device and/or computing device 250. Embodiments in accordance with claimed subject matter may include all of blocks 1110-1150, fewer than blocks 1110-1150, and/or more than blocks 1110-1150. Likewise, it should be noted that content acquired or produced, such as, for example, input signals, output signals, operations, results, etc. associated with example embodiment 1100 may be represented via one or more analog and/or digital signals and/or signal packets. It should also be appreciated that even though one or more operations are illustrated or described concurrently or with respect to a certain sequence, other sequences or concurrent operations may be employed. In addition, although the description below references particular aspects and/or features illustrated in certain other figures, one or more operations may be performed with other aspects and/or features.
To determine barrel movement, in an implementation, inputs in the form of signals and/or signal packets may be obtained from an image sensor of a targeting aid device, such as targeting aid device 300, as indicated at block 1110. Further, as depicted at block 1120, a determination may be made as to whether permanent and/or static background features (e.g., trees) can be discerned in one or more video frames obtained via the image sensor inputs.
In an implementation, for circumstances wherein permanent and/or static background features can be discerned from video frames obtained via the image sensor inputs, barrel movement may be determined based at least in part on the image sensor inputs, as indicated at block 1130. For example, when processing two consecutive frames of video, if a particular permanent and/or static background object, such as a tree, is noted to shift position from one frame to the next, it may be determined that the image sensor is moving. In an implementation, because targeting aid device 300 may be mounted to the barrel of the firearm, movement of the image sensor may correlate with movement of the barrel. At least in part by detecting how the permanent and/or static objects shift position from one frame to a subsequent frame, a particular movement of the barrel of the firearm may be calculated, for example.
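By way of non-limiting illustration, one plausible way to measure how static background features shift between consecutive frames is phase correlation, sketched below. The present disclosure does not prescribe this particular technique; the implementation details and values are assumptions made for illustration.

```python
# Illustrative sketch: estimate the frame-to-frame translation (in pixels) caused
# by image sensor movement, using phase correlation over the whole frame.
import numpy as np

def frame_shift(prev, curr):
    """Estimate the (dy, dx) pixel translation of `curr` relative to `prev`."""
    f1, f2 = np.fft.fft2(prev.astype(float)), np.fft.fft2(curr.astype(float))
    cross = np.conj(f1) * f2
    cross /= np.abs(cross) + 1e-9              # normalized cross-power spectrum
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > prev.shape[0] // 2:                # wrap large indices to negative shifts
        dy -= prev.shape[0]
    if dx > prev.shape[1] // 2:
        dx -= prev.shape[1]
    return int(dy), int(dx)

rng = np.random.default_rng(0)
prev = rng.random((64, 64))                      # textured static background
curr = np.roll(prev, shift=(0, 5), axis=(0, 1))  # image sensor (barrel) panned 5 pixels
print(frame_shift(prev, curr))                   # -> (0, 5)
```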
Also, in an implementation, for circumstances in which permanent and/or static objects may not be visible and/or otherwise discernable from analysis of video frames obtained via the image sensor inputs, barrel movement may be determined based at least in part on inputs obtained from an IMU of targeting aid device 300. For example, as indicated at block 1140, inputs (e.g., readings in the form of signals and/or signal packets) may be obtained from the IMU. In an implementation, an IMU may provide readings related to linear acceleration and/or angular speed, for example. As further indicated at block 1150, barrel movement may be determined based at least in part on the linear acceleration and/or angular speed readings obtained from the IMU, for example.
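As a non-limiting illustration of the operation indicated at block 1150, the sketch below integrates gyroscope angular-rate readings over a short interval to estimate how far the barrel swept. The sample rate, values, and names are assumptions; a fuller implementation might also fuse accelerometer and/or magnetometer readings as discussed above.

```python
# A minimal sketch, assuming the IMU supplies angular-rate samples at a known rate:
# integrating the readings yields an estimate of total barrel rotation.
import numpy as np

def barrel_sweep_deg(gyro_samples_dps: np.ndarray, sample_rate_hz: float) -> np.ndarray:
    """Integrate angular-rate samples (deg/s, shape (N, 3)) into total rotation (deg)."""
    dt = 1.0 / sample_rate_hz
    return gyro_samples_dps.sum(axis=0) * dt      # simple rectangular integration

# 0.5 s of readings at 200 Hz: a steady 30 deg/s swing about the vertical axis.
samples = np.tile(np.array([0.0, 30.0, 0.0]), (100, 1))
print(barrel_sweep_deg(samples, 200.0))           # approximately [0, 15, 0] degrees
```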
In the context of the present patent application, the term “connection,” the term “component” and/or similar terms are intended to be physical, but are not necessarily always tangible. Whether or not these terms refer to tangible subject matter, thus, may vary in a particular context of usage. As an example, a tangible connection and/or tangible connection path may be made, such as by a tangible, electrical connection, such as an electrically conductive path comprising metal or other conductor, that is able to conduct electrical current between two tangible components. Likewise, a tangible connection path may be at least partially affected and/or controlled, such that, as is typical, a tangible connection path may be open or closed, at times resulting from influence of one or more externally derived signals, such as external currents and/or voltages, such as for an electrical switch. Non-limiting illustrations of an electrical switch include a transistor, a diode, etc. However, a “connection” and/or “component,” in a particular context of usage, likewise, although physical, can also be non-tangible, such as a connection between a client and a server over a network, particularly a wireless network, which generally refers to the ability for the client and server to transmit, receive, and/or exchange communications, as discussed in more detail later.
In a particular context of usage, such as a particular context in which tangible components are being discussed, therefore, the terms “coupled” and “connected” are used in a manner so that the terms are not synonymous. Similar terms may also be used in a manner in which a similar intention is exhibited. Thus, “connected” is used to indicate that two or more tangible components and/or the like, for example, are tangibly in direct physical contact. Thus, using the previous example, two tangible components that are electrically connected are physically connected via a tangible electrical connection, as previously discussed. However, “coupled” is used to mean that potentially two or more tangible components are tangibly in direct physical contact. Nonetheless, “coupled” is also used to mean that two or more tangible components and/or the like are not necessarily tangibly in direct physical contact, but are able to co-operate, liaise, and/or interact, such as, for example, by being “optically coupled.” Likewise, the term “coupled” is also understood to mean indirectly connected. It is further noted, in the context of the present patent application, since memory, such as a memory component and/or memory states, is intended to be non-transitory, the term physical, at least if used in relation to memory necessarily implies that such memory components and/or memory states, continuing with the example, are tangible.
Additionally, in the present patent application, in a particular context of usage, such as a situation in which tangible components (and/or similarly, tangible materials) are being discussed, a distinction exists between being “on” and being “over.” As an example, deposition of a substance “on” a substrate refers to a deposition involving direct physical and tangible contact without an intermediary, such as an intermediary substance, between the substance deposited and the substrate in this latter example; nonetheless, deposition “over” a substrate, while understood to potentially include deposition “on” a substrate (since being “on” may also accurately be described as being “over”), is understood to include a situation in which one or more intermediaries, such as one or more intermediary substances, are present between the substance deposited and the substrate so that the substance deposited is not necessarily in direct physical and tangible contact with the substrate.
A similar distinction is made in an appropriate particular context of usage, such as in which tangible materials and/or tangible components are discussed, between being “beneath” and being “under.” While “beneath,” in such a particular context of usage, is intended to necessarily imply physical and tangible contact (similar to “on,” as just described), “under” potentially includes a situation in which there is direct physical and tangible contact, but does not necessarily imply direct physical and tangible contact, such as if one or more intermediaries, such as one or more intermediary substances, are present. Thus, “on” is understood to mean “immediately over” and “beneath” is understood to mean “immediately under.”
It is likewise appreciated that terms such as “over” and “under” are understood in a similar manner as the terms “up,” “down,” “top,” “bottom,” and so on, previously mentioned. These terms may be used to facilitate discussion, but are not intended to necessarily restrict scope of claimed subject matter. For example, the term “over,” as an example, is not meant to suggest that claim scope is limited to only situations in which an embodiment is right side up, such as in comparison with the embodiment being upside down, for example. An example includes a flip chip, as one illustration, in which, for example, orientation at various times (e.g., during fabrication) may not necessarily correspond to orientation of a final product. Thus, if an object, as an example, is within applicable claim scope in a particular orientation, such as upside down, as one example, likewise, it is intended that the latter also be interpreted to be included within applicable claim scope in another orientation, such as right side up, again, as an example, and vice-versa, even if applicable literal claim language has the potential to be interpreted otherwise. Of course, again, as always has been the case in the specification of a patent application, particular context of description and/or usage provides helpful guidance regarding reasonable inferences to be drawn.
Unless otherwise indicated, in the context of the present patent application, the term “or” if used to associate a list, such as A, B, or C, is intended to mean A, B, and C, here used in the inclusive sense, as well as A, B, or C, here used in the exclusive sense. With this understanding, “and” is used in the inclusive sense and intended to mean A, B, and C; whereas “and/or” can be used in an abundance of caution to make clear that all of the foregoing meanings are intended, although such usage is not required. In addition, the term “one or more” and/or similar terms is used to describe any feature, structure, characteristic, and/or the like in the singular, “and/or” is also used to describe a plurality and/or some other combination of features, structures, characteristics, and/or the like. Likewise, the term “based on” and/or similar terms are understood as not necessarily intending to convey an exhaustive list of factors, but to allow for existence of additional factors not necessarily expressly described.
Furthermore, it is intended, for a situation that relates to implementation of claimed subject matter and is subject to testing, measurement, and/or specification regarding degree, that the particular situation be understood in the following manner. As an example, in a given situation, assume a value of a physical property is to be measured. If alternatively reasonable approaches to testing, measurement, and/or specification regarding degree, at least with respect to the property, continuing with the example, is reasonably likely to occur to one of ordinary skill, at least for implementation purposes, claimed subject matter is intended to cover those alternatively reasonable approaches unless otherwise expressly indicated. As an example, if a plot of measurements over a region is produced and implementation of claimed subject matter refers to employing a measurement of slope over the region, but a variety of reasonable and alternative techniques to estimate the slope over that region exist, claimed subject matter is intended to cover those reasonable alternative techniques unless otherwise expressly indicated.
To the extent claimed subject matter is related to one or more particular measurements, such as with regard to physical manifestations capable of being measured physically, such as, without limit, temperature, pressure, voltage, current, electromagnetic radiation, etc., it is believed that claimed subject matter does not fall within the abstract idea judicial exception to statutory subject matter. Rather, it is asserted, that physical measurements are not mental steps and, likewise, are not abstract ideas.
It is noted, nonetheless, that a typical measurement model employed is that one or more measurements may respectively comprise a sum of at least two components. Thus, for a given measurement, for example, one component may comprise a deterministic component, which in an ideal sense, may comprise a physical value (e.g., sought via one or more measurements), often in the form of one or more signals, signal samples and/or states, and one component may comprise a random component, which may have a variety of sources that may be challenging to quantify. At times, for example, lack of measurement precision may affect a given measurement. Thus, for claimed subject matter, a statistical or stochastic model may be used in addition to a deterministic model as an approach to identification and/or prediction regarding one or more measurement values that may relate to claimed subject matter.
It is further noted that the terms “type” and/or “like,” if used, such as with a feature, structure, characteristic, and/or the like, using “optical” or “electrical” as simple examples, means at least partially of and/or relating to the feature, structure, characteristic, and/or the like in such a way that presence of minor variations, even variations that might otherwise not be considered fully consistent with the feature, structure, characteristic, and/or the like, do not in general prevent the feature, structure, characteristic, and/or the like from being of a “type” and/or being “like,” (such as being an “optical-type” or being “optical-like,” for example) if the minor variations are sufficiently minor so that the feature, structure, characteristic, and/or the like would still be considered to be substantially present with such variations also present. Thus, continuing with this example, the terms optical-type and/or optical-like properties are necessarily intended to include optical properties. Likewise, the terms electrical-type and/or electrical-like properties, as another example, are necessarily intended to include electrical properties. It should be noted that the specification of the present patent application merely provides one or more illustrative examples and claimed subject matter is intended to not be limited to one or more illustrative examples; however, again, as has always been the case with respect to the specification of a patent application, particular context of description and/or usage provides helpful guidance regarding reasonable inferences to be drawn.
With advances in technology, it has become more typical to employ distributed computing and/or communication approaches in which portions of a process, such as signal processing of signal samples, for example, may be allocated among various devices, including one or more client devices and/or one or more server devices, via a computing and/or communications network, for example. A network may comprise two or more devices, such as network devices and/or computing devices, and/or may couple devices, such as network devices and/or computing devices, so that signal communications, such as in the form of signal packets and/or signal frames (e.g., comprising one or more signal samples), for example, may be exchanged, such as between a server device and/or a client device, as well as other types of devices, including between wired and/or wireless devices coupled via a wired and/or wireless network, for example.
In the context of the present patent application, the term network device refers to any device capable of communicating via and/or as part of a network and may comprise a computing device. While network devices may be capable of communicating signals (e.g., signal packets and/or frames), such as via a wired and/or wireless network, they may also be capable of performing operations associated with a computing device, such as arithmetic and/or logic operations, processing and/or storing operations (e.g., storing signal samples), such as in memory as tangible, physical memory states, and/or may, for example, operate as a server device and/or a client device in various embodiments. Network devices capable of operating as a server device, a client device and/or otherwise, may include, as examples, dedicated rack-mounted servers, desktop computers, laptop computers, set top boxes, tablets, netbooks, smart phones, wearable devices, integrated devices combining two or more features of the foregoing devices, and/or the like, or any combination thereof. As mentioned, signal packets and/or frames, for example, may be exchanged, such as between a server device and/or a client device, as well as other types of devices, including between wired and/or wireless devices coupled via a wired and/or wireless network, for example, or any combination thereof. It is noted that the terms, server, server device, server computing device, server computing platform and/or similar terms are used interchangeably. Similarly, the terms client, client device, client computing device, client computing platform and/or similar terms are also used interchangeably. While in some instances, for ease of description, these terms may be used in the singular, such as by referring to a “client device” or a “server device,” the description is intended to encompass one or more client devices and/or one or more server devices, as appropriate. Along similar lines, references to a “database” are understood to mean, one or more databases and/or portions thereof, as appropriate.
It should be understood that for ease of description, a network device (also referred to as a networking device) may be embodied and/or described in terms of a computing device and vice-versa. However, it should further be understood that this description should in no way be construed so that claimed subject matter is limited to one embodiment, such as only a computing device and/or only a network device, but, instead, may be embodied as a variety of devices or combinations thereof, including, for example, one or more illustrative examples.
A network may also include now known, and/or to be later developed arrangements, derivatives, and/or improvements, including, for example, past, present and/or future mass storage, such as network attached storage (NAS), a storage area network (SAN), and/or other forms of device readable media, for example. A network may include a portion of the Internet, one or more local area networks (LANs), one or more wide area networks (WANs), wire-line type connections, wireless type connections, other connections, or any combination thereof. Thus, a network may be worldwide in scope and/or extent. Likewise, sub-networks, such as may employ differing architectures and/or may be substantially compliant and/or substantially compatible with differing protocols, such as network computing and/or communications protocols (e.g., network protocols), may interoperate within a larger network.
In the context of the present patent application, the term sub-network and/or similar terms, if used, for example, with respect to a network, refers to the network and/or a part thereof. Sub-networks may also comprise links, such as physical links, connecting and/or coupling nodes, so as to be capable to communicate signal packets and/or frames between devices of particular nodes, including via wired links, wireless links, or combinations thereof. Various types of devices, such as network devices and/or computing devices, may be made available so that device interoperability is enabled and/or, in at least some instances, may be transparent. In the context of the present patent application, the term “transparent,” if used with respect to devices of a network, refers to devices communicating via the network in which the devices are able to communicate via one or more intermediate devices, such as one or more intermediate nodes, but without the communicating devices necessarily specifying the one or more intermediate nodes and/or the one or more intermediate devices of the one or more intermediate nodes and/or, thus, may include within the network the devices communicating via the one or more intermediate nodes and/or the one or more intermediate devices of the one or more intermediate nodes, but may engage in signal communications as if such intermediate nodes and/or intermediate devices are not necessarily involved. For example, a router may provide a link and/or connection between otherwise separate and/or independent LANs.
In the context of the present patent application, a “private network” refers to a particular, limited set of devices, such as network devices and/or computing devices, able to communicate with other devices, such as network devices and/or computing devices, in the particular, limited set, such as via signal packet and/or signal frame communications, for example, without a need for re-routing and/or redirecting signal communications. A private network may comprise a stand-alone network; however, a private network may also comprise a subset of a larger network, such as, for example, without limitation, all or a portion of the Internet. Thus, for example, a private network “in the cloud” may refer to a private network that comprises a subset of the Internet. Although signal packet and/or frame communications (e.g. signal communications) may employ intermediate devices of intermediate nodes to exchange signal packets and/or signal frames, those intermediate devices may not necessarily be included in the private network by not being a source or designated destination for one or more signal packets and/or signal frames, for example. It is understood in the context of the present patent application that a private network may direct outgoing signal communications to devices not in the private network, but devices outside the private network may not necessarily be able to direct inbound signal communications to devices included in the private network.
The Internet refers to a decentralized global network of interoperable networks that comply with the Internet Protocol (IP). It is noted that there are several versions of the Internet Protocol. The term Internet Protocol, IP, and/or similar terms are intended to refer to any version, now known and/or to be later developed. The Internet includes local area networks (LANs), wide area networks (WANs), wireless networks, and/or long haul public networks that, for example, may allow signal packets and/or frames to be communicated between LANs. The term World Wide Web (WWW or Web) and/or similar terms may also be used, although it refers to a part of the Internet that complies with the Hypertext Transfer Protocol (HTTP). For example, network devices may engage in an HTTP session through an exchange of appropriately substantially compatible and/or substantially compliant signal packets and/or frames. It is noted that there are several versions of the Hypertext Transfer Protocol. The term Hypertext Transfer Protocol, HTTP, and/or similar terms are intended to refer to any version, now known and/or to be later developed. It is likewise noted that in various places in this document substitution of the term Internet with the term World Wide Web (“Web”) may be made without a significant departure in meaning and may, therefore, also be understood in that manner if the statement would remain correct with such a substitution.
Although claimed subject matter is not in particular limited in scope to the Internet and/or to the Web; nonetheless, the Internet and/or the Web may without limitation provide a useful example of an embodiment at least for purposes of illustration. As indicated, the Internet and/or the Web may comprise a worldwide system of interoperable networks, including interoperable devices within those networks. The Internet and/or Web has evolved to a public, self-sustaining facility accessible to potentially billions of people or more worldwide. Also, in an embodiment, and as mentioned above, the terms “WWW” and/or “Web” refer to a part of the Internet that complies with the Hypertext Transfer Protocol. The Internet and/or the Web, therefore, in the context of the present patent application, may comprise a service that organizes stored digital content, such as, for example, text, images, video, etc., through the use of hypermedia, for example. It is noted that a network, such as the Internet and/or Web, may be employed to store electronic files and/or electronic documents.
The term electronic file and/or the term electronic document are used throughout this document to refer to a set of stored memory states and/or a set of physical signals associated in a manner so as to thereby at least logically form a file (e.g., electronic) and/or an electronic document. That is, it is not meant to implicitly reference a particular syntax, format and/or approach used, for example, with respect to a set of associated memory states and/or a set of associated physical signals. If a particular type of file storage format and/or syntax, for example, is intended, it is referenced expressly. It is further noted an association of memory states, for example, may be in a logical sense and not necessarily in a tangible, physical sense. Thus, although signal and/or state components of a file and/or an electronic document, for example, are to be associated logically, storage thereof, for example, may reside in one or more different places in a tangible, physical memory, in an embodiment.
In the context of the present patent application, the term “Web site” and/or similar terms refer to Web pages that are associated electronically to form a particular collection thereof. Also, in the context of the present patent application, “Web page” and/or similar terms refer to an electronic file and/or an electronic document accessible via a network, including by specifying a uniform resource locator (URL) for accessibility via the Web, in an example embodiment. As alluded to above, in one or more embodiments, a Web page may comprise digital content coded (e.g., via computer instructions) using one or more languages, such as, for example, markup languages, including HTML and/or XML, although claimed subject matter is not limited in scope in this respect. Also, in one or more embodiments, application developers may write code (e.g., computer instructions) in the form of JavaScript (or other programming languages), for example, executable by a computing device to provide digital content to populate an electronic document and/or an electronic file in an appropriate format, such as for use in a particular application, for example. Use of the term “JavaScript” and/or similar terms intended to refer to one or more particular programming languages are intended to refer to any version of the one or more programming languages identified, now known and/or to be later developed. Thus, JavaScript is merely an example programming language. As was mentioned, claimed subject matter is not intended to be limited to examples and/or illustrations. In the context of the present patent application, the terms “entry,” “electronic entry,” “document,” “electronic document,” “content,”, “digital content,” “item,” and/or similar terms are meant to refer to signals and/or states in a physical format, such as a digital signal and/or digital state format, e.g., that may be perceived by a user if displayed, played, tactilely generated, etc. and/or otherwise executed by a device, such as a digital device, including, for example, a computing device, but otherwise might not necessarily be readily perceivable by humans (e.g., if in a digital format). Likewise, in the context of the present patent application, digital content provided to a user in a form so that the user is able to readily perceive the underlying content itself (e.g., content presented in a form consumable by a human, such as hearing audio, feeling tactile sensations and/or seeing images, as examples) is referred to, with respect to the user, as “consuming” digital content, “consumption” of digital content, “consumable” digital content and/or similar terms. For one or more embodiments, an electronic document and/or an electronic file may comprise a Web page of code (e.g., computer instructions) in a markup language executed or to be executed by a computing and/or networking device, for example. In another embodiment, an electronic document and/or electronic file may comprise a portion and/or a region of a Web page. However, claimed subject matter is not intended to be limited in these respects.
Also, for one or more embodiments, an electronic document and/or electronic file may comprise a number of components. As previously indicated, in the context of the present patent application, a component is physical, but is not necessarily tangible. As an example, components with reference to an electronic document and/or electronic file, in one or more embodiments, may comprise text, for example, in the form of physical signals and/or physical states (e.g., capable of being physically displayed). Typically, memory states, for example, comprise tangible components, whereas physical signals are not necessarily tangible, although signals may become (e.g., be made) tangible, such as if appearing on a tangible display, for example, as is not uncommon. Also, for one or more embodiments, components with reference to an electronic document and/or electronic file may comprise a graphical object, such as, for example, an image, such as a digital image, and/or sub-objects, including attributes thereof, which, again, comprise physical signals and/or physical states (e.g., capable of being tangibly displayed). In an embodiment, digital content may comprise, for example, text, images, audio, video, and/or other types of electronic documents and/or electronic files, including portions thereof, for example.
Also, in the context of the present patent application, the term parameters (e.g., one or more parameters) refer to material descriptive of a collection of signal samples, such as one or more electronic documents and/or electronic files, and exist in the form of physical signals and/or physical states, such as memory states. For example, one or more parameters, such as referring to an electronic document and/or an electronic file comprising an image, may include, as examples, time of day at which an image was captured, latitude and longitude of an image capture device, such as a camera, for example, etc. In another example, one or more parameters relevant to digital content, such as digital content comprising a technical article, as an example, may include one or more authors, for example. Claimed subject matter is intended to embrace meaningful, descriptive parameters in any format, so long as the one or more parameters comprise physical signals and/or states, which may include, as parameter examples, collection name (e.g., electronic file and/or electronic document identifier name), technique of creation, purpose of creation, time and date of creation, logical path if stored, coding formats (e.g., type of computer instructions, such as a markup language) and/or standards and/or specifications used so as to be protocol compliant (e.g., meaning substantially compliant and/or substantially compatible) for one or more uses, and so forth.
Signal packet communications and/or signal frame communications, also referred to as signal packet transmissions and/or signal frame transmissions (or merely “signal packets” or “signal frames”), may be communicated between nodes of a network, where a node may comprise one or more network devices and/or one or more computing devices, for example. As an illustrative example, but without limitation, a node may comprise one or more sites employing a local network address, such as in a local network address space. Likewise, a device, such as a network device and/or a computing device, may be associated with that node. It is also noted that in the context of this patent application, the term “transmission” is intended as another term for a type of signal communication that may occur in any one of a variety of situations. Thus, it is not intended to imply a particular directionality of communication and/or a particular initiating end of a communication path for the “transmission” communication. For example, the mere use of the term in and of itself is not intended, in the context of the present patent application, to have particular implications with respect to the one or more signals being communicated, such as, for example, whether the signals are being communicated “to” a particular device, whether the signals are being communicated “from” a particular device, and/or regarding which end of a communication path may be initiating communication, such as, for example, in a “push type” of signal transfer or in a “pull type” of signal transfer. In the context of the present patent application, push and/or pull type signal transfers are distinguished by which end of a communications path initiates signal transfer.
Thus, a signal packet and/or frame may, as an example, be communicated via a communication channel and/or a communication path, such as comprising a portion of the Internet and/or the Web, from a site via an access node coupled to the Internet or vice-versa. Likewise, a signal packet and/or frame may be forwarded via network nodes to a target site coupled to a local network, for example. A signal packet and/or frame communicated via the Internet and/or the Web, for example, may be routed via a path, such as either being “pushed” or “pulled,” comprising one or more gateways, servers, etc. that may, for example, route a signal packet and/or frame, such as, for example, substantially in accordance with a target and/or destination address and availability of a network path of network nodes to the target and/or destination address. Although the Internet and/or the Web comprise a network of interoperable networks, not all of those interoperable networks are necessarily available and/or accessible to the public.
In the context of the present patent application, a network protocol, such as for communicating between devices of a network, may be characterized, at least in part, substantially in accordance with a layered description, such as the so-called Open Systems Interconnection (OSI) seven layer type of approach and/or description. A network computing and/or communications protocol (also referred to as a network protocol) refers to a set of signaling conventions, such as for communication transmissions, for example, as may take place between and/or among devices in a network. In the context of the present patent application, the term “between” and/or similar terms are understood to include “among” if appropriate for the particular usage and vice-versa. Likewise, in the context of the present patent application, the terms “compatible with,” “comply with” and/or similar terms are understood to respectively include substantial compatibility and/or substantial compliance.
A network protocol, such as protocols characterized substantially in accordance with the aforementioned OSI description, has several layers. These layers are referred to as a network stack. Various types of communications (e.g., transmissions), such as network communications, may occur across various layers. A lowest level layer in a network stack, such as the so-called physical layer, may characterize how symbols (e.g., bits and/or bytes) are communicated as one or more signals (and/or signal samples) via a physical medium (e.g., twisted pair copper wire, coaxial cable, fiber optic cable, wireless air interface, combinations thereof, etc.). Progressing to higher-level layers in a network protocol stack, additional operations and/or features may be available via engaging in communications that are substantially compatible and/or substantially compliant with a particular network protocol at these higher-level layers. For example, higher-level layers of a network protocol may, for example, affect device permissions, user permissions, etc.
A network and/or sub-network, in an embodiment, may communicate via signal packets and/or signal frames, such as via participating digital devices and may be substantially compliant and/or substantially compatible with, but is not limited to, now known and/or to be developed, versions of any of the following network protocol stacks: ARCNET, AppleTalk, ATM, Bluetooth, DECnet, Ethernet, FDDI, Frame Relay, HIPPI, IEEE 1394, IEEE 802.11, IEEE-488, Internet Protocol Suite, IPX, Myrinet, OSI Protocol Suite, QsNet, RS-232, SPX, System Network Architecture, Token Ring, USB, and/or X.25. A network and/or sub-network may employ, for example, a version, now known and/or later to be developed, of the following: TCP/IP, UDP, DECnet, NetBEUI, IPX, AppleTalk and/or the like. Versions of the Internet Protocol (IP) may include IPv4, IPv6, and/or other later to be developed versions.
Regarding aspects related to a network, including a communications and/or computing network, a wireless network may couple devices, including client devices, with the network. A wireless network may employ stand-alone, ad-hoc networks, mesh networks, Wireless LAN (WLAN) networks, cellular networks, and/or the like. A wireless network may further include a system of terminals, gateways, routers, and/or the like coupled by wireless radio links, and/or the like, which may move freely, randomly and/or organize themselves arbitrarily, such that network topology may change, at times even rapidly. A wireless network may further employ a plurality of network access technologies, including a version of Long Term Evolution (LTE), WLAN, Wireless Router (WR) mesh, 2nd, 3rd, 4th, or 5th generation (2G, 3G, 4G, or 5G) cellular technology and/or the like, whether currently known and/or to be later developed. Network access technologies may enable wide area coverage for devices, such as computing devices and/or network devices, with varying degrees of mobility, for example.
A network may enable radio frequency and/or other wireless type communications via a wireless network access technology and/or air interface, such as Global System for Mobile communication (GSM), Universal Mobile Telecommunications System (UMTS), General Packet Radio Services (GPRS), Enhanced Data GSM Environment (EDGE), 3GPP Long Term Evolution (LTE), LTE Advanced, Wideband Code Division Multiple Access (WCDMA), Bluetooth, ultra-wideband (UWB), 802.11b/g/n, and/or the like. A wireless network may include virtually any type of now known and/or to be developed wireless communication mechanism and/or wireless communications protocol by which signals may be communicated between devices, between networks, within a network, and/or the like, including the foregoing, of course.
In one example embodiment, as shown in FIG. 12, a system embodiment may comprise a local network (e.g., device 1204 and medium 1240) and/or another type of network, such as a computing and/or communications network. For purposes of illustration, therefore, FIG. 12 shows an embodiment 1200 of a system that may be employed to implement either type or both types of networks. Network 1208 may comprise one or more network connections, links, processes, services, applications, and/or resources to facilitate and/or support communications, such as an exchange of communication signals, for example, between a computing device, such as 1202, and another computing device, such as 1206, which may, for example, comprise one or more client computing devices and/or one or more server computing devices. By way of example, but not limitation, network 1208 may comprise wireless and/or wired communication links, telephone and/or telecommunications systems, Wi-Fi networks, Wi-MAX networks, the Internet, a local area network (LAN), a wide area network (WAN), or any combinations thereof. Example devices in FIG. 12 may comprise features, for example, of a client computing device and/or a server computing device, in an embodiment. It is further noted that the term computing device, in general, whether employed as a client and/or as a server, or otherwise, refers at least to a processor and a memory connected by a communication bus. Likewise, in the context of the present patent application at least, this is understood to refer to sufficient structure within the meaning of 35 USC § 112(f) so that it is specifically intended that 35 USC § 112(f) not be implicated by use of the term “computing device” and/or similar terms; however, if it is determined, for some reason not immediately apparent, that the foregoing understanding cannot stand and that 35 USC § 112(f), therefore, necessarily is implicated by the use of the term “computing device” and/or similar terms, then, it is intended, pursuant to that statutory section, that corresponding structure, material and/or acts for performing one or more functions be understood and be interpreted to be described at least in figure(s) 1-11 and in the text associated at least with the foregoing figure(s) of the present patent application.
Referring now to FIG. 12, in an embodiment, first and third devices 1202 and 1206 may be capable of rendering a graphical user interface (GUI) for a network device and/or a computing device, for example, so that a user-operator may engage in system use. Device 1204 may potentially serve a similar function in this illustration. Likewise, in FIG. 12, computing device 1202 (‘first device’ in figure) may interface with computing device 1204 (‘second device’ in figure), which may, for example, also comprise features of a client computing device and/or a server computing device, in an embodiment. Processor (e.g., processing device) 1220 and memory 1222, which may comprise primary memory 1224 and secondary memory 1226, may communicate by way of a communication bus 1215, for example. The term “computing device,” in the context of the present patent application, refers to a system and/or a device, such as a computing apparatus, that includes a capability to process (e.g., perform computations) and/or store digital content, such as electronic files, electronic documents, measurements, text, images, video, audio, sensor content, etc. in the form of signals and/or states. Thus, a computing device, in the context of the present patent application, may comprise hardware, software, firmware, or any combination thereof (other than software per se). Computing device 1204, as depicted in FIG. 12, is merely one example, and claimed subject matter is not limited in scope to this particular example.
For one or more embodiments, a device, such as a computing device and/or networking device, may comprise, for example, any of a wide range of digital electronic devices, including, but not limited to, targeting aid devices, desktop and/or notebook computers, high-definition televisions, digital versatile disc (DVD) and/or other optical disc players and/or recorders, game consoles, satellite television receivers, cellular telephones, tablet devices, wearable devices, personal digital assistants, mobile audio and/or video playback and/or recording devices, Internet of Things (IOT) type devices, endpoint and/or sensor nodes, gateway devices, or any combination of the foregoing. Further, unless specifically stated otherwise, a process as described, such as with reference to flow diagrams and/or otherwise, may also be executed and/or affected, in whole or in part, by a computing device and/or a network device. A device, such as a computing device and/or network device, may vary in terms of capabilities and/or features. Claimed subject matter is intended to cover a wide range of potential variations. For example, a device may include a numeric keypad and/or other display of limited functionality, such as a monochrome liquid crystal display (LCD) for displaying text, for example. In contrast, however, as another example, a web-enabled device may include a physical and/or a virtual keyboard, mass storage, one or more accelerometers, one or more gyroscopes, global positioning system (GPS) and/or other location-identifying type capability, and/or a display with a higher degree of functionality, such as a touch-sensitive color 2D or 3D display, for example.
As suggested previously, communications between a computing device and/or a network device and a wireless network may be in accordance with known and/or to be developed network protocols including, for example, global system for mobile communications (GSM), enhanced data rate for GSM evolution (EDGE), 802.11b/g/n/h, etc., and/or worldwide interoperability for microwave access (WiMAX). A computing device and/or a networking device may also have a subscriber identity module (SIM) card, which, for example, may comprise a detachable or embedded smart card that is able to store subscription content of a user, and/or is also able to store a contact list. It is noted, however, that a SIM card may also be electronic, meaning that it may simply be stored in a particular location in memory of the computing and/or networking device. A user may own the computing device and/or network device or may otherwise be a user, such as a primary user, for example. A device may be assigned an address by a wireless network operator, a wired network operator, and/or an Internet Service Provider (ISP). For example, an address may comprise a domestic or international telephone number, an Internet Protocol (IP) address, and/or one or more other identifiers. In other embodiments, a computing and/or communications network may be embodied as a wired network, wireless network, or any combinations thereof.
A computing and/or network device may include and/or may execute a variety of now known and/or to be developed operating systems, derivatives and/or versions thereof, including computer operating systems, such as Windows, iOS, Linux, a mobile operating system, such as iOS, Android, Windows Mobile, and/or the like. A computing device and/or network device may include and/or may execute a variety of possible applications, such as a client software application enabling communication with other devices. For example, one or more messages (e.g., content) may be communicated, such as via one or more protocols, now known and/or later to be developed, suitable for communication of email, short message service (SMS), and/or multimedia message service (MMS), including via a network, such as a social network, formed at least in part by a portion of a computing and/or communications network, including, but not limited to, Facebook, LinkedIn, Twitter, and/or Flickr, to provide only a few examples. A computing and/or network device may also include executable computer instructions to process and/or communicate digital content, such as, for example, textual content, digital multimedia content, sensor content, and/or the like. A computing and/or network device may also include executable computer instructions to perform a variety of possible tasks, such as browsing, searching, playing various forms of digital content, including locally stored and/or streamed video, and/or games such as, but not limited to, fantasy sports leagues. The foregoing is provided merely to illustrate that claimed subject matter is intended to include a wide range of possible features and/or capabilities.
In FIG. 12, computing device 1202 may provide one or more sources of executable computer instructions in the form of physical states and/or signals (e.g., stored in memory states), for example. Computing device 1202 may communicate with computing device 1204 by way of a network connection, such as via network 1208, for example. As previously mentioned, a connection, while physical, may not necessarily be tangible. Although computing device 1204 of FIG. 12 shows various tangible, physical components, claimed subject matter is not limited to computing devices having only these tangible components, as other implementations and/or embodiments may include alternative arrangements that may comprise additional tangible components or fewer tangible components, for example, that function differently while achieving similar results. Rather, examples are provided merely as illustrations. It is not intended that claimed subject matter be limited in scope to illustrative examples.
Memory 1222 may comprise any non-transitory storage mechanism. Memory 1222 may comprise, for example, primary memory 1224 and secondary memory 1226; additional memory circuits, mechanisms, or combinations thereof may also be used. Memory 1222 may comprise, for example, random access memory, read only memory, etc., such as in the form of one or more storage devices and/or systems, such as, for example, a disk drive including an optical disc drive, a tape drive, a solid-state memory drive, etc., just to name a few examples.
Memory 1222 may be utilized to store a program of executable computer instructions. For example, processor 1220 may fetch executable instructions from memory and proceed to execute the fetched instructions. Memory 1222 may also comprise a memory controller for accessing device-readable medium 1240 that may carry and/or make accessible digital content, which may include code and/or instructions, for example, executable by processor 1220 and/or some other device, such as a controller, as one example, capable of executing computer instructions, for example. Under direction of processor 1220, a program of executable computer instructions stored in a non-transitory memory, such as memory cells storing physical states (e.g., memory states), may be executed by processor 1220 and may generate signals to be communicated via a network, for example, as previously described. Generated signals may also be stored in memory, as also previously suggested.
Memory 1222 may store electronic files and/or electronic documents, such as relating to one or more users, and may also comprise a computer-readable medium that may carry and/or make accessible content, including code and/or instructions, for example, executable by processor 1220 and/or some other device, such as a controller, as one example, capable of executing computer instructions, for example. As previously mentioned, the term electronic file and/or the term electronic document are used throughout this document to refer to a set of stored memory states and/or a set of physical signals associated in a manner so as to thereby form an electronic file and/or an electronic document. That is, neither term is meant to implicitly reference a particular syntax, format and/or approach used, for example, with respect to a set of associated memory states and/or a set of associated physical signals. It is further noted that an association of memory states, for example, may be in a logical sense and not necessarily in a tangible, physical sense. Thus, although signal and/or state components of an electronic file and/or electronic document are to be associated logically, storage thereof, for example, may reside in one or more different places in a tangible, physical memory, in an embodiment.
Algorithmic descriptions and/or symbolic representations are examples of techniques used by those of ordinary skill in the signal processing and/or related arts to convey the substance of their work to others skilled in the art. An algorithm, in the context of the present patent application, and generally, is considered to be a self-consistent sequence of operations and/or similar signal processing leading to a desired result. In the context of the present patent application, operations and/or processing involve physical manipulation of physical quantities. Typically, although not necessarily, such quantities may take the form of electrical and/or magnetic signals and/or states capable of being stored, transferred, combined, compared, processed and/or otherwise manipulated, for example, as electronic signals and/or states making up components of various forms of digital content, such as signal measurements, text, images, video, audio, etc.
It has proven convenient at times, principally for reasons of common usage, to refer to such physical signals and/or physical states as bits, values, elements, parameters, symbols, characters, terms, numbers, numerals, measurements, content and/or the like. It should be understood, however, that all of these and/or similar terms are to be associated with appropriate physical quantities and are merely convenient labels. Unless specifically stated otherwise, as apparent from the preceding discussion, it is appreciated that throughout this specification discussions utilizing terms such as "processing," "computing," "calculating," "determining," "establishing," "obtaining," "identifying," "selecting," "generating," and/or the like may refer to actions and/or processes of a specific apparatus, such as a special purpose computer and/or a similar special purpose computing and/or network device. In the context of this specification, therefore, a special purpose computer and/or a similar special purpose computing and/or network device is capable of processing, manipulating and/or transforming signals and/or states, typically in the form of physical electronic and/or magnetic quantities, within memories, registers, and/or other storage devices, processing devices, and/or display devices of the special purpose computer and/or similar special purpose computing and/or network device. In the context of this particular patent application, as mentioned, the term "specific apparatus" therefore includes a general purpose computing and/or network device, such as a general purpose computer, once it is programmed to perform particular functions, such as pursuant to program software instructions.
In some circumstances, operation of a memory device, such as a change in state from a binary one to a binary zero or vice-versa, for example, may comprise a transformation, such as a physical transformation. With particular types of memory devices, such a physical transformation may comprise a physical transformation of an article to a different state or thing. For example, but without limitation, for some types of memory devices, a change in state may involve an accumulation and/or storage of charge or a release of stored charge. Likewise, in other memory devices, a change of state may comprise a physical change, such as a transformation in magnetic orientation. Likewise, a physical change may comprise a transformation in molecular structure, such as from crystalline form to amorphous form or vice-versa. In still other memory devices, a change in physical state may involve quantum mechanical phenomena, such as superposition, entanglement, and/or the like, which may involve quantum bits (qubits), for example. The foregoing is not intended to be an exhaustive list of all examples in which a change in state from a binary one to a binary zero or vice-versa in a memory device may comprise a transformation, such as a physical, but non-transitory, transformation. Rather, the foregoing is intended to provide illustrative examples.
Referring again to FIG. 12, processor 1220 may comprise one or more circuits, such as digital circuits, to perform at least a portion of a computing procedure and/or process. By way of example, but not limitation, processor 1220 may comprise one or more processors, such as controllers, microprocessors, microcontrollers, application specific integrated circuits, digital signal processors, programmable logic devices, field programmable gate arrays, the like, or any combination thereof. In various implementations and/or embodiments, processor 1220 may perform signal processing, typically substantially in accordance with fetched executable computer instructions, such as to manipulate signals and/or states, to construct signals and/or states, etc., with signals and/or states generated in such a manner to be communicated and/or stored in memory, for example.
FIG. 12 also illustrates device 1204 as including a component 1232 operable with input/output devices, for example, so that signals and/or states may be appropriately communicated between devices, such as device 1204 and an input device and/or device 1204 and an output device. A user may make use of an input device, such as a computer mouse, stylus, track ball, keyboard, and/or any other similar device capable of receiving user actions and/or motions as input signals. Likewise, for a device having speech to text capability, a user may speak to a device to generate input signals. A user may make use of an output device, such as a display, a printer, etc., and/or any other device capable of providing signals and/or generating stimuli for a user, such as visual stimuli, audio stimuli and/or other similar stimuli.
In the preceding description, various aspects of claimed subject matter have been described. For purposes of explanation, specifics, such as amounts, systems and/or configurations, as examples, were set forth. In other instances, well-known features were omitted and/or simplified so as not to obscure claimed subject matter. While certain features have been illustrated and/or described herein, many modifications, substitutions, changes and/or equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all modifications and/or changes as fall within claimed subject matter.

Claims

1. An apparatus, comprising: a targeting aid system to be secured to a firearm, at least in part, the firearm to be operated by a user, and wherein the targeting aid system to comprise: one or more sensors, at least one sensor of the one or more sensors to comprise an image sensor; and at least one processor to: obtain one or more signals or signal packets from the at least one sensor; determine at least a direction and speed of a target based at least in part on the one or more signals or signal packets to be obtained from the at least one sensor; determine at least one feedback parameter based at least in part on at least one user action and based at least in part on the direction and speed of the target; and initiate at least one feedback process to provide at least haptic, audible and/or visual feedback to the user at least in part in accordance with the at least one feedback parameter.
2. The apparatus of claim 1, wherein the firearm to comprise a barrel and wherein the targeting aid system to be secured to the barrel.
3. The apparatus of claim 1 or 2, wherein the firearm to further comprise a trigger mechanism, and wherein the at least one user action to include a firearm-handling action or an aiming action, or any combination thereof.
4. The apparatus of any one of the preceding claims, wherein the at least one sensor to comprise an inertial motion unit.
5. The apparatus of claim 4 when dependent on claim 2, wherein the at least one processor to determine the at least one feedback parameter based at least in part on a movement of the barrel in accordance with one or more signals or signal packets generated by the inertial motion unit.
6. The apparatus of any one of the preceding claims, wherein the at least one processor to determine the direction and speed of the target based at least in part on one or more signals or signal packets obtained from the image sensor.
7. The apparatus of claim 6, wherein the at least one processor to determine the direction and speed of the target at least in part via analysis of image content representative of the target or one or more background features, or any combination thereof.
8. The apparatus of claim 6 or 7, wherein the at least one processor to determine the direction and speed of the target at least in part via analysis of pixel content of two or more consecutive images obtained from the image sensor.
9. The apparatus of any one of the preceding claims, wherein the at least one processor to determine the at least one feedback parameter based at least in part on the direction and speed of the target and based at least in part on one or more parameters representative of one or more characteristics of at least one projectile or at least one projectile cloud to be discharged from the firearm.
10. The apparatus of claim 9, wherein the targeting aid system to further comprise a user interface and wherein the one or more parameters representative of the one or more characteristics of the at least one projectile or the at least one projectile cloud to be obtained via the user interface.
11. The apparatus of claim 9 or 10, wherein the one or more parameters representative of the one or more characteristics of the at least one projectile or the at least one projectile cloud to include one or more parameters representative of ammunition type, expected projectile speed, expected drop or expected deviation from a straight-line trajectory, or any combination thereof.
12. The apparatus of any one of the preceding claims, wherein the at least one feedback process to indicate in real-time or near real-time, or any combination thereof, to the user a deviation of the user activity relative to the direction and speed of the target.
13. The apparatus of any one of the preceding claims, wherein the at least one sensor to include a GPS/GNSS sensor, an altimeter, a barometer, a thermometer or a pulsimeter, or any combination thereof.
14. The apparatus of any one of the preceding claims, wherein the processor to determine the at least one feedback parameter further based at least in part on wind speed, air pressure, humidity or air temperature, or any combination thereof.
15. The apparatus of claim 14, wherein one or more parameters specifying wind speed, air pressure, humidity or air temperature, or any combination thereof, to be obtained via the at least one sensor, via a user interface or via a wireless communication interface, or any combination thereof.
16. The apparatus of any one of the preceding claims, wherein the targeting aid system to further include at least one memory device, wherein the at least one processor to initiate storage of at least sensor content or content representative of the at least one user action, or any combination thereof, in the at least one memory device.
17. The apparatus of any one of the preceding claims, wherein the targeting aid system to further include a communication interface, wherein the at least one processor to initiate communication of one or more signal packets representative, at least in part, of the sensor content between the targeting aid system and an external computing system.
18. The apparatus of claim 17, wherein the external computing system to comprise a cellular telephone or a tablet device.
19. The apparatus of any one of the preceding claims, wherein the feedback process to include an image display to provide visual feedback to the user based, at least in part, on the at least one user action relative to the direction and speed of the target.
20. The apparatus of claim 19, wherein the image display to comprise a cellular telephone or a tablet device.
21. An apparatus, comprising: a targeting aid system to be secured to a firearm, wherein the firearm to be operated by a user, and wherein the targeting aid system to comprise: a plurality of sensors to include at least an image sensor and a movement sensor, wherein the movement sensor to detect acceleration, angular speed or azimuth position, or any combination thereof, of the firearm; a communication interface to transmit at least sensor readings to an external computing system, wherein the communication interface further to obtain at least one feedback parameter from the external computing system, wherein the at least one feedback parameter to be based at least in part on the sensor content and based at least in part on at least one user action relative to a direction and speed of a particular target; and at least one feedback process to provide at least haptic, audible and/or visual feedback to the user at least in part in accordance with the at least one feedback parameter.
22. The apparatus of claim 21, wherein the direction and speed of the particular target to be calculated based at least in part on one or more signals and/or signal packets obtained from the image sensor.
23. The apparatus of claim 21 or 22, wherein the at least one feedback parameter further to be based at least in part on one or more parameters representative of one or more characteristics of at least one projectile to be discharged from the firearm.
24. A method, comprising: at a targeting aid system to be secured to a firearm, wherein the firearm to be operated by a user: obtaining one or more signals or signal packets from at least one sensor of the targeting aid system; determining, via at least one processor of the targeting aid system, a direction and speed of a target based at least in part on the one or more signals or signal packets obtained from the at least one sensor; determining at least one feedback parameter based at least in part on at least one user action relative to the direction and speed of the target; and providing haptic, audible and/or visual feedback to the user at least in part in accordance with the at least one feedback parameter.
25. The method of claim 24, wherein the firearm comprises a barrel and wherein the targeting aid system is secured to the barrel.
26. The method of claim 25, wherein the firearm further comprises a trigger mechanism, and wherein the at least one user action includes movement of the barrel or activating the trigger mechanism, or any combination thereof.
27. The method of any one of claims 24 to 26, wherein the at least one sensor includes an inertial motion unit.
28. The method of claim 27 when dependent on claim 25, wherein the determining the at least one feedback parameter is based at least in part on the movement of the barrel in accordance with one or more signals or signal packets generated by the inertial motion unit.
29. The method of any one of claims 24 to 28, wherein the at least one sensor comprises an image sensor.
30. The method of claim 29, wherein the determining the direction and speed of the target is based at least in part on one or more signals or signal packets obtained from the image sensor.
31. The method of claim 29 or 30, wherein the determining the direction and speed of the target comprises analysis of image content representative of the target or one or more background features, or a combination thereof.
32. The method of any one of claims 24 to 31, wherein the determining the at least one feedback parameter is further based at least in part on one or more parameters representative of one or more characteristics of at least one projectile or at least one projectile cloud to be discharged from the firearm.
33. The method of claim 32, wherein the targeting aid system further comprises a user interface.
34. The method of claim 33, further comprising obtaining the one or more parameters representative of the one or more characteristics of the at least one projectile or the at least one projectile cloud via the user interface.
35. The method of any one of claims 32 to 34, wherein the one or more parameters representative of the one or more characteristics of the at least one projectile or the at least one projectile cloud includes one or more parameters representative of ammunition type, expected projectile speed, expected drop or expected deviation from a straight-line trajectory, or any combination thereof.
36. The method of any one of claims 24 to 35, wherein the providing the haptic, audible and/or visual feedback to the user includes providing feedback indicative of a deviation of the user activity relative to the direction and speed of the target in real-time or near real-time, or any combination thereof.
37. The method of any one of claims 24 to 36, further comprising storing the one or more signals or signal packets obtained from the at least one sensor in at least one memory of the targeting aid system.
38. The method of any one of claims 24 to 37, further including transmitting, via a communication interface of the targeting aid system, one or more signal packets representative, at least in part, of the one or more signals or signal packets obtained from the at least one sensor to an external computing system.
39. The method of any one of claims 24 to 38, wherein the providing feedback to the user includes providing visual feedback via an image display of the targeting aid system based at least in part on the at least one user action relative to the direction and speed of the target.
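By way of illustration only, and not as a limitation of the claimed subject matter, the pixel-content analysis of two or more consecutive images recited in claims 7, 8, 30 and 31 may be sketched as follows. The sketch assumes, purely for this illustration, that the target appears as a bright blob against a darker background, that consecutive frames are available as grayscale arrays, and that an approximate angular resolution (degrees per pixel) of the image sensor is known; the function and parameter names are hypothetical and do not appear in the disclosure.

```python
import numpy as np

def blob_centroid(frame, threshold=200):
    """Centroid (row, col) of pixels brighter than `threshold`.

    Assumes, for this sketch only, that the target shows up as a bright
    blob against a darker background (e.g., a clay against the sky).
    Returns None if no pixel exceeds the threshold.
    """
    rows, cols = np.nonzero(frame > threshold)
    if rows.size == 0:
        return None
    return np.array([rows.mean(), cols.mean()])

def estimate_direction_and_speed(frame_a, frame_b, dt, deg_per_pixel):
    """Estimate target direction and angular speed from two consecutive
    grayscale frames taken `dt` seconds apart.

    Returns a unit direction vector in the image plane (row, col order)
    and an angular speed in degrees per second, or None if the target
    could not be located in either frame.
    """
    c0 = blob_centroid(frame_a)
    c1 = blob_centroid(frame_b)
    if c0 is None or c1 is None:
        return None
    displacement = c1 - c0                      # pixels moved between frames
    distance = np.linalg.norm(displacement)
    if distance == 0.0:
        return np.zeros(2), 0.0                 # target apparently stationary
    direction = displacement / distance         # unit vector in image plane
    angular_speed = distance * deg_per_pixel / dt
    return direction, angular_speed
```

In practice, an implementation of the claimed subject matter might instead segment the target explicitly, compensate for movement of the firearm (and thus of the image sensor) using the inertial motion unit, and/or analyze background features as recited in claims 7 and 31; the thresholded centroid above is only the simplest possible stand-in.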
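Likewise by way of illustration only, the determination of a feedback parameter from a user action relative to the direction and speed of the target, taking into account projectile characteristics (claims 1, 9, 11, 23 and 32), may be approximated, to first order, by comparing the lead angle the projectile would require with the lead the user is actually holding. The sketch below ignores drop, drag, wind and sensor noise; all names, units and the numeric example are assumptions made for this illustration only.

```python
def required_lead_deg(target_angular_speed_deg_s, target_range_m,
                      projectile_speed_m_s):
    """First-order lead angle, in degrees: the angle the target is expected
    to traverse during the projectile's time of flight."""
    time_of_flight_s = target_range_m / projectile_speed_m_s
    return target_angular_speed_deg_s * time_of_flight_s

def feedback_parameter_deg(aim_offset_deg, target_angular_speed_deg_s,
                           target_range_m, projectile_speed_m_s):
    """Signed feedback parameter: required lead minus the lead the user is
    currently holding (aim_offset_deg, the angle by which the barrel axis
    leads the target as seen by the image sensor). Positive values mean
    the user is aiming short of the required lead."""
    lead_deg = required_lead_deg(target_angular_speed_deg_s,
                                 target_range_m, projectile_speed_m_s)
    return lead_deg - aim_offset_deg

# Illustrative numbers only: a clay at roughly 30 m crossing at 20 deg/s,
# with shot leaving the muzzle at roughly 400 m/s, calls for a lead of
# about 20 * (30 / 400) = 1.5 degrees.
```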
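Finally, the real-time or near real-time indication of deviation recited in claims 12 and 36 may be illustrated by a simple, hypothetical mapping from the signed feedback parameter to a haptic intensity and an audible tone; the tolerance, saturation value and tone frequencies below are arbitrary choices made only for this sketch.

```python
def feedback_cue(feedback_deg, tolerance_deg=0.3, saturation_deg=5.0):
    """Map a signed lead error (degrees) to a haptic strength in [0, 1]
    and a tone frequency in Hz. Within tolerance: no vibration and a
    steady 'on target' tone; otherwise vibration grows with the error and
    the tone distinguishes aiming short of versus beyond the required lead."""
    if abs(feedback_deg) <= tolerance_deg:
        return {"haptic": 0.0, "tone_hz": 880}
    strength = min(abs(feedback_deg) / saturation_deg, 1.0)
    tone_hz = 440 if feedback_deg > 0 else 660
    return {"haptic": strength, "tone_hz": tone_hz}
```

A deployed system would presumably smooth the feedback parameter over several frames before driving such a cue, since frame-to-frame centroid jitter would otherwise translate directly into vibration noise.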
PCT/RU2022/000090 2022-03-24 2022-03-24 Targeting aid system, device, and/or method WO2023182901A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/RU2022/000090 WO2023182901A1 (en) 2022-03-24 2022-03-24 Targeting aid system, device, and/or method

Publications (1)

Publication Number Publication Date
WO2023182901A1 true WO2023182901A1 (en) 2023-09-28

Family

ID=81648711

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/RU2022/000090 WO2023182901A1 (en) 2022-03-24 2022-03-24 Targeting aid system, device, and/or method

Country Status (1)

Country Link
WO (1) WO2023182901A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140272807A1 (en) * 2013-03-15 2014-09-18 Kenneth W. Guenther Interactive system and method for shooting and target tracking for self-improvement and training
WO2015116675A1 (en) * 2014-01-29 2015-08-06 Virtual Sports Training, Inc. Motion tracking, analysis and feedback systems and methods for performance training applications
WO2017145122A1 (en) * 2016-02-24 2017-08-31 Pautler James Anthony Skeet and bird tracker
US20170292813A1 (en) * 2016-04-07 2017-10-12 Jab Company Llc Target shooting
WO2021048307A1 (en) * 2019-09-10 2021-03-18 Fn Herstal S.A. Imaging system for firearm
US20210389080A1 (en) * 2020-06-11 2021-12-16 Reactor LLC Rifle Intelligence Systems and Methods

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22723220

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)