US20180202775A1 - Shooting Game for Multiple Players with Dynamic Shot Position Recognition and Remote Sensors - Google Patents

Shooting Game for Multiple Players with Dynamic Shot Position Recognition and Remote Sensors

Info

Publication number
US20180202775A1
Authority
US
United States
Prior art keywords
target
player
shot
camera
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/853,710
Inventor
Rod Ghani
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US15/474,874 (granted as U.S. Pat. No. 9,891,028)
Application filed by Individual
Priority to US15/853,710
Publication of US20180202775A1
Status: Abandoned

Classifications

    • F — MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 — WEAPONS
    • F41J — TARGETS; TARGET RANGES; BULLET CATCHERS
    • F41J5/00 — Target indicating systems; Target-hit or score detecting systems
    • F41J5/10 — Cinematographic hit-indicating systems
    • F41J5/14 — Apparatus for signalling hits or scores to the shooter, e.g. manually operated, or for communication between target and shooter; Apparatus for recording hits or scores

Definitions

  • the present disclosure is directed to systems and methods of collecting and analyzing data related to firearms marksmanship.
  • a shooting game system capable of collecting and analyzing shot data for multiple players shooting at the same target in a competitive setting is provided.
  • the system and methods described herein provide that a score for each player is automatically updated when each player takes his turn.
  • shot-scoring apps exist, e.g. the Target Scan app for iOS.
  • a photographed (or scanned) paper target is examined for the location of the shots and the total score is determined electronically.
  • a lighted background or white background paper is typically required for accurate scanning.
  • the system then distinguishes the center of a shot from the area weighted geometry of the hole.
  • the software can have difficulty recognizing a shot accurately, and a manual option is given to the user to correct or place a shot to be scored.
  • CN1347040 also describes a scoring system where a target with bullet holes is analyzed for scoring. However, no disclosure was made as to how a shot was located in the camera image frame, and how a score was determined.
  • US Patent Publication No. 2014/0106311 describes a shooting training system where a shot is displayed to the shooter by alternating views of the current target versus an image of the target captured before the latest shot. This system only captures images; it does not generate an automatic score and does not determine a shot location in any camera image capture.
  • a target scanning type of scoring system does not lend itself to instant updates on a shooter's score. Such delays in retrieving a score dampen the sense of competition among the shooters. Also, the scanning systems cannot separate the scores of multiple shooters on the same target.
  • US Patent Publication No. 2010/0178967 and U.S. Pat. No. 4,898,391 describe a shooting game with a target and a gun that sends a beam of light to a game console for scoring against the target.
  • this type of scoring system does not use a gun which fires real bullets and is a less satisfying game to play.
  • the embodied invention is a method and equipment suitable for a shooting game with dynamic shot recognition and automatic scoring among multiple players firing at the same target.
  • Each player's shot is scored based on a difference in the target's image from a prior image as viewed by a camera.
  • the scoring target is aligned with the camera, and the output of the score change is displayed to the multiple shooters.
  • Important game enhancements include a dynamic update of the reference target image to follow multiple shot holes.
  • a shot event is recognized and the area of change identified for the placement of the shot.
  • the shot score is then accumulated in a display that is viewable by all players.
  • FIG. 1 shows a shooting gallery lane designed for the multi-player target game system.
  • FIG. 2 shows a detail of FIG. 1 .
  • FIG. 3 is a player's view of the shooting gallery.
  • FIG. 4 is a simplified profile view of the shooting gallery.
  • FIGS. 5 and 6 are block diagrams for how the latest shot is recognized and the score is determined.
  • FIG. 7 illustrates how the camera pixel sensors and an averaging filter are used to identify a shot location.
  • FIG. 8 is a game display showing the players their score and shot positions.
  • FIGS. 9A and 9B illustrate how the target image distortion is corrected when a camera is located above the target.
  • FIG. 10 shows communication flow between equipment components.
  • FIG. 11 shows a typical game display partway through the game showing additional features.
  • FIG. 12 illustrates a shooting system in accordance with various embodiments.
  • FIG. 13 illustrates a method in accordance with various embodiments.
  • FIG. 14 illustrates a game display with remote sensor data in accordance with various embodiments.
  • the present disclosure generally relates to gamified firearms marksmanship, and more particularly, to systems and methods for providing a shooting game to firearms users for various purposes such as entertainment, competition, and skill development.
  • the detailed description of various embodiments herein makes reference to the accompanying drawings, which show the exemplary embodiments by way of illustration. While these exemplary embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, it should be understood that other embodiments may be realized and that logical and mechanical changes may be made without departing from the spirit and scope of the disclosure. Thus, the detailed description herein is presented for purposes of illustration only and not of limitation. For example, the steps recited in any of the method or process descriptions may be executed in any order and are not limited to the order presented. Moreover, any of the functions or steps may be outsourced to or performed by one or more third parties. Furthermore, any reference to singular includes plural embodiments, and any reference to more than one component may include a singular embodiment.
  • references to “various embodiments”, “one embodiment”, “an embodiment”, etc. indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to affect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described. After reading the description, it will be apparent to one skilled in the relevant art(s) how to implement the disclosure in alternative embodiments.
  • terms such as “transmit,” “communicate” and/or “deliver” may include sending electronic data from one system component to another over a network connection.
  • data may include information such as commands, queries, files, data for storage, and the like in digital or any other form.
  • a shooting game can comprise a competition among different players (also referred to herein as shooters) who shoot at the same target.
  • the shooting game system can comprise a camera, such as a video camera, in electronic communication with a shooting game system computer.
  • the terms “camera” and “video camera” may be used interchangeably herein.
  • the video camera may be configured to monitor the target and relay target image data to the shooting game system.
  • the shooting game system may be configured to relay the target image to a screen or electronic display that is viewable by all players.
  • the players, in turn according to a defined sequence, shoot at the target, and the shot hole is automatically recognized by the shooting game system based on digital image processing of target image data to identify a significant change in the target image from shot to shot.
  • the shot placement is identified and a digital display marker is placed over the shot hole on the target image.
  • the digital image data may also be automatically analyzed to determine the score of the shot.
  • the current shooter's overall score is updated and displayed on a game screen. When the first player completes his/her turn, the next player in the sequence of players becomes the shooter.
  • a shooting game can comprise two to four players per game.
  • a shooting game can comprise any suitable number of players, and the number of players may be user-selectable based on a setup that is input into the game system computer via a user interface. Identifying information may be input for each player participating in a shooting game, and the player sequence may likewise be determined by a player or other user.
  • a player or other user can input a firearm type and/or a firearm caliber to be used by a player in the game system computer.
  • the firearm type and/or firearm caliber may be selected from a predetermined menu of firearm types and/or calibers.
  • Input of a firearm type and/or firearm caliber into the game system computer may facilitate target image data processing, such as shot registration and scoring, by the game system computer.
  • a game system computer may select an appropriate shot registration and scoring algorithm in response to input of a particular caliber by a user.
  • a game system computer may determine a firearm caliber used by a player based on target image data processing.
  • a shooting game can be configured to provide each shooter with a predefined number of shots per game, for example, 5, 7, 10, 14, or 20 shots per player per game.
  • the number of shots per player per game can be selected from a predetermined range, such as between 1 and 21 shots per game.
  • a player's turn can comprise a single shot, or a turn can comprise multiple shots.
  • the number of shots to be taken by a player in a single turn may be input by a user or operator or may be selected from a predetermined range of shot numbers.
  • a player can provide an input to the game computer to switch players, such as by pressing a button or otherwise providing an input to the game system computer to switch the scoring to the next player. In various embodiments, the game system computer may automatically switch to the next player in the sequence of players following detection and scoring of a player's shot.
  • the game system computer may display the identifying information for the next player to prompt the next player in a sequence of players to take his or her turn.
  • the game system may be configured to clear marked shots upon a switch to a new player, particularly in embodiments of game play in which each player takes multiple shots in a turn.
  • a game system can comprise a target mounted inside a metal frame.
  • a metal frame may comprise certain structural features that can serve as reference points for optical calibration of a digital camera and/or a shooting system. The frame may be designed to fit the target tightly so that the alignment and position relative to the camera are substantially maintained and a setup calibration is not needed when a new game is started.
  • a target may be attached to a metal frame by an interference fit, such as by placing the margins of the target between components of the metal frame configured to connect with one another by an interference fit or other mechanical connection.
  • a target can comprise a paper sheet or a sheet of cardboard material.
  • a target can comprise a sheet of non-paper material, such as polymer material, including various natural or synthetic polymers.
  • a polymer target material may be a self-healing polymer.
  • a camera may be positioned in front of the target, and above it, so that the camera is suitably positioned to obtain an image of the target.
  • the alignment between the target and the camera may be established at the beginning of the game in response to a user-selected target position, or the alignment may be previously established.
  • the camera is kept at a fixed distance and position from the target to simplify the setup and accuracy of the camera's image.
  • a camera may be mounted to the motorized target trolley. Mounting a camera to the motorized target trolley may facilitate maintaining a fixed distance and position with respect to the target.
  • a shooting lane can comprise more than one target-oriented camera.
  • cameras may be mounted in fixed positions suitable to monitor targets set at 10 ft, 25 ft, and 50 ft from the shooting position.
  • a shooting lane can comprise one or more fixed position cameras and a camera mounted to the motorized target trolley.
  • each camera may be used to capture target images. Use of multiple cameras to capture target images may provide for more robust image data and enhanced shot detection accuracy.
  • a camera may be housed in a protective housing.
  • a protective housing may be configured to protect a camera from impact or penetration by an errant bullet fired by a player.
  • a protective housing may comprise plate steel configured to enclose or shield a camera and protect the camera from direct impacts and/or from the effects of a bullet impact with the housing.
  • a protective housing may comprise an angled metal baffle mounted to a structure such as the shooting lane ceiling and configured to deflect a bullet downrange from the shooter.
  • a protective housing can comprise a steel housing configured with angled surface to similarly deflect a bullet downrange while minimizing the energy transferred to the camera.
  • a protective housing may be vibrationally isolated from a camera protected by the housing.
  • the camera and automatic image post-processing must be able to recognize when a shot is made and score it accurately. This is done by continuously monitoring the output of the target camera and observing when there is a significant image pixel change.
  • the ability of the camera to recognize a shot depends partly upon the signal-to-noise ratio.
  • Modern digital image sensors in cameras have known errors that create signal noise.
  • the total noise is dependent on the noise factor, background, readout noise, and EM gain.
  • the sensor noise must be filtered out in order to recognize a shot.
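As a rough sketch of this filtering step (the patent gives no implementation; the whole-frame noise estimate and the 5-sigma cutoff below are illustrative assumptions), per-pixel changes below an estimated noise floor can simply be discarded:

```python
import numpy as np

def changed_pixel_mask(baseline, current, k=5.0):
    """Flag pixels whose frame-to-frame change exceeds k sigma of estimated noise."""
    diff = current.astype(np.int32) - baseline.astype(np.int32)
    sigma = diff.std() or 1.0   # crude noise-floor estimate from the whole frame
    return np.abs(diff) > k * sigma
```

In a real system the noise floor would be calibrated from the sensor characteristics the text lists (readout noise, background, EM gain) rather than estimated from the live frame.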
  • a single board computer is used to interface between the camera and the scoring display.
  • the single board computer is suitably programmed to perform the automatic scoring functions and display any scoring on the game display.
  • the game display may be any type of display that may be used to display game information.
  • a game display can include, for example, a high definition television or monitor.
  • a game display may also comprise a mobile device such as a tablet or mobile phone.
  • a game display may be electronically connected to the system via a wired connection or a wireless communication link.
  • the total score for each shooter is automatically computed and displayed under each player's name.
  • the game system may be configured to automatically identify a winner based on comparison of each player's score.
  • the game system may announce each player's score and/or the winning player via an audio announcement.
  • target image acquisition is performed using a camera. Pixel changes in a target image due to a bullet impact in an image frame corresponding to the target area are used to identify a shot. The image from the camera is continuously monitored to recognize the location of the shot and score it accurately.
  • FIG. 1 shows significant elements of the game setup.
  • a players' stand 101 has a display 102 which displays multiple players on the screen.
  • the output of the wireless camera displays the current target image on display 102 .
  • Each player's score, the target, and marks of their shot positions are also shown on the scoring display.
  • a keyboard (or a dedicated control box) may include next player, previous player, display, game start, game setup, and game end buttons or selections.
  • a setup screen may be provided to allow players or operators to select certain game parameters such as the target type, target size, target distance, number of players per game, the ammunition type, and the number of shots per player.
  • a target type and/or target size may be selected from a range of available target types and/or target sizes.
  • the target assembly 115 is shown in FIG. 1 .
  • the target 103 is housed inside a metal frame 107 and removably connected to the motorized target trolley 105 through a mounting clip 108 .
  • the target 103 and metal frame 107 may be at least partially vibrationally isolated from mounting clip 108 and/or trolley 105 by integration of a vibration dampening material or mechanism, such as a mechanism integrated into the mounting clip 108 , between mounting clip 108 and metal frame 107 , or between mounting clip 108 and trolley 105 .
  • the motorized target positioner moves on a rail 104 to set the target at the correct distance from the player.
  • the target camera 106 is attached to the target positioner with a good and clear view of the target.
  • a projector can be co-located on the target positioner with camera 106 .
  • Motorized target trolley 105 can be configured to be positioned on rail 104 at various user-selectable preset distances, such as 10 yards, 25 yards, and 50 yards. In various embodiments, target distances need not be preset, and motorized target trolley 105 can be configured to be positioned at any distance from the shooting position compatible with physical parameters of the rail and shooting lane.
  • a target frame attaching clip 108 connects the target frame 107 with the target trolley 105 .
  • the trolley rides on the rail 104 and travels to the charging station 109 during the normal course of shooting.
  • An additional (optional) overhead game display 110 provides game status to the players.
  • the charging station is connected via communication cables 1120 to the game displays ( FIG. 2 ).
  • the target trolley is battery operated, and charges at the charging station through contacts 113 a,b ( FIGS. 1 and 2 ).
  • LED lighting 114 is used to illuminate the target.
  • the interval at which the target image is analyzed can be varied to allow target and camera vibration to dissipate.
  • An interval of 1 second has been found to optimize the reliability and accuracy of determining the shot location.
  • a vibration dampening mechanism such as that described above may be incorporated into the system and may serve to reduce the vibrational or movement-induced reflectivity changes created by the impact of a bullet with a target or frame.
  • the camera view of the target is distorted as the angle of the target is not perpendicular to the camera view.
  • the angle depends upon the position of the camera, which is preferably above the target. This is not a concern as to identifying the shot, but is important in scoring each shot correctly.
  • the lower edge of the target is narrower than the top. The result is that there are fewer pixels per inch at the bottom edge of the target than at the top.
  • the wireless camera is powered by a battery that is rechargeable.
  • the battery connects to a recharging station when the target positioning assembly is moved to the players' stand.
  • the motorized target positioner is also battery driven.
  • the target image is preferably taken from the camera 106 with a 1080p resolution.
  • the sensor in the camera is a CCD type or a CMOS type.
  • cameras use a Bayer color filter array.
  • the Bayer color filter array includes red, green, and blue light filters in a mosaic grid pattern in front of the individual camera pixel sensors.
  • the Bayer 2×2 pixel filter grid comprises a green and red color filter in the first row, and then a blue and green color filter in the second row. This 2×2 filter grid is repeated over the entire camera image sensor.
  • because each camera pixel sensor registers the light intensity of only one color, the intensities of the other two colors at that sensor pixel are unknown (missing information).
  • the other two colors at each of the four camera sensor pixels are interpolated using a de-mosaicing algorithm. This can be done in the camera or in a post processing algorithm.
  • Typical de-mosaicing algorithms include copying the colors from neighboring sensor pixels, averaging different colors from nearby sensor pixels, or using linear interpolation of nearby colors. The goal is to reasonably estimate all three colors at each camera sensor pixel.
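A minimal sketch of the first strategy mentioned above, copying each missing colour from a neighbouring sensor pixel, assuming the G/R then B/G Bayer layout described in the text (the fill rule and its wrap-around edge handling are simplifications, not the camera's actual algorithm):

```python
import numpy as np

def demosaic_nearest(raw):
    """Nearest-neighbour de-mosaic of a GRBG Bayer frame (G R / B G layout)."""
    h, w = raw.shape
    rgb = np.zeros((h, w, 3), raw.dtype)
    # Scatter the measured samples into their colour planes.
    rgb[0::2, 0::2, 1] = raw[0::2, 0::2]   # green (first row: G R G R ...)
    rgb[0::2, 1::2, 0] = raw[0::2, 1::2]   # red
    rgb[1::2, 0::2, 2] = raw[1::2, 0::2]   # blue  (second row: B G B G ...)
    rgb[1::2, 1::2, 1] = raw[1::2, 1::2]   # green
    # Fill each missing value by copying from a neighbour one pixel up/left
    # (np.roll wraps at the border, an acceptable artifact for a sketch).
    for c in range(3):
        plane = rgb[:, :, c]
        mask = plane == 0
        filled = np.maximum(plane, np.roll(plane, 1, axis=0))
        filled = np.maximum(filled, np.roll(filled, 1, axis=1))
        rgb[:, :, c] = np.where(mask, filled, plane)
    return rgb
```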
  • the camera is a black and white camera and the color information is ignored.
  • FIG. 3 is a player's view of the shooting gallery. As shown, players stand behind the table and fire at the target. A display is shown above the players to identify which player is shooting and each player's score.
  • FIG. 4 is an alternate view of the shooting gallery showing a computer that is used as a shot record and a user interface.
  • a process flow for basic game play is illustrated.
  • a round of basic game play (also referred to herein as a shooting event) can be initiated when the first player to shoot presses start button 501 to begin their shooting turn.
  • the camera captures a continual baseline image frame 502 of the target from the continuous target camera video stream.
  • the target camera then watches for a significant change in the target 504 by comparing the current image with the previous baseline image 505 .
  • a shot will be recognized when there is a significant count of changed pixels 503 .
  • a shot will be recognized when more than 0.02% of the pixels change from image to image.
  • a video frame (i.e. a shot image) is captured when a shot is recognized.
  • the shot scoring method determines where the shot occurred on the target and updates the player score.
  • the one-second criterion is adjustable.
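The detection step above can be sketched as follows; the 0.02% changed-pixel threshold comes from the text, while the per-pixel intensity cutoff is an assumed parameter:

```python
import numpy as np

SHOT_THRESHOLD = 0.0002   # shot recognized when > 0.02% of pixels change (per the text)
PIXEL_DELTA = 30          # assumed per-pixel intensity change counted as "changed"

def shot_detected(baseline_frame, current_frame):
    """Compare the current frame with the baseline captured ~1 s earlier."""
    diff = np.abs(current_frame.astype(np.int32) - baseline_frame.astype(np.int32))
    changed = np.count_nonzero(diff > PIXEL_DELTA)
    return changed / diff.size > SHOT_THRESHOLD
```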
  • a shooting system can include numerous additional features that enrich the shooting experience, including various interactive competitive and social features.
  • a system can support addition of one or more players to a shooting event, such as one, two, three, four, five, six, seven, eight, nine, ten, or n players, where n can be any number, and the total number of players entered into a shooting event can be essentially unlimited.
  • each player can enter individual player information, such as a player's name or player identification, various demographic or other identifying information such as sex, age, street address, email address, social media account information, and the like.
  • Social media account information can include user IDs and passwords to facilitate integration of the system with a player's social media account, such as their Facebook, Instagram, Periscope, or YouTube account, to enable a player to request that the system publish images, video, or other information from the system to a social media feed associated with a social media account connected to the player's individual account.
  • a player's shooting session or a portion thereof may be live-streamed, such as via one or more of the player's social media accounts or via a channel provided by the system, in response to a user command, such as a prompt or request from a player or other operator.
  • Video of a player's shooting session or a portion thereof may be recorded and uploaded to the internet or posted to social media following a shooting session.
  • a player's individual player information can be associated with a guest account used for a single visit to a shooting facility, or a player's individual information can be retained by the system and associated with a player account that may be accessed and used by an individual player over multiple visits to a shooting facility or multiple shooting sessions during a visit.
  • the player account may include a player account user ID and password to uniquely identify and secure each player's account in the system.
  • a player account may be configured to record and store various historical information from completed shooting sessions for a player, such as number of shots fired, firearms and/or calibers used, target types used (including stationary and animated targets), overall shot accuracy, shot accuracy for particular firearms and/or calibers, shot accuracy for specific target types used, shot accuracy trends, and the like.
  • a system can also be configured so that a player's account information can be integrated into and/or available at a point-of-sale (POS) system, such that a player may be charged based on use, or eligible for various discounts based on use (including number of shooting sessions, or number of visits to shooting range), shooting performance, and the like.
  • POS point-of-sale
  • player information can be entered via a keyboard, computer terminal, or dedicated control box located at a players' stand 101 , or player information can be entered via a wirelessly connected device, such as a tablet provided by the system manager, or by a player's personal mobile device that is wirelessly connected to the system.
  • the system can comprise a mobile application that may be downloaded to a player's personal mobile device and be configured to interact with or control the system and the player's gameplay experience.
  • a system can also be configured with POS system integration to enable financial transactions to take place at a dedicated control box located at players' stand 101 or on a player's personal mobile device that is wirelessly connected to the system.
  • FIG. 6 is a block diagram that shows how a shot is recognized.
  • the camera records a target image at a frame rate of 60 frames per second.
  • a baseline target image is dynamically maintained at a predetermined interval (typically an image 1 second previously) relative to the current target image frame transmitted by the camera.
  • the average value for the x and y coordinates of the changed pixels identify the center of the shot on the target.
  • a scoring marker (such as a circle or square) is then fitted around the pixels with a significant change.
  • the scoring marker is a fixed size.
  • the most important value is the location of the shot center in reference to the score markings on the target 604 .
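Locating the shot centre as the average x and y coordinates of the changed pixels can be sketched as below (how the boolean change mask is constructed is an assumption carried over from the detection step):

```python
import numpy as np

def shot_center(changed_mask):
    """Return the (x, y) centre of changed pixels, or None if nothing changed."""
    ys, xs = np.nonzero(changed_mask)     # row/column indices of changed pixels
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())
```

A fixed-size scoring marker would then be drawn around this centre point on the game display.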
  • a scoring marker may comprise different colors or shapes. For example, a red marker may be used to indicate the location of the most recent shot, while yellow marks may be used to mark previous shots. Any color or combination of colors may be used.
  • a particular marker shape may be associated with a specific player. For example, a circular marker can be assigned to a first player, a square marker assigned to a second player, a triangular marker assigned to a third player, and a diamond-shaped marker assigned to a fourth player.
  • the game system can be used to toggle through screens displaying the target overlayed with markers corresponding to each player's shot, such that each player can view his or her shot grouping independently of the other players' shot markers.
  • a particular marker shape or color can be associated with a particular caliber.
  • FIG. 7 illustrates additional information as to how the digital camera recognizes a new hole position.
  • a bullet hole is shown inside a 1″×1″ area 701 , which is made during a shooting game.
  • the 1″×1″ area surrounding the hole is overlaid on top of a 10×10 grid representing camera resolution 702 of 10 pixels per inch.
  • the bullet hole may be identified with reference to a change in signal at individual camera pixels 703 ; some pixels are directly affected and change in light intensity 704 b because the hole (black dot) no longer reflects light. In this illustration, 13 pixels have light intensities that are directly changed to at least a small degree (i.e. the black dot touches them).
  • neighboring pixels 704 a are also changed in value due to being averaged with the directly affected pixels. In this illustration, 20 additional neighboring pixels are changed.
  • a shot is recognized because 33% of the pixels in the 1″×1″ area are recognized as having been changed, which is above the 25% threshold.
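The arithmetic of the illustration can be checked directly:

```python
# Worked check of the FIG. 7 illustration: 13 directly affected pixels plus
# 20 neighbours changed by the averaging filter, in a 10x10-pixel (1"x1") area.
directly_changed = 13
neighbor_changed = 20
area_pixels = 10 * 10
fraction = (directly_changed + neighbor_changed) / area_pixels
assert fraction == 0.33 and fraction > 0.25   # above the 25% threshold: a shot
```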
  • other approaches to image processing may likewise be used to recognize physical changes to a target in response to target penetration by a projectile. Any suitable method for image processing that has been previously developed or is hereafter developed may be used for shot recognition in accordance with the present disclosure.
  • any distortion between the camera and the target has to be corrected.
  • the distortion correction is accomplished by mapping each pixel from a (row, column) position to a (height, width) position.
  • FIGS. 9A and 9B illustrate how the image is distorted, which informs a methodology to adjust the camera image.
  • FIG. 9A shows how an image is distorted due to the position of the camera, and how the image will be seen by the camera.
  • the target 901 has a projected image plane 902 which is perpendicular to the camera 903 orientation. Evenly spaced dashed lines 904 at the target 901 edge pass through the projected image plane to the camera, which shows how the image is distorted due to the position and viewpoint of the camera.
  • the camera sees the projected image on the projected image plane 902 . This causes the lower portion of the target to be compressed in the camera view as seen on the projected image plane 902 .
  • FIG. 9B is a left side view of FIG. 9A .
  • the projected image plane 902 is shown across the width, and projected (dashed) lines show how the edges of the target image are projected to the camera.
  • the largest adjustment to the camera image is adjusting the vertical height of the image, particularly on the lower portion.
  • the height adjustment is not linear.
  • the horizontal dimension also receives corrections, and the distortion changes as a function of height.
  • one embodiment is to map out a grid of changes in a matrix format, based on the projected geometry, and then apply the change grid to the image. This establishes a variable scaling and re-positioning of each image pixel from its (row, column) to an (x, y) position.
  • the change grid can be established by graphic plotting at chosen grid points, such as every 2×2 inches, and then linearly interpolating between the chosen grid points to establish a corrected (x, y) position for each image pixel.
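A minimal sketch of the grid-and-interpolation idea, with a hypothetical correction table `grid_xy` holding the pre-computed corrected (x, y) for each coarse grid point (the table contents and grid spacing are assumptions; the patent derives them from the projected geometry):

```python
import numpy as np

def correct_position(row, col, grid_rows, grid_cols, grid_xy):
    """Bilinearly interpolate a corrected (x, y) for pixel (row, col).

    grid_xy[i, j] holds the corrected (x, y) for (grid_rows[i], grid_cols[j]).
    """
    # Locate the grid cell containing (row, col), clamping at the borders.
    i = np.clip(np.searchsorted(grid_rows, row) - 1, 0, len(grid_rows) - 2)
    j = np.clip(np.searchsorted(grid_cols, col) - 1, 0, len(grid_cols) - 2)
    t = (row - grid_rows[i]) / (grid_rows[i + 1] - grid_rows[i])
    u = (col - grid_cols[j]) / (grid_cols[j + 1] - grid_cols[j])
    p = ((1 - t) * (1 - u) * grid_xy[i, j] + (1 - t) * u * grid_xy[i, j + 1]
         + t * (1 - u) * grid_xy[i + 1, j] + t * u * grid_xy[i + 1, j + 1])
    return tuple(p)
```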
  • a grid/interpolating system can be effective as it involves basic matrix math and is relatively easy to understand.
  • 3D computer aided drafting (CAD) can be helpful in establishing projecting geometry.
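The grid-and-interpolation correction described above can be sketched as follows. This is a minimal illustration under assumptions, not the patented implementation: the function name `correct_position` is hypothetical, the corrected coordinates at each chosen grid point (e.g. every 2×2 inches) are assumed to be precomputed from the projected geometry, and bilinear interpolation is used between grid points.

```python
import numpy as np

def correct_position(row, col, grid_rows, grid_cols, grid_x, grid_y):
    """Map a camera pixel (row, col) to a corrected (x, y) target position
    by bilinear interpolation between precomputed grid points.

    grid_rows/grid_cols: pixel coordinates of the chosen grid lines.
    grid_x/grid_y: 2-D arrays of the corrected x and y coordinates at each
    grid point, established from the projected geometry (e.g. via 3D CAD)."""
    # Locate the grid cell containing the pixel.
    r = int(np.clip(np.searchsorted(grid_rows, row) - 1, 0, len(grid_rows) - 2))
    c = int(np.clip(np.searchsorted(grid_cols, col) - 1, 0, len(grid_cols) - 2))

    # Fractional position within the cell.
    fr = (row - grid_rows[r]) / (grid_rows[r + 1] - grid_rows[r])
    fc = (col - grid_cols[c]) / (grid_cols[c + 1] - grid_cols[c])

    def lerp2(g):
        # Interpolate across the top and bottom edges, then between them.
        top = g[r, c] * (1 - fc) + g[r, c + 1] * fc
        bot = g[r + 1, c] * (1 - fc) + g[r + 1, c + 1] * fc
        return top * (1 - fr) + bot * fr

    return lerp2(grid_x), lerp2(grid_y)
```

With an identity grid (corrected position equal to pixel position) the function returns the input unchanged, which is one way to sanity-check the interpolation before loading real calibration data.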
  • Another embodiment is to use analytic geometry to identify the intersection of a line with a plane.
  • the first point of the line is the camera position (x1, y1, z1) and the second point of the line is a position (x2, y2, z2) of an image pixel as taken by the camera. All of the camera image pixels are located in a plane perpendicular to the camera orientation.
  • the equation of the line defined by the camera and image pixel can be projected onto the target plane. This method can be used to establish a target position (x,y,z) for each image pixel (row, column), effectively correcting camera distortion.
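The analytic-geometry embodiment above reduces to a standard ray-plane intersection. The sketch below assumes the target plane is given by a point on it and its normal vector; the function name is hypothetical.

```python
import numpy as np

def project_pixel_to_target(camera_pos, pixel_pos, plane_point, plane_normal):
    """Project the line defined by the camera position (x1, y1, z1) and an
    image pixel position (x2, y2, z2) onto the target plane, returning the
    (x, y, z) intersection point on the target."""
    p0 = np.asarray(camera_pos, dtype=float)
    p1 = np.asarray(pixel_pos, dtype=float)
    d = p1 - p0                                  # direction of the ray
    n = np.asarray(plane_normal, dtype=float)
    denom = np.dot(n, d)
    if abs(denom) < 1e-12:
        raise ValueError("ray is parallel to the target plane")
    # Solve p0 + t*d on the plane: n . (p - plane_point) = 0
    t = np.dot(n, np.asarray(plane_point, dtype=float) - p0) / denom
    return p0 + t * d
```

Applying this to every (row, column) pixel yields a target position (x, y, z) per pixel, effectively correcting the camera distortion.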
  • the image correction can be verified by utilizing a target with easily recognizable shapes (circles, equilateral triangles, squares, crosses, etc.) to determine if an image that is taken by the camera is corrected satisfactorily.
  • the camera is located above the target and in close proximity to it. This means that the camera's distance to the target is no greater than the target's maximum width or maximum height. Since the camera is preferably located above the target, and out of the way of shooting, the camera resolution per inch will be greater for the top portion of the target and somewhat lower for the lower portion. This adjustment in scale must be accounted for in the shot placement. Typical view angles between the camera and the target are 0 to 60 degrees (as measured from the horizontal plane), but this is not a strict requirement.
  • each pixel is gray-scaled by averaging the pixel RGB values. Additionally, and to smooth out any target image pixels that might be incorrectly identified as a pixel change, each pixel is then averaged with its immediately surrounding eight pixels. If an image pixel is on the edge of the sensor, then the pixel image is averaged with the five surrounding image pixels. This creates an effective filter that smooths out image problems.
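The gray-scaling and neighborhood-averaging filter described above can be sketched as below. This is an assumed implementation: edge pixels are handled here by edge-replicating padding, which approximates (but does not exactly reproduce) the five-neighbor rule stated in the text, and the function name is hypothetical.

```python
import numpy as np

def grayscale_and_smooth(rgb):
    """Gray-scale an RGB frame by averaging the R, G, B values per pixel,
    then average each pixel with its surrounding neighbors to smooth out
    pixels that might be incorrectly identified as a pixel change."""
    gray = rgb.mean(axis=2)

    # Pad by repeating edge values so every pixel has a full 3x3 window,
    # then sum the nine shifted copies and divide. Interior pixels are
    # averaged with their eight neighbors, as described in the text.
    padded = np.pad(gray, 1, mode="edge")
    rows, cols = gray.shape
    smoothed = np.zeros_like(gray, dtype=float)
    for dr in range(3):
        for dc in range(3):
            smoothed += padded[dr:dr + rows, dc:dc + cols]
    return smoothed / 9.0
```

On a uniform frame the filter is the identity, which provides a quick check that no bias is introduced by the smoothing.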
  • a particular problem with shooting is vibration of the paper target due to the shot penetration.
  • the target can vibrate in the area of the shot causing the reflectivity of the target image to change immediately following the shot. This creates unpredictable pixel changes and potential shot misplacement.
  • the target may be allowed to recover for approximately 1 second before a scoring placement is made.
  • the frequency (i.e. 60 Hz) of the lighting on the target can cause difficulties with shot recognition.
  • the camera scanning frequency may match that of the lighting frequency and this can cause a target image shadow to be read as a changed target image frame. Consequently, a DC (direct current) based lighting system is preferred, such as a light emitting diode (LED) which is powered by a constant voltage power supply (DC).
  • the entire target is examined for a significant change, pixel by pixel. If a cluster of changes is detected in a 1″×1″ square surrounding the changed pixel, then a shot is recognized.
  • the threshold of determining a shot is a value that is empirically determined. During a test on a typical web-cam type camera with a CMOS sensor, a threshold of 25% was determined to be a good balance between sensitivity in detecting a shot and avoiding sensor noise which causes a false shot recognition. Other threshold values are possible based on the type of camera chosen and the amount of camera sensor noise. A camera with a low noise sensor, for example, will use a lower threshold that better identifies overlapping shots.
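A sketch of the pixel-by-pixel change scan with an empirically chosen threshold follows. The function name, the representation of the 1″×1″ square as a `window_px`-wide pixel window, and the minimum cluster size of 4 changed pixels are all assumptions for illustration; the 25% threshold matches the value reported above for a typical CMOS web-cam sensor.

```python
import numpy as np

def detect_shot(prev_frame, curr_frame, threshold=0.25, window_px=20):
    """Scan the whole target for a significant change, pixel by pixel.

    prev_frame/curr_frame: smoothed grayscale frames (0-255).
    threshold: fraction of full scale a pixel must change (25% default).
    window_px: width in pixels of the square corresponding to 1" x 1".
    Returns the (row, col) of the densest change cluster, or None."""
    diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    changed = diff > threshold * 255.0
    if not changed.any():
        return None

    # For each changed pixel, count changed pixels in the surrounding
    # window; recognize a shot at the densest cluster if it is big enough.
    best, best_count = None, 0
    half = window_px // 2
    for r, c in zip(*np.nonzero(changed)):
        win = changed[max(0, r - half):r + half + 1,
                      max(0, c - half):c + half + 1]
        count = int(win.sum())
        if count > best_count:
            best, best_count = (int(r), int(c)), count
    return best if best_count >= 4 else None
```

A lower-noise camera sensor would permit a lower `threshold`, which in turn better distinguishes overlapping shots.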
  • a typical camera that is useful for shot recognition will have 8-bit resolution and capture color images. Also, a camera with low sensor noise is helpful in minimizing the amount of filtering required. Shot recognition is improved with a low signal-to-noise ratio.
  • the camera could equally be a black-and-white camera that outputs a grayscale image. In this case, the grayscale image conversion is not necessary.
  • a system can comprise a projector configured to project high resolution video onto a display screen.
  • the display screen can comprise, for example, a disposable paper screen or a disposable screen constructed of another material suitable to display projected video and to physically register impact or penetration by a projectile.
  • a projector can be mounted to a motorized target trolley such as motorized target trolley 105 ( FIG. 1 ), for example at or approximately located in the position of camera 106 .
  • a display screen can comprise a target mounted to a metal frame connected to the motorized target trolley as described above with respect to target assembly 110 .
  • a display screen used for a projected video target can comprise a blank or plain target to facilitate visibility of the projected video image.
  • a projected video can comprise a projected video image target.
  • Projected video image targets can have any of a variety of target configurations, such as a circular bullseye, a human silhouette, a steel or reactive target, a bottle, a can, a bird, mammal or other animal, a clay pigeon, a zombie, a balloon, a saucer, and the like. Any type of projected image suitable for use as a shooting target may be used as a projected video image target in accordance with various embodiments of the present disclosure.
  • a projected video image target can be still or animated.
  • an animated projected video image target may be animated in response to a detected shot.
  • a projected video image target of a frangible target such as a clay pigeon may be shown to explosively fragment in the projected video in response to a registered shot corresponding to the location of the projected video image of the clay pigeon target (i.e., a “hit” target or scoring shot).
  • a projected video image of a steel reactive target may spin or flip in response to a scoring shot.
  • a projected video image of a game animal may respond in a realistic or lifelike fashion to a hit, including, for example, responding differently for a grazing shot and a “kill shot.”
  • an animated target may be timed, and a player's score for a shot may depend on the amount of time required to successfully hit the animated target.
  • a projected video image may comprise background scenery.
  • Background scenery may be animated.
  • Animated background scenery may comprise, for example, various settings corresponding to various natural or urban environments, such as a woodland environment, a grassland environment, an urban outdoor or street environment, an indoor urban environment, and the like.
  • a projected video target environment may enhance the realism of a shooting experience and may correspond to any of a variety of natural hunting environments or tactical environments.
  • a projected video target environment may challenge a shooter with various non-target features.
  • a hunting environment may present a shooter with various non-target game animals, such as distracting animals, does, or the like.
  • a tactical environment can present a shooter with non-target civilians, hostages, friendly forces, vehicles, and the like.
  • a projected video target environment may be configured to provide a shooter with a particular difficulty level.
  • the difficulty level may be selected from a range of difficulty levels.
  • Various factors such as target size, speed of target movement, and presence of non-target features in the projected video target environment may vary in response to different difficulty levels.
  • a player's score may likewise take into consideration the difficulty level of a particular projected video target and/or projected video target environment.
  • FIG. 8 shows a game display. Up to four players can shoot, and their individual shot scores (bold letters), along with a total for each player, are shown. The individual shot locations have been identified and marked with an individualized geometric marker, such as a triangle, square, hexagon, or rhombus. Other marker geometry could equally be used.
  • the shot can be scored based on either the distorted camera image or the corrected target image utilizing the target score markings. It is important that the score is accurate, relative to the location of the markings on a distorted or corrected image.
  • the position of the shot can be mapped based on a score mapping on the camera image. The row and column position of each pixel can be grouped and assigned to a particular score.
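One way to realize the score mapping above is a lookup built from the target's score markings. The sketch below works on corrected (x, y) positions with concentric scoring rings, which is an assumption for illustration; the same idea applies to grouping (row, column) pixels of the distorted camera image, and the function name is hypothetical.

```python
import math

def build_score_lookup(rings):
    """Build a score lookup from a ring table.

    rings: list of (max_radius_inches, score) pairs, innermost first,
    describing the target's score markings around the bullseye center.
    Returns a function mapping a corrected (x, y) position to a score."""
    def score_at(x, y):
        r = math.hypot(x, y)          # distance from the bullseye center
        for max_radius, score in rings:
            if r <= max_radius:
                return score
        return 0                      # outside all scoring rings
    return score_at
```

For example, `build_score_lookup([(1.0, 10), (2.0, 9), (3.0, 8)])` assigns 10 points inside a 1-inch radius, 9 out to 2 inches, and 8 out to 3 inches. In practice the mapping can be precomputed per pixel so each recognized shot position resolves to a score with a single array lookup.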
  • FIG. 10 shows communication flow between equipment components.
  • the camera 1002 views the target 1001 and is hardwired to a small dedicated single board computer 1003 which wirelessly communicates to a game computer 1004 which is hardwired to the game display 1005 .
  • a user interface 1005 (keyboard, button board) may be hard wired to game computer 1004 .
  • a user interface need not be hard wired to the game computer.
  • a user interface in accordance with various embodiments of the present disclosure can comprise a dedicated touchscreen display, a tablet, or a mobile device.
  • a dedicated touchscreen display may be hard wired or wirelessly connected to game computer 1004 .
  • the single board computer is generally conceived to include a CPU, both volatile and non-volatile memory, an operating system, onboard communications between distinct components, a wireless transmitter, and suitable software programming to execute non-transient computer instructions.
  • the single board computer could be selected from the portfolio of the Raspberry Pi single board computers as manufactured by the Raspberry Pi Foundation (United Kingdom).
  • FIG. 11 shows a shooting game, partway through game completion.
  • a player list 1101 on the left side shows the current player up to make the next shot.
  • the current player's total score 1102 is displayed.
  • a header text 1103 indicates the current player and current player's round.
  • a list of the current player's shots along with the score per shot is displayed in a list 1106 .
  • a current target image 1104 , along with display markers 1105 on current player's recognized and scored shots is shown. This information on the display is helpful for game clarity and to enhance the game competition by having feedback on any game status questions the players may have.
  • a shooting game system may be configured to receive data from a remote sensor.
  • Data from a remote sensor may include data regarding a player and/or a player's firearm.
  • a shooting game system may be configured to receive data from one or more wearable devices that may be worn by a player, such as a heart rate monitor, a photoplethysmograph, electroencephalography sensors, a respiration rate sensor, an accelerometer, an inertial measurement unit (IMU), a magnetometer, a gyroscope, and any other suitable microelectromechanical system or sensor that may be used to detect a physical or environmental condition.
  • a shooting game system may be configured to receive data from a sensor that may be mounted to a firearm.
  • PCT/US2016/013760 discloses a rail-mounted firearm remote sensor apparatus and methods and is herein incorporated by reference in its entirety.
  • a firearm-mounted sensor may be used to detect various events such as trigger squeeze, trigger break, firing, recoil, and the like, along with gun movement associated with such events and/or throughout the shot-taking process.
  • a shooting game system may comprise one or more video cameras directed at the player.
  • a video camera directed at the player may collect image data that may be analyzed to provide useful information surrounding a player's shot, such as information regarding stance, respiratory rate, body, arm and hand movements, eye tracking, gun barrel kinematics, and the like.
  • a video camera directed at the player may be used to perform eye tracking.
  • Eye tracking may be performed using eye tracking glasses comprising a camera, or other systems configured to perform corneal imaging using methods such as pupil corneal reflection.
  • a system with eye tracking may be used to measure a period of time in which a shooter's gaze is locked on a specific location or object, a visual phenomenon referred to as a “quiet eye” period, for example by Causer et al., 2010, Medicine and Science in Sports and Exercise, 42(8): 1599-1608, which article is incorporated herein by reference in its entirety.
  • a remote sensor used to perform eye tracking may be used to produce and record tracked eye movement data.
  • the system may be configured to compare tracked eye movement data to animated projected video image target movement to assess a shooter's visual acquisition and tracking of a moving target during the player's shot.
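The comparison of tracked eye movement data to target movement, and the measurement of a "quiet eye" period, can be sketched as below. Both function names, the equal-rate sampling of gaze and target positions, and the anchor-based definition of a gaze-lock run are assumptions for illustration, not the system's specified method.

```python
def tracking_fraction(gaze, target, tolerance):
    """Fraction of samples in which the tracked gaze position falls within
    `tolerance` of the moving target position. `gaze` and `target` are
    equal-length lists of (x, y) points sampled at the same instants."""
    hits = sum(1 for (gx, gy), (tx, ty) in zip(gaze, target)
               if (gx - tx) ** 2 + (gy - ty) ** 2 <= tolerance ** 2)
    return hits / len(gaze)

def quiet_eye_duration(gaze, dt, radius):
    """Longest continuous period (seconds) during which gaze stays within
    `radius` of the position where that period began -- a simple proxy for
    the 'quiet eye' gaze-lock interval. dt is the sample spacing."""
    best = run = 0
    anchor = None
    for gx, gy in gaze:
        if anchor is None or (gx - anchor[0]) ** 2 + (gy - anchor[1]) ** 2 > radius ** 2:
            anchor = (gx, gy)   # gaze moved: start a new candidate period
            run = 1
        else:
            run += 1            # gaze still locked near the anchor
        best = max(best, run)
    return best * dt
```

Such metrics could then be reported per shot alongside the scored target image.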
  • a firearm mounted remote sensor and/or a video camera and digital image processing system may be used to provide gun barrel kinematics data that may be compared to animated projected video image target movement to assess a shooter's firearm tracking movements relative to the target movements.
  • a video camera directed at the player may comprise a trigger-oriented camera.
  • a trigger-oriented camera may be configured to capture video images of a shooter's trigger finger or firearm grip including the area around the trigger.
  • a system may be configured to perform digital image processing to determine when a shot is taken based on video image data from a trigger-oriented camera.
  • remote sensor data may be analyzed in conjunction with a player's shooting performance data.
  • a shooter's shot accuracy may be correlated with respiratory rate and pattern data obtained from a respiration sensor.
  • a shooter's accuracy may be correlated with heart rate and pattern data obtained from a heart rate sensor.
  • data analysis may facilitate a player identifying a respiratory rate, or identification of timing of a shot relative to a respiratory pattern, that provides for enhanced shot accuracy.
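Correlating shot accuracy with physiological data, as described above, could use an ordinary Pearson correlation between per-shot scores and a sensor reading at the moment of each shot. The function below is a generic, self-contained sketch (not the system's specified analysis); a constant series is handled by returning 0.

```python
def pearson_r(xs, ys):
    """Pearson correlation between two equal-length data series, e.g.
    per-shot scores vs. respiratory rate (or heart rate) at each shot."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    if vx == 0 or vy == 0:
        return 0.0              # a constant series has no correlation
    return cov / (vx * vy) ** 0.5
```

A strongly negative correlation between respiratory rate and score, for instance, would suggest the player shoots better at a lower breathing rate, the kind of feedback the analysis is intended to surface.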
  • a firearm-mounted sensor may provide for detection of the timing of a shot with enhanced precision.
  • a firearm mounted sensor may be used to provide information regarding gun movement throughout a period of time that may include preparation for a shot, aiming, firing, and follow through.
  • Firearm movement data throughout a similar period of time may also be produced by digital analysis of video image data.
  • shot data and various types of data that may be obtained from remote sensors and/or from shooter-oriented video cameras may be integrated and analyzed to provide valuable feedback to a shooter that may be used to make adjustments to shooting technique to achieve enhanced accuracy.
  • FIG. 12 is a block diagram illustrating a shooting system 1200 in accordance with various embodiments.
  • Shooting system 1200 can comprise remote sensor subsystem 1210 and data processing and display subsystem 1220 .
  • Remote sensor subsystem 1210 can comprise one or more remote sensors 1211 , a data recorder 1212 , and a data transmitter 1213 .
  • Data recorder 1212 can receive and record sensor data from one or more sensors 1211 .
  • Data recorder 1212 can be configured to send sensor data to data transmitter 1213 for transmission to data processing and display subsystem 1220 .
  • Data processing and display subsystem 1220 can be located at a distance from the firearm. Transmission of data from sensor subsystem 1210 to data processing and display subsystem 1220 can be via a wired connection or a wireless communications link.
  • Data processing and display subsystem 1220 can comprise data receiver 1221 , data storage module 1222 , data analysis module 1223 , and user interface 1224 .
  • Data receiver 1221 receives sensor data from data transmitter 1213 via the wired connection or wireless communications link, and stores the received sensor data in data storage module 1222 .
  • Sensor data from data receiver 1221 and data storage module 1222 can be processed and analyzed by data analysis module 1223 .
  • Target images, raw data and the results of data analysis can be displayed to a user via user interface 1224 .
  • User interface 1224 can comprise a game display configured to provide information to a player.
  • Data can be stored locally in sensor subsystem 1210 and/or in data processing and display subsystem 1220 . Data can be collected and aggregated for a series of shots. In some embodiments, aggregate data can be used for determining average scores and/or for establishing a trend.
  • a shooting system can comprise a plurality of remote sensor subsystems.
  • a shooting system can comprise a first remote sensor subsystem, a second remote sensor subsystem, a third remote sensor subsystem, a fourth remote sensor subsystem, and an nth remote sensor subsystem.
  • Each remote sensor subsystem may be in electronic communication with a data processing and display subsystem.
  • a first remote sensor subsystem can comprise a sensor subsystem located on a firearm
  • a second remote sensor subsystem can comprise a photoplethysmograph
  • a third remote sensor subsystem can comprise a respiration monitor
  • a fourth remote sensor subsystem can comprise an eye tracking system. Any number and combination of remote sensors may be used in a system in accordance with various embodiments.
  • a firearm-mounted sensor subsystem such as subsystem 1210 may be either integral to the firearm or attached to the firearm.
  • sensor subsystem 1210 can be attached at a suitable location on the firearm including, but not limited to, the rail, the slide, the trigger guard, the magazine, the barrel, or the stock.
  • Sensor subsystem 1210 can comprise one or more sensors 1211 .
  • Sensors 1211 can comprise motion-tracking devices including, but not limited to, one or more from the following list: a laser, a magnetometer, an inertial measurement unit, an accelerometer, and a gyroscope.
  • a method for integrating remote sensor data into a shooting system process is provided.
  • a method for integrating remote sensor data 1300 is illustrated in FIG. 13 .
  • Method 1300 can comprise a system initiation step (step 1310 ).
  • Method 1300 can then comprise initiation of a remote sensor (step 1320 ).
  • a system may then begin data collection (step 1330 ) to produce collected sensor data.
  • Method 1300 can comprise a method for detection of a fired shot (step 1340 ) to identify a fired shot event.
  • detection of a fired shot may be performed by target image processing or by data from another remote sensor.
  • Method 1300 can comprise a step of correlation of collected data with detection of a fired shot by the system in step 1350 .
  • a system may be configured with buffered data storage for data from one or more remote sensors. Detection of a fired shot in step 1340 and correlation of collected data with the fired shot event in step 1350 may comprise demarcating buffered data for a specified period before and after the fired shot event to produce correlated sensor data and uncorrelated sensor data. In this manner, collected data not associated with a fired shot event may be discarded by the system in step 1360 to conserve data storage space for valuable data associated with shots fired by players.
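The buffered-storage scheme for steps 1340-1360 can be sketched with a simple ring buffer: samples age out automatically, and on a fired-shot event the window around the shot is extracted as correlated data. The class name, the default 2-second pre-shot and 1-second post-shot window, and the timestamped-sample format are assumptions for illustration.

```python
from collections import deque

class ShotDataBuffer:
    """Ring buffer for remote-sensor samples. When a fired shot is
    detected, samples within a window before and after the shot are kept
    as correlated data; everything else simply ages out of the buffer,
    discarding data not associated with a fired shot event."""

    def __init__(self, pre_s=2.0, post_s=1.0, max_samples=10000):
        self.pre_s, self.post_s = pre_s, post_s
        self.buffer = deque(maxlen=max_samples)   # (timestamp, sample)

    def add_sample(self, timestamp, sample):
        """Record one timestamped sensor sample (step 1330)."""
        self.buffer.append((timestamp, sample))

    def correlate(self, shot_time):
        """Demarcate samples in [shot_time - pre_s, shot_time + post_s]
        as correlated sensor data for the fired shot event (step 1350)."""
        return [(t, s) for (t, s) in self.buffer
                if shot_time - self.pre_s <= t <= shot_time + self.post_s]
```

The fixed `maxlen` bounds memory use, so no explicit deletion step is needed for uncorrelated data.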
  • Method 1300 may further comprise displaying correlated sensor data associated with one or more fired shot events with target image data (step 1370 ), for example, in response to a player input or request.
  • Data displayed by a system comprising remote sensors can include, for example, video image of the shooter as a shot is fired 1401 , gun barrel kinematic data 1402 , target image data 1403 , eye movement data 1404 , heart rate data 1405 , respiration rate data 1406 , electroencephalography data 1407 , and game score information 1408 .
  • Any collected data can be displayed in any suitable arrangement in accordance with various embodiments.
  • a system can comprise at least one digital camera system which may utilize an image sensor such as a complementary metal-oxide-semiconductor (CMOS) or CCD, and the necessary electronic circuitry to transmit the captured images for further image processing.
  • a digital camera system can comprise a digital video camera system.
  • Alternate embodiments for the single board or game computer may contain a Graphics Processing Unit (GPU) with methods to split the computational workload between the CPU and GPU.
  • the described embodiments include a user interface that allows the user to input user commands.
  • a user interface can comprise a keypad, keyboard, microphone, accelerometers, or touchscreen.
  • Each embodiment may contain an internal battery, accept interchangeable batteries, or receive power from an outside source such as mains power.
  • Each embodiment may contain a wired or wireless network interface to enable communication to and from external devices.
  • a visual output device such as a monitor or touchscreen may be included within any embodiment.
  • Digital cameras operate by recording light incident upon their sensors. There are many types of acceptable cameras with suitable image resolution to identify a new hole made in the target.
  • the conceived digital camera is preferably directly connected to the single board computer via technologies such as Universal Serial Bus (USB), FireWire, and ethernet. Wireless transmissions from the single board computer include standards such as Bluetooth, WiFi, and cellular networks.
  • the digital camera may also communicate via a Serial Interface or a Parallel Interface.
  • the terms single board computer and computer system are intended to refer to a computer-related entity comprising hardware, a combination of hardware and software, software, or software in execution capable of performing the embodiments described.
  • the disclosed embodiments that use the single board computer are interfaced to and controlled by a computer readable storage medium having a computer program stored thereon.
  • the computer readable storage medium may include a plurality of components such as one or more of electronic components, hardware components, and/or computer software components. These components may include one or more computer readable storage media that generally store instructions such as software, firmware and/or assembly language for performing one or more portions of one or more implementations or embodiments of an algorithm as discussed herein. These computer readable storage media are generally non-transitory and/or tangible.
  • Examples of such a computer readable storage medium include a recordable data storage medium of a computer and/or storage device.
  • the computer readable storage media may employ, for example, one or more of a magnetic, electrical, and/or optical data storage medium. Further, such media may take the form of, for example, floppy disks, magnetic tapes, CD-ROMs, DVD-ROMs, hard disk drives, micro SD cards, standard SD cards, and/or solid-state or electronic memory. Other forms of non-transitory and/or tangible computer readable storage media not listed may be employed with the disclosed embodiments.
  • Such components can be combined or divided in an implementation of a computer system. Further, such components may include a set and/or series of computer instructions written in or implemented with any of a number of programming languages, as will be appreciated by those skilled in the art. Computer instructions are executed by at least one central processing unit. In addition, other forms of computer readable media such as a carrier wave may be employed to embody a computer data signal representing a sequence of instructions that when executed by one or more computers causes the one or more computers to perform one or more portions of one or more implementations or embodiments of a sequence. Computer instructions for various components of data processing performed by a system disclosed herein may be performed on a local server or may be performed on a cloud-based server.
  • network includes any cloud, cloud computing system or electronic communications system or method which incorporates hardware and/or software components. Communication may be accomplished through any suitable communication channels, such as, for example, a telephone network, an extranet, an intranet, Internet, point of interaction device (point of sale device, personal digital assistant (e.g., mobile device, kiosk, etc.), online communications, satellite communications, off line communications, wireless communications, transponder communications, local area network (LAN), wide area network (WAN), virtual private network (VPN), networked or linked devices, keyboard, mouse and/or any suitable communication or data input modality.
  • system may be implemented with TCP/IP communications protocols
  • system may also be implemented using IPX, Appletalk, IP-6, NetBIOS, OSI, any tunneling protocol (e.g. IPsec, SSH), or any number of existing or future protocols.
  • “Cloud” or “cloud computing” includes a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. Cloud computing may include location-independent computing, whereby shared servers provide resources, software, and data to computers and other devices on demand. For more information regarding cloud computing, see the NIST (National Institute of Standards and Technology) definition of cloud computing at http://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecialpublication800-145.pdf (last visited Dec. 4, 2018), which is hereby incorporated by reference in its entirety.
  • components, modules, and/or engines of a system as described herein may be implemented as micro-applications or micro-apps.
  • Micro-apps are typically deployed in the context of a mobile operating system, including for example, a Windows mobile operating system, an Android Operating System, Apple iOS, a Blackberry operating system and the like.
  • the micro-app may be configured to leverage the resources of the larger operating system and associated hardware via a set of predetermined rules which govern the operations of various operating systems and hardware resources. For example, where a micro-app desires to communicate with a device or network other than the mobile device or mobile operating system, the micro-app may leverage the communication protocol of the operating system and associated device hardware under the predetermined rules of the mobile operating system.
  • the micro-app may be configured to request a response from the operating system which monitors various hardware components and then communicates a detected input from the hardware to the micro-app.
  • a micro-app may be made available as a service.
  • system 1100 may employ various integrated circuit components, e.g., memory elements, processing elements, logic elements, look-up tables, and/or the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices.
  • system 1100 may be implemented with any programming or scripting language such as C, C++, Java, COBOL, assembler, PERL, Visual Basic, SQL Stored Procedures, extensible markup language (XML), with the various algorithms being implemented with any combination of data structures, objects, processes, routines or other programming elements.
  • system 1100 may employ any number of conventional techniques for data transmission, signaling, data processing, network control, and/or the like.
  • These software elements may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions that execute on the computer or other programmable data processing apparatus create means for implementing the functions specified in the flowchart block or blocks.
  • These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flow chart block or blocks.
  • the computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.

Abstract

The embodied invention is a method and equipment suitable for a shooting game with dynamic shot recognition and automatic scoring among multiple players firing at the same target. Each player's shot is scored based on a difference in the target's image from a prior image as viewed by a camera. The scoring target is aligned with the camera, and the output of the score change is displayed to the multiple shooters.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This Continuation-In-Part Application claims the benefit of and priority to U.S. Provisional Application Ser. No. 62/493,100, entitled “Shooting Game for Multiple Players with Dynamic Shot Position Recognition on a Paper Target,” filed Jun. 22, 2016, and U.S. patent application Ser. No. 15/474,874, entitled “Shooting Game for Multiple Players with Dynamic Shot Position Recognition on a Paper Target,” filed Mar. 30, 2017. The entire disclosures of the aforementioned applications are incorporated herein by reference for any purpose.
  • FIELD
  • The present disclosure is directed to systems and methods of collecting and analyzing data related to firearms marksmanship. In various embodiments, a shooting game system capable of collecting and analyzing shot data for multiple players shooting at the same target in a competitive setting is provided. The system and methods described herein provide that a score for each player is automatically updated when each player takes his turn.
  • BACKGROUND
  • Shooting at targets dates back to antiquity.
  • In modern times, competitive rifle/pistol scoring is commonly done by shooting at a paper-based target with suitable markings for scoring. The shooter's score is determined based on the position of holes made in the target relative to the scoring markings. An accurate result is determined when the target is closely examined.
  • Others have worked in the field to improve the scoring. For example, there are shooting scoring apps (i.e. Target Scan app for iOS) where a photographed (or scanned) paper target is examined for the location of the shots and the total score is determined electronically. To improve the accuracy of the scoring, a lighted background (or white background paper) is added behind the target to provide visual contrast between the target and the openings created by a shot. The system then distinguishes the center of a shot from the area weighted geometry of the hole. The software can have difficulty recognizing a shot accurately, and a manual option is given to the user to correct or place a shot to be scored.
  • Similarly, CN1347040 also describes a scoring system in which a target with bullet holes is analyzed for scoring. However, it does not disclose how a shot is located in the camera image frame or how a score is determined.
  • US Patent Publication No. 2014/0106311 describes a shooting training system in which a shot is displayed to the shooter by alternating views of the current target versus an image of the target captured before the latest shot. This system only captures images; it does not generate an automatic score, and it does not determine a shot location in any captured camera image.
  • There are problems with this type of scoring system. In a shooting competition, it can take an undesirable amount of time to determine a score for a shooter versus other competing shooters. Multiple targets have to be retrieved and scanned, and the results have to be tabulated manually for each player. A target-scanning type of scoring system does not lend itself to instant updates of a shooter's score. Such delays in retrieving a score dampen the sense of competition among the shooters. Also, the scanning systems cannot separate the scores of multiple shooters on the same target.
  • Similarly, US Patent Publication No. 2010/0178967 and U.S. Pat. No. 4,898,391 describe a shooting game with a target and a gun that sends a beam of light to a game console for scoring against the target. Unfortunately, this type of scoring system does not use a gun which fires real bullets and is a less satisfying game to play.
  • Currently, during a shooting competition match, the paper target is often at a significant distance and binoculars or other visual aids must be used to estimate the current score. The end result is that the exact score is difficult to determine until the match is over. US Patent Publication No. 2014/0106311 describes a method whereby the target is monitored by a remote camera and the target image is sent back to a player. However, this system does not provide any automated scoring.
  • Shooting in a multi-player competition often requires a separate shooting lane for each player, and this can be expensive, particularly in an indoor shooting situation. Also, each player is not able to watch the other player shoot.
  • It is possible for multiple shooters to compete in a single lane and have each player shoot at their target in sequence. However, this is less desirable in a competitive shooting situation as the target must be retrieved for each player and scored separately. A new target has to be placed (manually or automatically) at the shooting distance. These types of delays diminish the competitive environment due to the loss of playing momentum.
  • What is needed is an instant type of scoring system where multiple players shoot at the same target in a competition in a way which adds to the feeling of competitive tension in the game. The current art lacks this important feature. It is preferable that a shot-by-shot competition be created where each incremental score is shown to all of the shooters, thereby building game tension. The tension increases as the game progresses and may be very high for the last two or three shots. This can lead to a very satisfying competition and elated feelings for the victor, or victorious team.
  • SUMMARY
  • The embodied invention is a method and equipment suitable for a shooting game with dynamic shot recognition and automatic scoring among multiple players firing at the same target. Each player's shot is scored based on a difference in the target's image from a prior image as viewed by a camera. The scoring target is aligned with the camera, and the output of the score change is displayed to the multiple shooters.
  • Important game enhancements include a dynamic update of the reference target image to follow multiple shot holes. When a significant change is detected from a reference target image, a shot event is recognized and the area of change identified for the placement of the shot. The shot score is then accumulated in a display that is viewable by all players.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows a shooting gallery lane designed for the multi-player target game system.
  • FIG. 2 shows a detail of FIG. 1.
  • FIG. 3 is a player's view of the shooting gallery.
  • FIG. 4 is a simplified profile view of the shooting gallery.
  • FIGS. 5 and 6 are block diagrams for how the latest shot is recognized and the score is determined.
  • FIG. 7 illustrates how the camera pixel sensors and an averaging filter are used to identify a shot location.
  • FIG. 8 is a game display showing the players their score and shot positions.
  • FIGS. 9A and 9B illustrate how the target image distortion is corrected when a camera is located above the target.
  • FIG. 10 shows communication flow between equipment components.
  • FIG. 11 shows a typical game display partway through the game showing additional features.
  • FIG. 12 illustrates a shooting system in accordance with various embodiments.
  • FIG. 13 illustrates a method in accordance with various embodiments.
  • FIG. 14 illustrates a game display with remote sensor data in accordance with various embodiments.
  • DETAILED DESCRIPTION
  • The present disclosure generally relates to gamified firearms marksmanship, and more particularly, to systems and methods for providing a shooting game to firearms users for various purposes such as entertainment, competition, and skill development. The detailed description of various embodiments herein makes reference to the accompanying drawings, which show the exemplary embodiments by way of illustration. While these exemplary embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, it should be understood that other embodiments may be realized and that logical and mechanical changes may be made without departing from the spirit and scope of the disclosure. Thus, the detailed description herein is presented for purposes of illustration only and not of limitation. For example, the steps recited in any of the method or process descriptions may be executed in any order and are not limited to the order presented. Moreover, any of the functions or steps may be outsourced to or performed by one or more third parties. Furthermore, any reference to singular includes plural embodiments, and any reference to more than one component may include a singular embodiment.
  • Systems, methods and computer program products are provided. In the detailed description herein, references to “various embodiments”, “one embodiment”, “an embodiment”, etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to affect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described. After reading the description, it will be apparent to one skilled in the relevant art(s) how to implement the disclosure in alternative embodiments.
  • As used herein, terms such as “transmit,” “communicate” and/or “deliver” may include sending electronic data from one system component to another over a network connection. Additionally, as used herein, “data” may include information such as commands, queries, files, data for storage, and the like in digital or any other form.
  • In various embodiments of the present disclosure, a shooting game is provided. A shooting game can comprise a competition among different players (also referred to herein as shooters) who shoot at the same target. The shooting game system can comprise a camera, such as a video camera, in electronic communication with a shooting game system computer. The terms “camera” and “video camera” may be used interchangeably herein. The video camera may be configured to monitor the target and relay target image data to the shooting game system. The shooting game system may be configured to relay the target image to a screen or electronic display that is viewable by all players. The players, in turn according to a defined sequence of players, shoot at the target and the shot hole is automatically recognized by the shooting game system based on digital image processing of target image data to identify a significant change in the target image from shot to shot. The shot placement is identified and a digital display marker is placed over the shot hole on the target image. The digital image data may also be automatically analyzed to determine the score of the shot. The current shooter's overall score is updated and displayed on a game screen. When the first player completes his/her turn, the next player in the sequence of players becomes the shooter.
  • In various embodiments, a shooting game can comprise two to four players per game. However, a shooting game can comprise any suitable number of players, and the number of players may be user-selectable based on a setup that is input into the game system computer via a user interface. Identifying information may be input for each player participating in a shooting game, and the player sequence may likewise be determined by a player or other user.
  • In various embodiments, a player or other user can input a firearm type and/or a firearm caliber to be used by a player in the game system computer. In various embodiments, the firearm type and/or firearm caliber may be selected from a predetermined menu of firearm types and/or calibers. Input of a firearm type and/or firearm caliber into the game system computer may facilitate target image data processing, such as shot registration and scoring, by the game system computer. For example, a game system computer may select an appropriate shot registration and scoring algorithm in response to input of a particular caliber by a user. In various embodiments, a game system computer may determine a firearm caliber used by a player based on target image data processing.
  • A shooting game can be configured to provide each shooter with a predefined number of shots per game, for example, 5, 7, 10, 14, or 20 shots per player per game. In various embodiments, the number of shots per player per game can be selected from a predetermined range, such as between 1 and 21 shots per game.
  • In various embodiments, a player's turn can comprise a single shot, or a turn can comprise multiple shots. The number of shots to be taken by a player in a single turn may be input by a user or operator or may be selected from a predetermined range of shot numbers.
  • In various embodiments, a player can provide an input to the game computer to switch players, such as by pressing a button or otherwise providing an input to the game system computer to switch the scoring to the next player. In various embodiments, the game system computer may automatically switch to the next player in the sequence of players following detection and scoring of a player's shot. The game system computer may display the identifying information for the next player to prompt the next player in a sequence of players to take his or her turn. In various embodiments, the game system may be configured to clear marked shots upon a switch to a new player, particularly in embodiments of game play in which each player takes multiple shots in a turn.
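  • By way of illustration only, and not as a limitation of the claimed sequencing, the turn rotation described above can be sketched in a few lines of Python; the player names and parameter values are hypothetical:

```python
from itertools import cycle

def turn_sequence(players, shots_per_turn, shots_per_game):
    """Yield (player, shot_number) pairs for a simple turn rotation.

    Each player fires `shots_per_turn` shots before play passes to the
    next player in the sequence, until every player has fired
    `shots_per_game` shots in total.
    """
    fired = {p: 0 for p in players}
    for p in cycle(players):
        if all(n >= shots_per_game for n in fired.values()):
            return
        for _ in range(shots_per_turn):
            if fired[p] < shots_per_game:
                fired[p] += 1
                yield p, fired[p]

# Two hypothetical players, one shot per turn, two shots per game:
# play alternates Ana/Ben until each has fired both shots.
order = list(turn_sequence(["Ana", "Ben"], shots_per_turn=1, shots_per_game=2))
```

In a multi-shot-per-turn configuration, `shots_per_turn` would simply be raised, and the scoring display would clear the previous player's marked shots on each switch, as described above.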
  • In various embodiments, a game system can comprise a target mounted inside a metal frame. A metal frame may comprise certain structural features that can serve as reference points for optical calibration of a digital camera and/or a shooting system. The frame may be designed to fit the target tightly so that the alignment and position relative to the camera is substantially maintained and a set up calibration is not needed when a new game is started. In various embodiments, a target may be attached to a metal frame by an interference fit, such as by placing the margins of the target between components of the metal frame configured to connect with one another by an interference fit or other mechanical connection. In various embodiments, a target can comprise a paper sheet or a sheet of cardboard material. A target can comprise a sheet of non-paper material, such as polymer material, including various natural or synthetic polymers. In various embodiments, a polymer target material may be a self-healing polymer.
  • In various embodiments, a camera may be positioned in front of the target, and above it, so that the camera is suitably positioned to obtain an image of the target. The alignment between the target and the camera may be established at the beginning of the game in response to a user-selected target position, or the alignment may be previously established. Preferably, the camera is kept at a fixed distance and position from the target to simplify the setup and the accuracy of the camera's image. In various embodiments, a camera may be mounted to the motorized target trolley. Mounting a camera to the motorized target trolley may facilitate maintaining a fixed distance and position with respect to the target. In various embodiments, a shooting lane can comprise more than one target-oriented camera. For example, in various embodiments, cameras may be mounted in fixed positions suitable to monitor targets set at 10 ft, 25 ft, and 50 ft from the shooting position. In various embodiments, a shooting lane can comprise one or more fixed-position cameras and a camera mounted to the motorized target trolley. In embodiments comprising a fixed-position camera and a camera mounted to the motorized target trolley, each camera may be used to capture target images. Use of multiple cameras to capture target images may provide for more robust image data and enhanced shot detection accuracy.
  • In various embodiments, a camera may be housed in a protective housing. A protective housing may be configured to protect a camera from impact or penetration by an errant bullet fired by a player. For example, a protective housing may comprise plate steel configured to enclose or shield a camera and protect the camera from direct impacts and/or from the effects of a bullet impact with the housing. In various embodiments, such as for fixed-placement cameras, a protective housing may comprise an angled metal baffle mounted to a structure such as the shooting lane ceiling and configured to deflect a bullet downrange from the shooter. In embodiments comprising a camera mounted to a motorized target trolley, a protective housing can comprise a steel housing configured with an angled surface to similarly deflect a bullet downrange while minimizing the energy transferred to the camera. In various embodiments, a protective housing may be vibrationally isolated from a camera protected by the housing.
  • The camera, and automatic image post processing, must be able to recognize when a shot is made and score it accurately. This is done by continuously monitoring the output of the target camera and observing when there is a significant image pixel change.
  • The ability of the camera to recognize a shot depends partly upon the signal-to-noise ratio. Modern digital image sensors in cameras have known errors that create signal noise. In general, the total noise is dependent on the noise factor, background, readout noise, and EM gain. The sensor noise must be filtered out in order to recognize a shot.
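  • As an illustration of how the noise terms just listed combine, the per-pixel signal-to-noise ratio can be estimated as below. The functional form is a common EM-CCD noise model and is an assumption of this sketch, not taken from the disclosure; all values are hypothetical electron counts.

```python
import math

def pixel_snr(signal_e, background_e, read_noise_e, em_gain=1.0, noise_factor=1.0):
    """Estimate a single pixel's signal-to-noise ratio (illustrative).

    Shot noise on the signal and background (in electrons) is inflated
    by the excess noise factor and EM gain, then combined in quadrature
    with the readout noise.
    """
    total_noise = math.sqrt(
        (noise_factor ** 2) * (em_gain ** 2) * (signal_e + background_e)
        + read_noise_e ** 2
    )
    return em_gain * signal_e / total_noise

# A dim pixel dominated by read noise benefits from EM gain.
low = pixel_snr(signal_e=20, background_e=5, read_noise_e=10)
high = pixel_snr(signal_e=20, background_e=5, read_noise_e=10,
                 em_gain=50, noise_factor=1.41)
```

Whatever the exact sensor model, the practical point is the same: changes smaller than the noise floor must be filtered out before a pixel change can be attributed to a shot.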
  • In various embodiments, a single board computer is used to interface between the camera and the scoring display. The single board computer is suitably programmed to perform the automatic scoring functions and display any scoring on the game display.
  • The game display may be any type of display that may be used to display game information. A game display can include, for example, a high definition television or monitor. In various embodiments, a game display may also comprise a mobile device such as a tablet or mobile phone. A game display may be electronically connected to the system via a wired connection or a wireless communication link.
  • At the end of a game round, the total score for each shooter is automatically computed and displayed under each player's name. The game system may be configured to automatically identify a winner based on comparison of each player's score. In various embodiments, the game system may announce each player's score and/or the winning player via an audio announcement.
  • In accordance with various embodiments, target image acquisition is performed using a camera. Pixel changes in a target image due to a bullet impact in an image frame corresponding to the target area are used to identify a shot. The image from the camera is continuously monitored to recognize the location of the shot and score it accurately.
  • To aid in understanding the game, FIG. 1 shows significant elements of the game setup. A players' stand 101 has a display 102 which displays multiple players on the screen. The output of the wireless camera displays the current target image on display 102. Each player's score, the target, and marks of their shot positions are also shown on the scoring display. A keyboard (or a dedicated control box) may include next player, previous player, display, game start, game setup, and game end buttons or selections. Also, a setup screen may be provided to allow players or operators to select certain game parameters such as the target type, target size, target distance, number of players per game, the ammunition type, and the number of shots per player.
  • In various embodiments, a target type and/or target size may be selected from a range of available target types and/or target sizes.
  • The target assembly 115 is shown in FIG. 1. The target 103 is housed inside a metal frame 107 and removably connected to the motorized target trolley 105 through a mounting clip 108. The target 103 and metal frame 107 may be at least partially vibrationally isolated from mounting clip 108 and/or trolley 105 by integration of a vibration dampening material or mechanism, such as a mechanism integrated into the mounting clip 108, between mounting clip 108 and metal frame 107, or between mounting clip 108 and trolley 105. The motorized target positioner moves on a rail 104 to set the target at the correct distance from the player. The target camera 106 is attached to the target positioner with a good and clear view of the target. In various embodiments, a projector can be co-located on the target positioner with camera 106.
  • Motorized target trolley 105 can be configured to be positioned on rail 104 at various user-selectable preset distances, such as 10 yards, 25 yards, and 50 yards. In various embodiments, target distances need not be preset, and motorized target trolley 105 can be configured to be positioned at any distance from the shooting position compatible with the physical parameters of the rail and shooting lane.
  • A target frame attaching clip 108 connects the target frame 107 with the target trolley 105. The trolley rides on the rail 104 and travels to the charging station 109 during the normal course of shooting. An additional (optional) overhead game display 110 provides game status to the players. The charging station is connected via communication cables 1120 to the game displays (FIG. 2). The target trolley is battery operated, and charges at the charging station through contacts 113a,b (FIGS. 1 and 2). LED lighting 114 is used to illuminate the target.
  • When a shot hits the target 103 (or inadvertently hits the metal target frame 107), significant motion of the target can occur due to the impact, causing a local area to move backward. The bullet impact can also cause the target to shift in position relative to the camera, with vibration or motion. This vibration can cause a large enough change for a group of pixels to falsely report an area where a shot has penetrated the target. The effect can also be subtle: the impact of the bullet can change the reflectivity of the target surface, again producing a large enough change for a group of pixels to falsely report a shot.
  • To minimize the effect of the target vibration and reflectivity changes, the interval at which the target image is analyzed can be lengthened to allow the target and camera vibration to dissipate. An interval of 1 second has been found to be optimal in improving the reliability and accuracy of determining the shot location.
  • In various embodiments, a vibration dampening mechanism such as that described above may be incorporated into the system and may serve to reduce the vibrational or movement-induced reflectivity changes created by the impact of a bullet with a target or frame.
  • The camera view of the target is distorted as the angle of the target is not perpendicular to the camera view. The angle depends upon the position of the camera, which is preferably above the target. This is not a concern as to identifying the shot, but is important in scoring each shot correctly. The lower edge of the target is narrower than the top. The result is that there are fewer pixels per inch at the bottom edge of the target than at the top.
  • To facilitate improved reliability and to simplify the equipment needs, the wireless camera is powered by a rechargeable battery. The battery connects to a recharging station when the target positioning assembly is moved to the players' stand. In a preferred embodiment, the motorized target positioner is also battery driven.
  • The target image is preferably taken from the camera 106 with a 1080p resolution. The sensor in the camera is a CCD type or a CMOS type. Typically, to obtain a color image, cameras use a Bayer color filter array. The Bayer color filter array includes red, green, and blue light filters in a mosaic grid pattern in front of the individual camera pixel sensors. Typically, the Bayer 2×2 pixel filter grid comprises a green and red color filter in the first row, and then a blue and green color filter in the second row. This 2×2 filter grid is repeated over the entire camera image sensor.
  • Since each camera pixel sensor only registers the light intensity of one color, the intensities of the other two colors at that sensor pixel are not known and constitute missing information. To create an image file with fine detail, the other two colors at each of the four camera sensor pixels in the 2×2 grid are interpolated using a de-mosaicing algorithm. This can be done in the camera or in a post-processing algorithm.
  • Typical de-mosaicing algorithms include copying the colors from neighboring sensor pixels, averaging different colors from nearby sensor pixels, or using linear interpolation of nearby colors. The goal is to reasonably estimate all three colors at each camera sensor pixel.
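  • The simplest of the strategies above, copying colors within each 2×2 Bayer cell of the green–red/blue–green pattern described, can be sketched as follows; the array layout and sample values are illustrative only:

```python
def demosaic_nearest(raw, h, w):
    """Nearest-neighbor de-mosaic of a GRBG Bayer mosaic (a sketch).

    `raw` is a row-major list of h*w single-channel sensor values laid
    out as G R / B G 2x2 cells; h and w are assumed even. Each output
    pixel copies its missing colors from within its own 2x2 cell.
    Returns a h x w grid of (R, G, B) tuples.
    """
    rgb = [[None] * w for _ in range(h)]
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            g0 = raw[y * w + x]           # green (even row, even col)
            r = raw[y * w + x + 1]        # red   (even row, odd col)
            b = raw[(y + 1) * w + x]      # blue  (odd row, even col)
            g1 = raw[(y + 1) * w + x + 1]  # green (odd row, odd col)
            rgb[y][x] = (r, g0, b)
            rgb[y][x + 1] = (r, g0, b)
            rgb[y + 1][x] = (r, g1, b)
            rgb[y + 1][x + 1] = (r, g1, b)
    return rgb

# One full 2x2 Bayer cell: G=100, R=200, B=50, G=110.
out = demosaic_nearest([100, 200, 50, 110], 2, 2)
```

Production cameras typically use the more elaborate averaging or linear-interpolation schemes mentioned above, which estimate each missing color from several neighboring cells rather than one.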
  • In an alternate embodiment, the camera is a black and white camera and the color information is ignored.
  • FIG. 3 is a player's view of the shooting gallery. As shown, players stand behind the table and fire at the target. A display is shown above the players to identify which player is shooting and each player's score.
  • FIG. 4 is an alternate view of the shooting gallery showing a computer that is used as a shot record and a user interface.
  • In FIG. 5, a process flow for basic game play is illustrated. In various embodiments, a round of basic game play (also referred to herein as a shooting event) can be initiated when the first player to shoot presses start button 501 to begin their shooting turn. The camera captures a continual baseline image frame 502 of the target from the continuous target camera video stream. The target camera then watches for a significant change in the target 504 by comparing the current image with the previous baseline image 505. A shot is recognized when there is a significant count of changed pixels 503, specifically when more than 0.02% of the pixels change from image to image.
  • Typically, a video frame (i.e., the shot image) is captured 1 second after the shot to determine the score. This avoids issues with any slight target motion or changes in reflectivity from the shot impact. The shot scoring method determines where the shot occurred on the target and updates the player score. The one-second interval is adjustable.
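  • The 0.02% change criterion can be sketched as follows. The per-pixel intensity-difference threshold is an assumed value, since the disclosure specifies only the pixel-fraction threshold:

```python
def shot_detected(baseline, current, change_threshold=8, fraction=0.0002):
    """Flag a shot when the fraction of changed pixels exceeds 0.02%.

    `baseline` and `current` are equal-length sequences of grayscale
    pixel values. A pixel counts as "changed" when it differs by more
    than `change_threshold` intensity levels (an assumed noise margin).
    """
    changed = sum(1 for b, c in zip(baseline, current)
                  if abs(b - c) > change_threshold)
    return changed / len(baseline) > fraction

# 100,000-pixel frame: 0.02% is 20 pixels, so a hole darkening 50
# pixels triggers a shot event, while a 10-pixel flicker does not.
n = 100_000
base = [128] * n
hit, miss = list(base), list(base)
for i in range(50):
    hit[i] = 0
for i in range(10):
    miss[i] = 0
```

At 1080p the same 0.02% fraction corresponds to roughly 415 of the 2,073,600 frame pixels.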
  • In various embodiments, a shooting system can include numerous additional features that enrich the shooting experience, including various interactive competitive and social features. For example, a system can support addition of one or more players to a shooting event, such as one, two, three, four, five, six, seven, eight, nine, ten, or n players, where n can be any number, and the total number of players entered into a shooting event can be essentially unlimited.
  • In various embodiments, each player can enter individual player information, such as a player's name or player identification, various demographic or other identifying information such as sex, age, street address, email address, social media account information, and the like. Social media account information can include user IDs and passwords to facilitate integration of the system with a player's social media account, such as their Facebook, Instagram, Periscope, or YouTube account, to enable a player to request that the system publish images, video, or other information from the system to a social media feed associated with a social media account connected to the player's individual account. In various embodiments, a player's shooting session or a portion thereof may be live-streamed, such as via one or more of the player's social media accounts or via a channel provided by the system, in response to a user command, such as a prompt or request from a player or other operator. Video of a player's shooting session or a portion thereof may be recorded and uploaded to the internet or posted to social media following a shooting session.
  • A player's individual player information can be associated with a guest account used for a single visit to a shooting facility, or a player's individual information can be retained by the system and associated with a player account that may be accessed and used by an individual player over multiple visits to a shooting facility or multiple shooting sessions during a visit. The player account may include a player account user ID and password to uniquely identify and secure each player's account in the system. A player account may be configured to record and store various historical information from completed shooting sessions for a player, such as number of shots fired, firearms and/or calibers used, target types used (including stationary and animated targets), overall shot accuracy, shot accuracy for particular firearms and/or calibers, shot accuracy for specific target types used, shot accuracy trends, and the like. A system can also be configured so that a player's account information can be integrated into and/or available at a point-of-sale (POS) system, such that a player may be charged based on use, or eligible for various discounts based on use (including number of shooting sessions, or number of visits to shooting range), shooting performance, and the like.
  • In various embodiments, player information can be entered via a keyboard, computer terminal, or dedicated control box located at a players' stand 101, or player information can be entered via a wirelessly connected device, such as a tablet provided by the system manager, or by a player's personal mobile device that is wirelessly connected to the system. In various embodiments, the system can comprise a mobile application that may be downloaded to a player's personal mobile device and be configured to interact with or control the system and the player's gameplay experience. In various embodiments, a system can also be configured with POS system integration to enable financial transactions to take place at a dedicated control box located at players' stand 101 or on a player's personal mobile device that is wirelessly connected to the system.
  • FIG. 6 is a block diagram that shows how a shot is recognized. The camera records a target image at a frame rate of 60 frames per second. A baseline target image is dynamically maintained at a predetermined interval (typically an image 1 second previously) relative to the current target image frame transmitted by the camera.
      • 1. The current target image is turned into a grayscale image by averaging the Red-Green-Blue (RGB) colors in each target image pixel 601.
      • 2. The current target image is smoothed by averaging each target image pixel with the surrounding neighbor pixels (i.e. surrounding 8 pixels, 5 pixels at the image edges) 602. This helps to eliminate camera sensor noise.
      • 3. The current target image is compared to the baseline target image to identify any pixels with a change in value 603.
      • 4. The shot is recognized by looking at each pixel with a change and examining the surrounding 1×1 inch area. When the pixel change count in the 1×1″ surrounding area is more than a threshold value (such as 25%), a shot is recognized as having taken place 604.
      • 5. For scoring and identification purposes, the center of the shot is placed at the average x, y image pixel location with a change. This is narrowed to the shot recognized 1×1″ area 605.
      • 6. The shot center position is then scaled to the target image, scored, and reported to display screen 606.
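  • The six steps above can be sketched as follows. This is an illustrative reading of the block diagram assuming 10 pixels per inch and the 25% window threshold; the grayscale-difference threshold is an assumed value:

```python
def locate_shot(baseline_rgb, current_rgb, w, h, ppi=10,
                window_threshold=0.25, diff_threshold=8):
    """Return the (x, y) pixel centroid of a recognized shot, or None.

    Images are row-major lists of (R, G, B) tuples of size w*h.
    """
    def gray(img):                                 # step 1: average RGB
        return [sum(p) / 3.0 for p in img]

    def smooth(img):                               # step 2: neighbor averaging
        out = []
        for y in range(h):
            for x in range(w):
                vals = [img[ny * w + nx]
                        for ny in range(max(0, y - 1), min(h, y + 2))
                        for nx in range(max(0, x - 1), min(w, x + 2))]
                out.append(sum(vals) / len(vals))
        return out

    b = smooth(gray(baseline_rgb))
    c = smooth(gray(current_rgb))
    changed = [i for i in range(w * h)             # step 3: changed pixels
               if abs(b[i] - c[i]) > diff_threshold]
    for i in changed:                              # step 4: 1x1" window test
        cy, cx = divmod(i, w)
        win = [j for j in changed
               if abs(j // w - cy) <= ppi // 2 and abs(j % w - cx) <= ppi // 2]
        if len(win) / float(ppi * ppi) > window_threshold:
            xs = [j % w for j in win]              # step 5: centroid of the
            ys = [j // w for j in win]             # changes in that window
            return sum(xs) / len(xs), sum(ys) / len(ys)
    return None                                    # step 6 (scaling/scoring)
                                                   # would follow in the caller

# 20x20-pixel patch at 10 px/inch; a 6x6 dark blob simulates a bullet hole.
W = H = 20
base = [(200, 200, 200)] * (W * H)
cur = list(base)
for y in range(7, 13):
    for x in range(7, 13):
        cur[y * W + x] = (0, 0, 0)
```

Step 6, scaling the centroid to target coordinates and scoring it, depends on the distortion correction described later and is left to the caller here.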
  • The average value for the x and y coordinates of the changed pixels identify the center of the shot on the target. To identify the shot on the game display, a scoring marker (such as a circle or square) is then fitted around the pixels with a significant change. Typically, the scoring marker is a fixed size. For scoring purposes, the most important value is the location of the shot center in reference to the score markings on the target 604.
  • In various embodiments, a scoring marker may comprise different colors or shapes. For example, a red marker may be used to indicate the location of the most recent shot, while yellow marks may be used to mark previous shots. Any color or combination of colors may be used. In various embodiments, a particular marker shape may be associated with a specific player. For example, a circular marker can be assigned to a first player, a square marker assigned to a second player, a triangular marker assigned to a third player, and a diamond-shaped marker assigned to a fourth player. In various embodiments, the game system can be used to toggle through screens displaying the target overlaid with markers corresponding to each player's shot, such that each player can view his or her shot grouping independently of the other players' shot markers.
  • In various embodiments, a particular marker shape or color can be associated with a particular caliber.
  • FIG. 7 illustrates additional information as to how the digital camera recognizes a new hole position. A bullet hole is shown inside a 1″×1″ area 701, which is made during a shooting game. To identify the image change due to the new hole, the 1″×1″ area surrounding the hole is overlaid on top of a 10×10 grid representing a camera resolution 702 of 10 pixels per inch. The bullet hole may be identified with reference to a change in signal at individual camera pixels 703, and some pixels are directly affected and change in light intensity 704b due to the hole (black dot) no longer reflecting light. In this illustration, 13 pixels have light intensities that are directly changed to at least a small degree (i.e., the black dot touches them). When the shot image is converted to grayscale and run through the neighbor pixel averaging filter, neighboring pixels 704a are also changed in value due to being averaged with the directly affected pixels. In this illustration, 20 additional neighboring pixels are changed. A shot is recognized because 33% of the pixels in the 1×1″ area are recognized as having been changed, and this is above the 25% threshold. In various embodiments, other approaches to image processing may likewise be used to recognize physical changes to a target in response to target penetration by a projectile. Any suitable method for image processing that has been previously developed or is hereafter developed may be used for shot recognition in accordance with the present disclosure.
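  • The arithmetic of the FIG. 7 walkthrough can be checked directly:

```python
# Figures from the FIG. 7 illustration: a 10x10-pixel (1"x1") grid.
directly_changed = 13    # pixels the hole itself darkens
neighbor_changed = 20    # extra pixels altered by the averaging filter
grid_pixels = 10 * 10
fraction = (directly_changed + neighbor_changed) / grid_pixels
assert fraction == 0.33  # 33% of the 1x1" area changed
assert fraction > 0.25   # above the threshold, so a shot is recognized
```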
  • To accurately place the shot position on the target, any distortion between the camera and the target has to be corrected. The distortion correction is accomplished by mapping each pixel from a (row, column) position to a (height, width) position.
  • The goal of distortion correction is to accurately re-create the target image and display it to the user. FIGS. 9A and 9B illustrate how the image is distorted, which informs a methodology for adjusting the camera image.
  • FIG. 9A shows how an image is distorted due to the position of the camera, and how the image will be seen by the camera. The target 901 has a projected image plane 902 which is perpendicular to the camera 903 orientation. Evenly spaced dashed lines 904 at the target 901 edge pass through the projected image plane to the camera, which shows how the image is distorted due to the position and viewpoint of the camera. The camera sees the projected image on the projected image plane 902. This causes the lower portion of the target to be compressed in the camera view as seen on the projected image plane 902.
  • FIG. 9B is a left side view of FIG. 9A. In FIG. 9B, the projected image plane 902 is shown across the width, and projected (dashed) lines show how the edges of the target image are projected to the camera. Upon examination, it is seen that the largest adjustment to the camera image is adjusting the vertical height of the image, particularly on the lower portion. The height adjustment is not linear. The horizontal image also receives corrections, and the distortion changes as a function of height.
  • To correct the target distortion, one embodiment is to map out a grid of changes in a matrix format, based on the projected geometry, and then apply the change grid to the image. This establishes a variable scaling and re-positioning of each image pixel from its (row, column) to an (x, y) position. The change grid can be established by graphic plotting at chosen grid points, such as every 2×2 inches, and then linearly interpolating between the chosen grid points to establish a corrected (x, y) position for each image pixel. Although time consuming to establish, a grid/interpolating system can be effective as it involves basic matrix math and is relatively easy to understand. 3D computer aided drafting (CAD) can be helpful in establishing the projection geometry.
  • Another embodiment is to use analytic geometry to identify the intersection of a line with a plane. The first point of the line is the camera position (x1,y1,z1) and the second point of the line is a position (x2,y2,z2) of an image pixel as taken by the camera. All of the camera image pixels are located in a plane perpendicular to the camera orientation. The target is the intersecting plane with a plane equation of ax+by+cz=d where a, b, c, and d are constants. Utilizing known analytic geometry methods, the equation of the line defined by the camera and image pixel can be projected onto the target plane. This method can be used to establish a target position (x,y,z) for each image pixel (row, column), effectively correcting camera distortion.
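The analytic-geometry embodiment above reduces to intersecting a line with a plane. The following sketch (function name and example coordinates are hypothetical) parameterizes the line through the camera position and an image-pixel position and solves for its intersection with the target plane ax + by + cz = d:

```python
import numpy as np

def project_to_target(camera, pixel, plane):
    """Intersect the line through `camera` (x1, y1, z1) and `pixel`
    (x2, y2, z2) with the target plane a*x + b*y + c*z = d, returning the
    (x, y, z) target position for that image pixel."""
    camera = np.asarray(camera, dtype=float)
    pixel = np.asarray(pixel, dtype=float)
    a, b, c, d = plane
    normal = np.array([a, b, c])
    direction = pixel - camera                 # line direction vector
    denom = normal @ direction
    if abs(denom) < 1e-12:
        raise ValueError("line is parallel to the target plane")
    t = (d - normal @ camera) / denom          # parameter at intersection
    return camera + t * direction
```

For example, a camera at (0, 0, 10) viewing through an image pixel at (1, 0, 9) intersects the plane z = 0 at (10, 0, 0); repeating this for every (row, column) pixel yields the corrected target positions.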
  • In either method, the image correction can be verified by utilizing a target with easily recognizable shapes (circles, equilateral triangles, squares, crosses, etc.) to determine if an image that is taken by the camera is corrected satisfactorily.
  • In a preferred embodiment, the camera is located above the target and in close proximity to it. This means that the distance from the camera to the target is no greater than the maximum width or the maximum height of the target. Since the camera is preferably located above the target, and out of the way of shooting, the camera resolution per inch will be greater for the top portion of the target, and somewhat lower for the lower portion of the target. This adjustment in scale must be accounted for in the shot placement. Typical view angles between the camera and the target are 0 to 60 degrees (as measured from the horizontal plane), but this is not a strict requirement.
  • It is known that digital camera sensors have noise when taking a picture or capturing a video frame. To that end, each pixel is gray-scaled by averaging the pixel RGB values. Additionally, and to smooth out any target image pixels that might be incorrectly identified as a pixel change, each pixel is then averaged with its immediately surrounding eight pixels. If an image pixel is on the edge of the sensor, then the pixel is averaged with the five surrounding image pixels. This creates an effective filter that smooths out image problems.
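A non-limiting sketch of this grayscale-and-neighbor-averaging filter follows; the `smooth` helper and its NaN-padding technique are illustrative assumptions, chosen so that an edge pixel is averaged only with the neighbors that actually exist on the sensor:

```python
import numpy as np

def smooth(rgb):
    """Gray-scale an RGB frame by averaging the RGB values per pixel,
    then average each pixel with its available immediate neighbors
    (eight in the interior, fewer along the sensor edge)."""
    gray = rgb.mean(axis=2)                      # per-pixel RGB average
    padded = np.pad(gray, 1, constant_values=np.nan)   # NaN marks off-sensor
    # Stack the 3x3 neighborhood of every pixel as nine shifted views.
    stack = np.stack([padded[r:r + gray.shape[0], c:c + gray.shape[1]]
                      for r in range(3) for c in range(3)])
    return np.nanmean(stack, axis=0)             # ignore off-sensor NaNs
```

A uniformly lit frame passes through unchanged, while isolated single-pixel noise is spread across its neighborhood and thus attenuated below a per-pixel change threshold.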
  • A particular problem with shooting is vibration of the paper target due to the shot penetration. The target can vibrate in the area of the shot causing the reflectivity of the target image to change immediately following the shot. This creates unpredictable pixel changes and potential shot misplacement. To avoid an inaccurate shot placement, the target may be allowed to recover for approximately 1 second before a scoring placement is made.
  • When working with the camera and lighting, it was discovered that the frequency (e.g., 60 Hz) of the lighting on the target can cause difficulties with shot recognition. The camera scanning frequency may match the lighting frequency, and this can cause a target image shadow to be read as a changed target image frame. Consequently, a DC (direct current) based lighting system is preferred, such as a light emitting diode (LED) powered by a constant voltage (DC) power supply.
  • To further refine the shot recognition, the entire target is examined for a significant change, pixel by pixel. If a cluster of changes is detected in a 1″×1″ square surrounding the changed pixel, then a shot is recognized. The threshold for determining a shot is a value that is empirically determined. During a test on a typical web-cam type camera with a CMOS sensor, a threshold of 25% was determined to be a good balance between sensitivity in detecting a shot and avoiding sensor noise that causes a false shot recognition. Other threshold values are possible based on the type of camera chosen and the amount of camera sensor noise. A camera with a low noise sensor, for example, can use a lower threshold that better identifies overlapping shots.
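The whole-target scan described above can be sketched, purely for illustration, as a sliding 1″×1″ window over the frame difference; the `find_shot` helper, its parameters, and the choice of reporting the densest cluster's window center are hypothetical details, not part of any claim:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def find_shot(before, after, px_per_inch=10, diff_threshold=10,
              cluster_frac=0.25):
    """Scan the whole target for a cluster of changed pixels inside a
    1" x 1" square. Returns the approximate (row, col) of the densest
    cluster if its changed fraction exceeds the threshold, else None."""
    changed = np.abs(after.astype(int) - before.astype(int)) > diff_threshold
    win = px_per_inch                      # 1" x 1" square in pixels
    windows = sliding_window_view(changed, (win, win))
    density = windows.mean(axis=(2, 3))    # changed fraction per window
    r, c = np.unravel_index(density.argmax(), density.shape)
    if density[r, c] >= cluster_frac:
        return int(r) + win // 2, int(c) + win // 2   # window center
    return None

# Example: a 6x6 darkened cluster (36% of the best window) is recognized.
before = np.full((30, 30), 200, dtype=np.uint8)
after = before.copy()
after[12:18, 12:18] = 50
shot = find_shot(before, after)
```

Isolated single-pixel noise never fills 25% of any window, so only a genuine cluster of changes is reported as a shot.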
  • A typical camera that is useful for shot recognition has 8-bit resolution and captures color images. A camera with low sensor noise is also helpful in minimizing the amount of filtering required; shot recognition is improved with a high signal-to-noise ratio. The camera could equally be a black-and-white camera that outputs a grayscale image, in which case the grayscale conversion is not necessary.
  • In various embodiments, a system can comprise a projector configured to project high resolution video onto a display screen. The display screen can comprise, for example, a disposable paper screen or a disposable screen constructed of another material suitable to display projected video and to physically register impact or penetration by a projectile. In various embodiments, a projector can be mounted to a motorized target trolley such as motorized target trolley 105 (FIG. 1), for example at or approximately located in the position of camera 106. A display screen can comprise a target mounted to a metal frame connected to the motorized target trolley as described above with respect to target assembly 110. A display screen used for a projected video target can comprise a blank or plain target to facilitate visibility of the projected video image.
  • In various embodiments, a projected video can comprise a projected video image target. Projected video image targets can have any of a variety of target configurations, such as a circular bullseye, a human silhouette, a steel or reactive target, a bottle, a can, a bird, mammal or other animal, a clay pigeon, a zombie, a balloon, a saucer, and the like. Any type of projected image suitable for use as a shooting target may be used as a projected video image target in accordance with various embodiments of the present disclosure. A projected video image target can be still or animated. In various embodiments, an animated projected video image target may be animated in response to a detected shot. For example, a projected video image target of a frangible target such as a clay pigeon may be shown to explosively fragment in the projected video in response to a registered shot corresponding to the location of the projected video image of the clay pigeon target (i.e., a “hit” target or scoring shot). A projected video image of a steel reactive target may spin or flip in response to a scoring shot. A projected video image of a game animal may respond in a realistic or lifelike fashion to a hit, including, for example, responding differently for a grazing shot and a “kill shot.”
  • In various embodiments, an animated target may be timed, and a player's score for a shot may depend on the amount of time required to successfully hit the animated target.
  • In various embodiments, a projected video image may comprise background scenery. Background scenery may be animated. Animated background scenery may comprise, for example, various settings corresponding to various natural or urban environments, such as a woodland environment, a grassland environment, an urban outdoor or street environment, an indoor urban environment, and the like. A projected video target environment may enhance the realism of a shooting experience and may correspond to any of a variety of natural hunting environments or tactical environments. A projected video target environment may challenge a shooter with various non-target features. For example, a hunting environment may present a shooter with various non-target game animals, such as distracting animals, does, or the like. Similarly, a tactical environment can present a shooter with non-target civilians, hostages, friendly forces, vehicles, and the like.
  • A projected video target environment may be configured to provide a shooter with a particular difficulty level. The difficulty level may be selected from a range of difficulty levels. Various factors such as target size, speed of target movement, and presence of non-target features in the projected video target environment may vary in response to different difficulty levels. A player's score may likewise take into consideration the difficulty level of a particular projected video target and/or projected video target environment.
  • FIG. 8 shows a game display. Up to four players can shoot, and their individual shot scores (bold letters), along with a total for each player, are shown. The individual shot locations have been identified and marked with an individualized geometric marker, such as a triangle, square, hexagon, and rhombus. Other marker geometries could equally be used.
  • The shot can be scored based on either the distorted camera image or the corrected target image utilizing the target score markings. It is important that the score is accurate, relative to the location of the markings on a distorted or corrected image. In one embodiment, the position of the shot can be mapped based on a score mapping on the camera image. The row and column position of each pixel can be grouped and assigned to a particular score.
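As a purely illustrative sketch of mapping a pixel position to a score, the following hypothetical helper groups pixels by distance from an assumed bullseye center, one ring per inch; the center location, ring width, and ring scores are assumptions for illustration, and in practice each (row, column) group would be assigned to the printed score markings on the distorted or corrected image:

```python
import math

def score_for_pixel(row, col, center=(120, 120), px_per_inch=10,
                    ring_scores=(10, 9, 8, 7, 6, 5)):
    """Assign a score to a shot pixel by its distance from the bullseye
    center, with one scoring ring per inch (illustrative values)."""
    dist_inches = math.hypot(row - center[0], col - center[1]) / px_per_inch
    ring = int(dist_inches)                # 0 = innermost ring
    return ring_scores[ring] if ring < len(ring_scores) else 0
```

For example, a shot at the assumed center scores 10, a shot 1.5 inches out falls in the second ring and scores 9, and a shot beyond the outermost ring scores 0.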
  • FIG. 10 shows communication flow between equipment components. In a preferred embodiment, the camera 1002 views the target 1001 and is hardwired to a small dedicated single board computer 1003 which wirelessly communicates to a game computer 1004 which is hardwired to the game display 1005. A user interface 1005 (keyboard, button board) may be hard wired to game computer 1004. In various embodiments, a user interface need not be hard wired to the game computer. A user interface in accordance with various embodiments of the present disclosure can comprise a dedicated touchscreen display, a tablet, or a mobile device. A dedicated touchscreen display may be hard wired or wirelessly connected to game computer 1004.
  • The single board computer is generally conceived to include a CPU, both volatile and non-volatile memory, an operating system, onboard communications between distinct components, a wireless transmitter, and suitable software programming to execute non-transient computer instructions. For example, the single board computer could be selected from the portfolio of the Raspberry Pi single board computers as manufactured by the Raspberry Pi Foundation (United Kingdom).
  • FIG. 11 shows a shooting game, partway through game completion. A player list 1101 on the left side shows the current player up to make the next shot. The current player's total score 1102 is displayed. A header text 1103 indicates the current player and the current player's round. A list of the current player's shots, along with the score per shot, is displayed in a list 1106. A current target image 1104 is shown, along with display markers 1105 on the current player's recognized and scored shots. This information on the display is helpful for game clarity and enhances the game competition by providing feedback on any game status questions the players may have.
  • In various embodiments, a shooting game system may be configured to receive data from a remote sensor. Data from a remote sensor may include data regarding a player and/or a player's firearm. For example, a shooting game system may be configured to receive data from one or more wearable devices that may be worn by a player, such as a heart rate monitor, a photoplethysmograph, electroencephalography sensors, a respiration rate sensor, an accelerometer, an inertial measurement unit (IMU), a magnetometer, a gyroscope, and any other suitable microelectromechanical system or sensor that may be used to detect a physical or environmental condition. In various embodiments, a shooting game system may be configured to receive data from a sensor that may be mounted to a firearm. For example, PCT/US2016/013760 (Allgaier) discloses a rail-mounted firearm remote sensor apparatus and methods and is herein incorporated by reference in its entirety. A firearm-mounted sensor may be used to detect various events such as trigger squeeze, trigger break, firing, recoil, and the like, along with gun movement associated with such events and/or throughout the shot-taking process.
  • In various embodiments, a shooting game system may comprise one or more video cameras directed at the player. A video camera directed at the player may collect image data that may be analyzed to provide useful information surrounding a player's shot, such as information regarding stance, respiratory rate, body, arm and hand movements, eye tracking, gun barrel kinematics, and the like.
  • In various embodiments, a video camera directed at the player may be used to perform eye tracking. Eye tracking may be performed using eye tracking glasses comprising a camera or other systems configured to perform corneal imaging using methods such as pupil corneal reflection. In various embodiments, a system with eye tracking may be used to measure a period of time in which a shooter's gaze is locked on a specific location or object, a visual phenomenon referred to as a “quiet eye” period, for example by Causer et al., 2010, Medicine and Science in Sports and Exercise, 42(8): 1599-1608, which article is incorporated herein by reference in its entirety.
  • In various embodiments in which an animated projected video image target is used, a remote sensor used to perform eye tracking may be used to produce and record tracked eye movement data. The system may be configured to compare tracked eye movement data to animated projected video image target movement to assess a shooter's visual acquisition and tracking of a moving target during the player's shot. Similarly, a firearm mounted remote sensor and/or a video camera and digital image processing system may be used to provide gun barrel kinematics data that may be compared to animated projected video image target movement to assess a shooter's firearm tracking movements relative to the target movements.
  • In various embodiments, a video camera directed at the player may comprise a trigger-oriented camera. A trigger-oriented camera may be configured to capture video images of a shooter's trigger finger or firearm grip including the area around the trigger. A system may be configured to perform digital image processing to determine when a shot is taken based on video image data from a trigger-oriented camera.
  • In various embodiments, remote sensor data may be analyzed in conjunction with a player's shooting performance data. For example, a shooter's shot accuracy may be correlated with respiratory rate and pattern data obtained from a respiration sensor. A shooter's accuracy may be correlated with heart rate and pattern data obtained from a heart rate sensor. In various embodiments, such data analysis may help a player identify a respiratory rate, or the timing of a shot relative to a respiratory pattern, that provides for enhanced shot accuracy. In various embodiments, a firearm-mounted sensor may provide for detection of the timing of a shot with enhanced precision. Likewise, a firearm mounted sensor may be used to provide information regarding gun movement throughout a period of time that may include preparation for a shot, aiming, firing, and follow through. Firearm movement data throughout a similar period of time may also be produced by digital analysis of video image data. In various embodiments, shot data and various types of data that may be obtained from remote sensors and/or from shooter-oriented video cameras may be integrated and analyzed to provide valuable feedback to a shooter that may be used to make adjustments to shooting technique to achieve enhanced accuracy.
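A minimal, non-limiting sketch of correlating shot accuracy with a sensor reading follows; the pairing of exactly one sensor value per shot and the `accuracy_correlation` helper are simplifying assumptions, and any suitable statistical analysis could be used instead:

```python
import numpy as np

def accuracy_correlation(shot_scores, sensor_values):
    """Pearson correlation between per-shot scores and a remote-sensor
    reading sampled at each shot (e.g., heart rate). A strong negative
    correlation would suggest accuracy degrades as the reading rises."""
    return float(np.corrcoef(shot_scores, sensor_values)[0, 1])
```

For example, scores of [10, 9, 8, 7] paired with heart rates of [60, 70, 80, 90] yield a correlation of -1.0, indicating (in this contrived case) that accuracy falls as heart rate rises.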
  • FIG. 12 is a block diagram illustrating a shooting system 1200 in accordance with various embodiments. Shooting system 1200 can comprise remote sensor subsystem 1210 and data processing and display subsystem 1220. Remote sensor subsystem 1210 can comprise one or more remote sensors 1211, a data recorder 1212, and a data transmitter 1213.
  • Data recorder 1212 can receive and record sensor data from one or more sensors 1211. Data recorder 1212 can be configured to send sensor data to data transmitter 1213 for transmission to data processing and display subsystem 1220. Data processing and display subsystem 1220 can be located at a distance from the firearm. Transmission of data from sensor subsystem 1210 to data processing and display subsystem 1220 can be via a wired connection or a wireless communications link.
  • Data processing and display subsystem 1220 can comprise data receiver 1221, data storage module 1222, data analysis module 1223, and user interface 1224. Data receiver 1221 receives sensor data from data transmitter 1213 via the wired connection or wireless communications link, and stores the received sensor data in data storage module 1222.
  • Sensor data from data receiver 1221 and data storage module 1222 can be processed and analyzed by data analysis module 1223. Target images, raw data and the results of data analysis can be displayed to a user via user interface 1224. User interface 1224 can comprise a game display configured to provide information to a player.
  • Data can be stored locally in sensor subsystem 1210 and/or in data processing and display subsystem 1220. Data can be collected and aggregated for a series of shots. In some embodiments, aggregate data can be used for determining average scores and/or for establishing a trend.
  • In various embodiments, a shooting system can comprise a plurality of remote sensor subsystems. For example, a shooting system can comprise a first remote sensor subsystem, a second remote sensor subsystem, a third remote sensor subsystem, a fourth remote sensor subsystem, and an nth remote sensor subsystem. Each remote sensor subsystem may be in electronic communication with a data processing and display subsystem. For example, in various embodiments, a first remote sensor subsystem can comprise a sensor subsystem located on a firearm, a second remote sensor subsystem can comprise a photoplethysmograph, a third remote sensor subsystem can comprise a respiration monitor, and a fourth remote sensor subsystem can comprise an eye tracking system. Any number and combination of remote sensors may be used in a system in accordance with various embodiments.
  • In various embodiments, a firearm-mounted sensor subsystem such as subsystem 1210 may be either integral to the firearm or attached to the firearm. For example, sensor subsystem 1210 can be attached at a suitable location on the firearm including, but not limited to, the rail, the slide, the trigger guard, the magazine, the barrel, or the stock. Sensor subsystem 1210 can comprise one or more sensors 1211. Sensors 1211 can comprise motion-tracking devices including, but not limited to, one or more from the following list: a laser, a magnetometer, an inertial measurement unit, an accelerometer, and a gyroscope.
  • In various embodiments, a method for integrating remote sensor data into a shooting system process is provided. A method for integrating remote sensor data 1300 is illustrated in FIG. 13. Method 1300 can comprise a system initiation step (step 1310). Method 1300 can then comprise initiation of a remote sensor (step 1320). Following remote sensor initiation, a system may then begin data collection (step 1330) to produce collected sensor data. Method 1300 can comprise a method for detection of a fired shot (step 1340) to identify a fired shot event. In various embodiments, detection of a fired shot may be performed by target image processing or by data from another remote sensor. Method 1300 can comprise a step of correlation of collected data with detection of a fired shot by the system in step 1350. For example, in various embodiments, a system may be configured with buffered data storage for data from one or more remote sensors. Detection of a fired shot in step 1340 and correlation of collected data with the fired shot event in step 1350 may comprise demarcating buffered data for a specified period before and after the fired shot event to produce correlated sensor data and uncorrelated sensor data. In this manner, collected data not associated with a fired shot event may be discarded by the system in step 1360 to conserve data storage space for valuable data associated with shots fired by players. Method 1300 may further comprise displaying correlated sensor data associated with one or more fired shot events with target image data (step 1370), for example, in response to a player input or request.
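The demarcation of buffered sensor data around a fired-shot event (steps 1340-1360) can be sketched as follows; the `demarcate` helper and the 2-second/1-second window lengths are illustrative assumptions:

```python
def demarcate(buffered, shot_time, before=2.0, after=1.0):
    """Split buffered (timestamp, value) samples into data correlated
    with a fired-shot event (within `before` seconds before and `after`
    seconds after the shot) and uncorrelated data, which the system may
    discard to conserve storage. Window lengths are illustrative."""
    correlated, uncorrelated = [], []
    for t, v in buffered:
        if shot_time - before <= t <= shot_time + after:
            correlated.append((t, v))
        else:
            uncorrelated.append((t, v))
    return correlated, uncorrelated
```

For a shot detected at t = 5.0 s against one sample per second, samples at t = 3.0 through 6.0 s are retained as correlated data and the remaining samples may be discarded.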
  • A schematic of a data display 1400 is illustrated in FIG. 14. Data displayed by a system comprising remote sensors can include, for example, video image of the shooter as a shot is fired 1401, gun barrel kinematic data 1402, target image data 1403, eye movement data 1404, heart rate data 1405, respiration rate data 1406, electroencephalography data 1407, and game score information 1408. Any collected data can be displayed in any suitable arrangement in accordance with various embodiments.
  • Each described embodiment incorporates image processing equipment comprising one or more Central Processing Units (CPUs), volatile Random Access Memory (RAM), non-volatile storage such as Electronically Erasable Programmable Read Only Memory (EEPROM), flash, optical disk, magnetic disk, or solid state memory such as a solid state disk or a micro SD card. In various embodiments, a system can comprise at least one digital camera system which may utilize an image sensor such as a complementary metal-oxide-semiconductor (CMOS) or CCD, and the necessary electronic circuitry to transmit the captured images for further image processing. In various embodiments, a digital camera system can comprise a digital video camera system.
  • Alternate embodiments for the single board or game computer may contain a Graphics Processing Unit (GPU) with methods to split the computational workload between the CPU and GPU. The described embodiments include a user interface that allows the user to input commands. A user interface can comprise a keypad, keyboard, microphone, accelerometers, or touchscreen. Each embodiment may contain an internal battery, accept interchangeable batteries, or receive power from an outside source such as mains power. Each embodiment may contain a wired or wireless network interface to enable communication to and from external devices. Although not necessarily utilized, a visual output device such as a monitor or touchscreen may be included within any embodiment.
  • Digital cameras operate by recording light incident upon their sensors. There are many types of acceptable cameras with suitable image resolution to identify a new hole made in the target. The conceived digital camera is preferably directly connected to the single board computer via technologies such as Universal Serial Bus (USB), FireWire, and Ethernet. Wireless transmissions from the single board computer include standards such as Bluetooth, Wi-Fi, and cellular networks. The digital camera may also communicate via a serial interface or a parallel interface.
  • As used herein, the terms single board computer and computer system are intended to refer to a computer-related entity, comprising either hardware, a combination of hardware and software, software, or software in execution capable of performing the embodiments described. The disclosed embodiments which use the single board computer refer to being interfaced to and controlled by a computer readable storage medium having stored thereon a computer program. The computer readable storage medium may include a plurality of components such as one or more of electronic components, hardware components, and/or computer software components. These components may include one or more computer readable storage media that generally store instructions such as software, firmware and/or assembly language for performing one or more portions of one or more implementations or embodiments of an algorithm as discussed herein. These computer readable storage media are generally non-transitory and/or tangible. Examples of such a computer readable storage medium include a recordable data storage medium of a computer and/or storage device. The computer readable storage media may employ, for example, one or more of a magnetic, electrical, and/or optical data storage medium. Further, such media may take the form of, for example, floppy disks, magnetic tapes, CD-ROMs, DVD-ROMs, hard disk drives, micro SD cards, standard SD cards, and/or solid-state or electronic memory. Other forms of non-transitory and/or tangible computer readable storage media not listed may be employed with the disclosed embodiments.
  • A number of such components can be combined or divided in an implementation of a computer system. Further, such components may include a set and/or series of computer instructions written in or implemented with any of a number of programming languages, as will be appreciated by those skilled in the art. Computer instructions are executed by at least one central processing unit. In addition, other forms of computer readable media such as a carrier wave may be employed to embody a computer data signal representing a sequence of instructions that when executed by one or more computers causes the one or more computers to perform one or more portions of one or more implementations or embodiments of a sequence. Computer instructions for various components of data processing performed by a system disclosed herein may be performed on a local server or may be performed on a cloud-based server.
  • As used herein, the term “network” includes any cloud, cloud computing system or electronic communications system or method which incorporates hardware and/or software components. Communication may be accomplished through any suitable communication channels, such as, for example, a telephone network, an extranet, an intranet, the Internet, a point of interaction device (point of sale device, personal digital assistant (e.g., mobile device, kiosk, etc.)), online communications, satellite communications, off line communications, wireless communications, transponder communications, local area network (LAN), wide area network (WAN), virtual private network (VPN), networked or linked devices, keyboard, mouse and/or any suitable communication or data input modality. Moreover, although the system may be implemented with TCP/IP communications protocols, the system may also be implemented using IPX, Appletalk, IP-6, NetBIOS, OSI, any tunneling protocol (e.g. IPsec, SSH), or any number of existing or future protocols. If the network is in the nature of a public network, such as the Internet, it may be advantageous to presume the network to be insecure and open to eavesdroppers. Specific information related to the protocols, standards, and application software utilized in connection with the Internet is generally known to those skilled in the art and, as such, need not be detailed herein. See, for example, DILIP NAIK, INTERNET STANDARDS AND PROTOCOLS (1998); JAVA 2 COMPLETE, various authors, (Sybex 1999); DEBORAH RAY AND ERIC RAY, MASTERING HTML 4.0 (1997); LOSHIN, TCP/IP CLEARLY EXPLAINED (1997); and DAVID GOURLEY AND BRIAN TOTTY, HTTP: THE DEFINITIVE GUIDE (2002), the contents of which are hereby incorporated by reference.
  • “Cloud” or “cloud computing” includes a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. Cloud computing may include location-independent computing, whereby shared servers provide resources, software, and data to computers and other devices on demand. For more information regarding cloud computing, see the NIST (National Institute of Standards and Technology) definition of cloud computing at http://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecialpublication800-145.pdf (last visited Dec. 4, 2018), which is hereby incorporated by reference in its entirety.
  • In various embodiments, components, modules, and/or engines of a system as described herein may be implemented as micro-applications or micro-apps. Micro-apps are typically deployed in the context of a mobile operating system, including for example, a Windows mobile operating system, an Android Operating System, Apple iOS, a Blackberry operating system and the like. The micro-app may be configured to leverage the resources of the larger operating system and associated hardware via a set of predetermined rules which govern the operations of various operating systems and hardware resources. For example, where a micro-app desires to communicate with a device or network other than the mobile device or mobile operating system, the micro-app may leverage the communication protocol of the operating system and associated device hardware under the predetermined rules of the mobile operating system. Moreover, where the micro-app desires an input from a user, the micro-app may be configured to request a response from the operating system which monitors various hardware components and then communicates a detected input from the hardware to the micro-app. In various embodiments, a micro-app may be made available as a service.
  • The disclosure may be described herein in terms of functional block components, screen shots, optional selections and various processing steps. It should be appreciated that such functional blocks may be realized by any number of hardware and/or software components configured to perform the specified functions. It should be appreciated that such functional blocks may be realized by any number of computer-based systems and tangible non-transitory computer readable storage medium configured to perform the specified functions. For example, system 1100 may employ various integrated circuit components, e.g., memory elements, processing elements, logic elements, look-up tables, and/or the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices. Similarly, the software elements of system 1100 may be implemented with any programming or scripting language such as C, C++, Java, COBOL, assembler, PERL, Visual Basic, SQL Stored Procedures, extensible markup language (XML), with the various algorithms being implemented with any combination of data structures, objects, processes, routines or other programming elements. Further, it should be noted that system 1100 may employ any number of conventional techniques for data transmission, signaling, data processing, network control, and/or the like.
  • These software elements may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions that execute on the computer or other programmable data processing apparatus create means for implementing the functions specified in the flowchart block or blocks. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.
  • Accordingly, functional blocks of the block diagrams and flowchart illustrations support combinations of means for performing the specified functions, combinations of steps for performing the specified functions, and program instruction means for performing the specified functions. It will also be understood that each functional block of the block diagrams and flowchart illustrations, and combinations of functional blocks in the block diagrams and flowchart illustrations, may be implemented by either special purpose hardware-based computer systems which perform the specified functions or steps, or suitable combinations of special purpose hardware and computer instructions. Practitioners will appreciate that the illustrated steps described herein may comprise any number of configurations including the use of windows, web pages, web forms, popup windows, prompts and/or the like. It should be further appreciated that the multiple steps as illustrated and described may be combined into single web pages and/or windows but have been expanded for the sake of simplicity. In other cases, steps illustrated and described as single process steps may be separated into multiple web pages and/or windows but have been combined for simplicity.
  • While various embodiments of the present invention have been described, the invention may be modified and adapted to various operational methods by those skilled in the art. Therefore, this invention is not limited to the description and figures shown herein, and includes all such embodiments, changes, and modifications that are encompassed by the scope of the claims.

Claims (20)

What is claimed is:
1. A system comprising:
a rail;
a motorized target trolley configured to move on the rail;
a target assembly comprising a target and a metal frame, wherein the target assembly is removably attached to the motorized target trolley using a mounting clip;
a digital video camera configured to capture a video image of the target;
a game computer configured to receive the video image of the target and to process target image data;
a game display configured to display game information; wherein the game information comprises at least a portion of the video image of the target; and
a user interface in electronic communication with the game computer.
2. The system of claim 1, wherein the user interface comprises a touchscreen display.
3. The system of claim 1, wherein the target assembly is at least partially vibrationally isolated from one of the mounting clip and the motorized target trolley.
4. The system of claim 1, wherein the digital video camera is mounted to the motorized target trolley.
5. The system of claim 1, wherein the system comprises a plurality of digital video cameras.
6. The system of claim 5, wherein at least one of the plurality of digital video cameras is mounted to the motorized target trolley.
7. The system of claim 1, wherein the system comprises a projector configured to project an image onto the target.
8. The system of claim 7, wherein the projector is mounted to the motorized target trolley.
9. The system of claim 1, wherein the system comprises a remote sensor.
10. The system of claim 9, wherein the remote sensor is selected from a heart rate monitor, a photoplethysmograph, a respiration rate sensor, an accelerometer, an inertial measurement unit, a magnetometer, a gyroscope, an eye tracker, and a video camera.
11. The system of claim 9, wherein the system comprises a plurality of remote sensors.
12. The system of claim 1, wherein the system is configured to receive player information.
13. The system of claim 12, wherein the player information comprises social media account information.
14. The system of claim 13, wherein the system is configured to publish game information to a social media feed associated with a player's social media account information in response to a user command.
15. The system of claim 14, wherein the system is configured to publish video of a player's shooting session.
16. The system of claim 1, wherein the system is configured to provide a player account, and wherein a player account is configured to record and store historical information from completed shooting sessions for a player.
17. The system of claim 16, wherein the historical information comprises information selected from number of shooting sessions, number of shots fired, firearms used, calibers used, target types used, overall shot accuracy, shot accuracy for a particular firearm, shot accuracy for a particular caliber, shot accuracy for a particular target type, and shot accuracy trends.
18. The system of claim 1, wherein the system comprises a mobile application available for download to a player's personal mobile device, and wherein the mobile application is configured to permit a player to control the system from the player's personal mobile device.
19. A method comprising:
initiation of data collection by a remote sensor to produce collected sensor data;
detection of a fired shot to identify a fired shot event;
correlation of collected sensor data with the fired shot event to produce correlated sensor data and uncorrelated sensor data; and
discard of uncorrelated sensor data.
20. The method of claim 19, further comprising display of correlated sensor data with target image data.
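The method of claims 19 and 20 can be illustrated as a time-window correlation between remote-sensor samples and fired-shot timestamps. The following is a minimal sketch, not the claimed implementation: the `SensorSample` structure, the one-second default `window`, and the function names are assumptions introduced for illustration only.

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    timestamp: float  # sample time in seconds
    value: float      # e.g., heart rate in BPM from a remote sensor

def correlate_shots(samples, shot_times, window=1.0):
    """Keep samples within `window` seconds of any fired-shot event
    (correlated sensor data); the remainder (uncorrelated sensor data)
    is discarded by omission."""
    correlated = []
    for s in samples:
        if any(abs(s.timestamp - t) <= window for t in shot_times):
            correlated.append(s)
    return correlated
```

Under this sketch, correlated data could then be displayed alongside target image data per claim 20, for example by overlaying the retained samples on the shot-position display for the matching shot event.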
US15/853,710 2016-06-22 2017-12-23 Shooting Game for Multiple Players with Dynamic Shot Position Recognition and Remote Sensors Abandoned US20180202775A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/853,710 US20180202775A1 (en) 2016-06-22 2017-12-23 Shooting Game for Multiple Players with Dynamic Shot Position Recognition and Remote Sensors

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201662493100P 2016-06-22 2016-06-22
US15/474,874 US9891028B2 (en) 2016-06-22 2017-03-30 Shooting game for multiple players with dynamic shot position recognition on a paper target
US15/853,710 US20180202775A1 (en) 2016-06-22 2017-12-23 Shooting Game for Multiple Players with Dynamic Shot Position Recognition and Remote Sensors

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/474,874 Continuation-In-Part US9891028B2 (en) 2016-06-22 2017-03-30 Shooting game for multiple players with dynamic shot position recognition on a paper target

Publications (1)

Publication Number Publication Date
US20180202775A1 true US20180202775A1 (en) 2018-07-19

Family

ID=62840697

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/853,710 Abandoned US20180202775A1 (en) 2016-06-22 2017-12-23 Shooting Game for Multiple Players with Dynamic Shot Position Recognition and Remote Sensors

Country Status (1)

Country Link
US (1) US20180202775A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180253864A1 (en) * 2017-03-03 2018-09-06 NVTEK Electronic Co., Ltd. Calibration method for a camera which monitors a target board
US10600209B2 (en) * 2017-03-03 2020-03-24 NVTEK Electronic Co., Ltd. Calibration method for a camera which monitors a target board
US11433313B2 (en) 2017-06-08 2022-09-06 Visual Shot Recognition Gaming, LLC Live fire gaming system
US20210394045A1 (en) * 2020-06-23 2021-12-23 Nspark Inc. Screen shooting range and method of playing screen shooting game using artificial intelligence technology
CN113827949A (en) * 2020-06-23 2021-12-24 朴鲁先 Screen shooting range using artificial intelligence technology and screen shooting game method
US11707668B2 (en) * 2020-06-23 2023-07-25 Nspark Inc. Screen shooting range and method of playing screen shooting game using artificial intelligence technology
WO2023137489A1 (en) * 2022-01-17 2023-07-20 MILO Range System and method for a smart rail target
WO2024059155A1 (en) * 2022-09-13 2024-03-21 AccuShoot, Inc. Systems and methods for automated target identification, classification, and scoring

Similar Documents

Publication Publication Date Title
US20180202775A1 (en) Shooting Game for Multiple Players with Dynamic Shot Position Recognition and Remote Sensors
US10060713B2 (en) Shooting game for multiple players with dynamic shot position recognition on a paper target
US20160298930A1 (en) Target practice system
CN109034156B (en) Bullet point positioning method based on image recognition
US20160180532A1 (en) System for identifying a position of impact of a weapon shot on a target
CN100567879C (en) Thermal imaging type interactive shooting training system
US20120258432A1 (en) Target Shooting System
US20070190495A1 (en) Sensing device for firearm laser training system and method of simulating firearm operation with various training scenarios
US9504907B2 (en) Simulated shooting system and method
US10247517B2 (en) Systems, methods, and devices for electronically displaying individual shots from multiple shots on one physical target
US11040287B2 (en) Experience-oriented virtual baseball game apparatus and virtual baseball game control method using the same
CN107850417A (en) Automatic dartboard scoring system
US10458758B2 (en) Electronic audible feedback bullet targeting system
CN108981454A (en) Image recognition type gunnery system and its implementation
CN2786540Y (en) Heat imaging mutual active shoot training system
JP2016166731A (en) Shooting system, gun, and data processing device
CN109405637A (en) Analogue simulation gunnery system and its implementation
US10876819B2 (en) Multiview display for hand positioning in weapon accuracy training
US20230226454A1 (en) Method for managing and controlling target shooting session and system associated therewith
KR101864039B1 (en) System for providing solution of justice on martial arts sports and analyzing bigdata using augmented reality, and Drive Method of the Same
US20220049931A1 (en) Device and method for shot analysis
CN205279862U (en) A synchronization signal trigger for shooting training aiding system
CN110081774A (en) A kind of image shooting training system and training method
KR20200008776A (en) A Rader Electronic Shooting system
RU2583018C1 (en) Video shooting simulator

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION