US20120258432A1 - Target Shooting System - Google Patents

Target Shooting System

Info

Publication number
US20120258432A1
Authority
US
United States
Prior art keywords: target, program, hit, image, camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/082,316
Inventor
Paul Weissler
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Outwest Systems Inc
Original Assignee
Outwest Systems Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Outwest Systems Inc
Priority to US13/082,316
Assigned to Outwest Systems, Inc. (assignor: Paul Weissler)
Publication of US20120258432A1
Legal status: Abandoned

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41: WEAPONS
    • F41J: TARGETS; TARGET RANGES; BULLET CATCHERS
    • F41J5/00: Target indicating systems; Target-hit or score detecting systems
    • F41J5/10: Cinematographic hit-indicating systems
    • F41J5/24: Targets producing a particular effect when hit, e.g. detonation of pyrotechnic charge, bell ring, photograph

Definitions

  • the target 12 may be any suitable type of target 12 that enables detection by the system 2 of a hit on the target 12.
  • the target 12 may be any suitable material, such as paper.
  • the target 12 has a calibration mark 24 on the target 12 which calibration mark 24 is visible within an area viewed by the scope 6 .
  • the calibration mark 24 may be either an indicator which marks a known distance, such as the length of one actual inch on the target 12 , or a symbol, such as a cross-hairs symbol.
  • the calibration mark 24 may be added to the target 12 either when the target 12 is made, such as may be done with a pre-printed, paper target, or added to the target 12 post-manufacture by any suitable means such as a sticker, pen mark, or any other visible marker placed anywhere on the target 12 .
  • User confirmation of whether the calibration mark 24 is visible within an area viewed by the scope 6 either may be done by physical examination by the user looking through the scope 6 , or may be done by means of the computer 10 display 20 after the program 22 is launched and operational.
  • For a target 12 bearing a calibration mark 24, it is preferred that the user not hit the calibration mark 24 during target shooting with the system 2. If the calibration mark 24 is hit, then the system 2 may lose calibration and its ability to detect a hit on the target 12 may be impaired.
  • the system 2 is then set up. With reference to FIG. 1, to do so, a target 12 at which to shoot is set up on location a suitable distance downrange from a user or shooter; and a scope 6, camera 4 and computer 10 are set up out of the way of, but near to, the user at the user's shooting position.
  • the computer 10 preferably is pre-installed with the program 22 prior to system 2 setup on location.
  • the camera 4 interfaces with the computer 10 by any suitable means or device, such as a USB cable or wireless connection.
  • the system 2 may further comprise a mounting apparatus 8 capable of interfacing the camera 4 and the scope 6 , such that a lens of the camera 4 abuts an eyepiece of the scope 6 .
  • the user sets up the scope 6 , by means of a tripod or the like.
  • the scope 6 is positioned to focus on the target 12 and the user looks through the eyepiece of the scope 6 and adjusts the scope 6 so that the target 12 is centered and in focus in the scope 6 at a desired magnification or zoom level.
  • the magnification or zoom capability of the system 2 can range from approximately 1x to a maximum of approximately 75x (wherein ‘x’ is understood to be an unmagnified image at 100% of normal viewing power).
  • the program 22 is launched or activated. Once activated, the program 22 is calibrated with the target 12 information.
  • To calibrate the system 2, the user ensures that an image, preferably in focus, of the target 12 is visible on the computer 10 display 20. If a calibration mark 24 is on the target 12, then the calibration mark 24 also should be visible on the computer 10 display 20. The calibration mark 24 does not need to be visible within a magnified or zoomed viewing area of the computer 10 display 20 if a user is using the zoom feature of the program 22; however, the calibration mark 24 does need to be visible on the computer 10 display 20 when the magnification or zoom feature of the program 22 is set at 1x. As images, preferably real-time images, from the camera 4 are submitted to the program 22, a multi-step process to calibrate the target 12 is initiated by the user and performed by the program 22.
  • The user selects a caliber of ammunition or a projectile to be shot at the target 12 from the ammunition or projectile data parameters of the program 22.
  • From this selection, the program 22 can determine what hit size to detect to ensure accurate reporting of a hit on the target 12.
  • the term ‘ammunition’ refers to and encompasses any projectile.
  • the program 22 determines a measurement value; wherein the measurement value may be expressed as pixels-per-inch, which is the number of inline pixels on the display 20 that are equivalent to one linear inch on the target 12 . Obviously, the measurement value may be expressed in any suitable distance data parameter, including metric, such as pixels-per-centimeter. The measurement value is used by the program 22 to determine a hit size and a location of a hit on the target 12 .
  • the measurement value either (a) may be determined automatically by the program 22 through the use of a pre-existing, pre-defined calibration mark 24 on the target 12 , such as a cross-hairs symbol of a known size, in which case no user action is required; or (b) may be determined manually by the user creating a calibration mark 24 on the target 12 and entering a definition of the calibration mark 24 into the program 22 from which the program 22 can determine the measurement value, wherein the definition may be a distance between at least two indicators on the target 12 marking a known distance, such as a length of one actual inch or centimeter on the target 12 .
  • the program 22 automatically searches for the calibration mark 24 in an area of the target 12 viewable by the camera 4 . Once the calibration mark 24 is found, then the program 22 is able to determine the measurement value, because the calibration mark 24 is defined. The program 22 determines the measurement value by comparing the number of pixels used to depict the calibration mark 24 on the display 20 to the definition of the calibration mark 24 .
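  • As an illustration of the measurement-value step described above, the following is a minimal sketch in Python with OpenCV (the patent names neither a language nor a library, so both are assumptions, as are the function names and thresholds). It estimates pixels-per-inch from the bounds box of the calibration mark and derives the expected hit diameter in pixels for a selected caliber:

```python
# Hypothetical sketch of the measurement-value calculation (OpenCV 4.x assumed).
import cv2


def pixels_per_inch(frame_gray, mark_width_inches=1.0, canny_lo=50, canny_hi=150):
    """Estimate pixels-per-inch from the widest contour in the frame, which is
    assumed here to be the calibration mark (e.g. a pre-printed one-inch bar)."""
    edges = cv2.Canny(frame_gray, canny_lo, canny_hi)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        raise ValueError("calibration mark not found in frame")
    mark_width_px = max(cv2.boundingRect(c)[2] for c in contours)
    return mark_width_px / mark_width_inches


def expected_hit_diameter_px(ppi, caliber_inches=0.22):
    """Expected bullet-hole diameter in pixels for the selected caliber."""
    return caliber_inches * ppi
```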
  • the program 22 determines a baseline value. After the measurement value is determined, the program 22 creates a map of the target 12 and stores the map for use as a baseline value. The baseline value is used by the program 22 to detect hits on the target 12 . The program 22 detects a new hit 18 on the target 12 by comparing a change between the baseline value and a new hit value; wherein the new hit value is determined by the program 22 from data comprising a series of chronological images, such as still image frames or video image frames, received from the camera 4 .
  • The user then activates a calibrate feature of the program 22.
  • the program 22 can automatically calibrate itself.
  • the program 22 calibrates itself by finding and locking on to a calibration mark 24 on the target 12 .
  • the program 22 scans the target 12 and stores the target 12 information to prepare the program 22 for detection of a new hit 18 fired on the target 12 .
  • the user may perform a manual calibration of the target 12 by means of the manual calibration feature of the program 22 .
  • the program 22 and the system 2 are ready to begin monitoring the target 12 for any new hit 18 fired on the target 12 .
  • the calibration step order may be: a first step of determining a measurement value; a second step of determining a baseline value; a third step of selecting a caliber of ammunition; and a fourth step of activating a calibration feature of the system.
  • the target 12 viewing area on the display 20 is stabilized and the program 22 is able to compensate for any movement of the target 12 .
  • the compensation ability of the program 22 eliminates the detection of a false-positive hit, or a hit that is not really a hit, but rather an illusory hit resulting from extraneous movement or apparent movement of the target 12 , such as from environmental vibration of the target 12 or scope 6 .
  • the user may select a section of the display 20 to be monitored by the program 22 .
  • Preferably, the selected section of the display 20 includes the actual target 12 area being shot at by the user and not any extraneous background around the target 12.
  • the user activates a set-hit-area feature from the main control panel of the program 22 and then the user selects a desired target 12 area by means of the program 22 controls (for example, by using a mouse to click and drag a cursor to define a target 12 area).
  • An outline of the target 12 area will remain visible on the display 20 for the user's reference.
  • the user may not use or may deactivate the set-hit-area feature of the program 22 .
  • the set-hit-area feature of the program 22 limits detection of hits on the target 12 to a specific area of the target 12 as set within the program 22 by the user. Rather than having the program 22 process the entire target 12 area viewable by the camera 4 for a new hit, the program 22 limits the target 12 monitoring to the set hit area. By setting a hit area, a sub-area of the displayed target 12 area will be monitored for hits. A set hit area may be changed, redefined, or removed at the user's preference by means of the program 22 controls.
  • Additional benefits of setting a hit area are that the processing speed of the program 22 can be boosted and system 2 accuracy can be improved by eliminating false-positive hits and ignoring hits outside of the set hit area as invalid hits, either of which type of hit may occur as the result of undesired or interfering movement in the area of the target 12 .
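  • A minimal sketch of the set-hit-area idea, assuming the hit area is stored as a rectangle in display pixels (the representation is an assumption; NumPy is used only for illustration):

```python
# Hypothetical sketch: restrict processing to the user-selected hit area.
import numpy as np


def crop_to_hit_area(frame, hit_area):
    """frame: H x W (or H x W x 3) camera image; hit_area: (x, y, width, height)."""
    x, y, w, h = hit_area
    return frame[y:y + h, x:x + w]


# Example: monitor only a 400 x 400 pixel region of a 1280 x 720 frame.
frame = np.zeros((720, 1280), dtype=np.uint8)
roi = crop_to_hit_area(frame, (440, 160, 400, 400))
assert roi.shape == (400, 400)
```

Only the cropped region would then be passed to the contour-detection steps, which is what yields the processing-speed and false-positive benefits noted above.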
  • the user may set a bullseye 14 area or location within the target 12 area by activating the set-bullseye feature from the main control panel of the program 22 and then selecting a desired location and size for the bullseye 14 by means of the program 22 controls (for example, by using a mouse to click and drag a cursor to define a target 12 area to be the bullseye 14 ).
  • the bullseye 14 may be located anywhere on the target 12 , and the bullseye 14 size may be set by the user in accordance with the user's target shooting skill level. Selection of a bullseye 14 is optional, but preferable.
  • the bullseye 14 is used by the program 22 to report the location of a hit on the target 12 relative to the bullseye 14 . As one area of the target 12 becomes cluttered with multiple hits, the bullseye 14 may be relocated to a new location on the target 12 , thus extending the useful life of and eliminating much of the need to replace the target 12 .
  • a target 12 is shown with old hits 16 and a new hit 18 .
  • a detection method executed by the program 22 is used by the system 2 to detect a new hit 18 on the target 12 and to locate the new hit 18 on the target 12 relative to the bullseye 14 .
  • the location of the new hit 18 on the target 12 is determined by the program 22 as a new hit value.
  • a new hit value is determined by the program 22 ; wherein each image frame from the camera 4 received by the program 22 is processed to create a resultant value, which is a map of the target 12 with the new hit 18 , and the resultant value is compared to the baseline value; wherein a difference between the resultant value and the baseline value is the new hit value.
  • the program 22 is prepared or setup by process steps of selecting a caliber of ammunition (process 1 ), entering the scope zoom or magnification level (process 3 ), setting within the target 12 a hit area (process 5 ), setting within the target 12 a bullseye 14 (process 7 ), performing the program 22 calibration (process 9 ), and starting monitoring of the system 2 (process 11 ).
  • the program 22 may detect a hit on the target 12 by various contour detection, calibration and monitoring processes. Hit detection by the system 2 may be accomplished by four different methods. Each of these hit detection methods is disclosed in turn.
  • a first hit detection method comprises the following steps.
  • a still or video image frame, which is in the form of a bitmap image, is obtained from the camera 4 (process 13) and is submitted to the program 22, which executes a procedure that implements a mapping algorithm of the image, such as the Canny contour mapping algorithm (process 15).
  • the algorithm provides a mapping of all detectable contours (a contour is defined as a border change, by color or intensity, between a series of adjacent pixels in the bitmap image) on the target 12 in the form of line segments and bounds boxes (a bounds box is defined as the smallest box that can be drawn around the line segments that constitute an individual contour) (process 17 ).
  • a representation of the target 12 is constructed by the program 22 and shown on the display 20 .
  • each contour-bounds box is analyzed by the program 22 to determine if the contour-bounds box falls within a user-specified hit size range (process 19 ).
  • the objective of the program 22 is to detect a hit on the target 12 and the contours which indicate features of the target 12 that are larger or smaller than the expected hit size are discarded by the program 22 .
  • the remaining contours are stored for further processing by the program 22 .
  • In a third hit detection step, if the program 22 is being calibrated as previously discussed above (as shown by decision 21 to process 25, or decision 21 to process 23 to process 25), no further processing of the image is done by the program 22 (as shown by process 25 to decision 27 to process 31). Instead, the contours found in the second hit detection step are stored by the program 22 to be used as a baseline value for determining changes to the target 12 (such as the creation of a new hit 18 on the target 12).
  • Target 12 calibration is complete either after a set number of iterations of these hit detection steps are performed, or until no new contours are detected by the program 22 , as determined by matching incoming contours to stored contours.
  • In a fourth hit detection step, once the calibration process is complete, monitoring of the target 12 for changes caused by a new hit 18 on the target 12 can be initiated.
  • the first and second hit detection steps are performed by the program 22 .
  • a detected contour (process 33 ) is compared to a list of existing contours (decision 35 ) compiled during the target 12 calibration process. If a contour match is detected by the program 22 (as determined by a weighing process involving both a size and position of the new hit 18 contour as compared to an old hit 16 contour), then that contour match is discarded as being an existing, or old hit 16 contour.
  • the contour is compared to the list of possible hit contours for a match (decision 37 ). If a match is found, then a hit detection counter for the contour is incremented accordingly (process 41 ). If the detection count for a contour is high enough (decision 43 ), then that contour is flagged as a new hit 18 contour (process 45 ) and is considered by the program 22 to be a new hit 18 on the target 12 . After all new contours are processed and either discarded as an old hit 16 contour, or marked either as a possible, or an actual, new hit 18 contour, then the visual and audio feedback of the program 22 as previously described above is presented to the user or shooter indicating the position of the new hit 18 .
  • any contour which has been on the list N (wherein ‘N’ is a variable) image frames without being marked as a new hit is discarded or flagged as being a transient, or non-hit, contour (process 47 ).
  • the variable N can be any desired value determined and set either manually by the user or automatically by the program 22 .
  • the hit detection process continues with each new image frame from the camera 4 submitted to the program 22 until the monitoring of the target 12 is suspended or ceased by the user or shooter.
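  • The first hit detection method can be summarised in code roughly as follows. This is a hedged sketch, not the patent's implementation: Python and OpenCV are assumed, as are all names, thresholds and tolerances.

```python
# Hypothetical sketch of hit detection method 1: Canny contour mapping,
# hit-size filtering, comparison against baseline contours, and a
# per-contour detection counter before a contour is flagged as a new hit.
import cv2


def contour_boxes(frame_gray, min_px, max_px):
    """Bounds boxes of contours whose size falls within the expected hit-size range."""
    edges = cv2.Canny(frame_gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
    boxes = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if min_px <= w <= max_px and min_px <= h <= max_px:
            boxes.append((x, y, w, h))
    return boxes


def boxes_match(a, b, tol=5):
    """Crude size-and-position weighing to decide whether two boxes are the same contour."""
    return all(abs(u - v) <= tol for u, v in zip(a, b))


def monitor(frames, baseline_boxes, min_px, max_px, confirm_count=3, max_age=10):
    """Yield confirmed new-hit boxes as frames arrive from the camera."""
    candidates = {}  # possible-hit box -> [detection count, frames on list]
    for frame in frames:
        for box in contour_boxes(frame, min_px, max_px):
            if any(boxes_match(box, old) for old in baseline_boxes):
                continue  # existing contour: target graphic or old hit
            key = next((k for k in candidates if boxes_match(box, k)), box)
            count, age = candidates.get(key, [0, 0])
            candidates[key] = [count + 1, age]
        for key in list(candidates):
            candidates[key][1] += 1
            count, age = candidates[key]
            if count >= confirm_count:      # flagged as a new hit
                baseline_boxes.append(key)
                del candidates[key]
                yield key
            elif age > max_age:             # transient, non-hit contour after N frames
                del candidates[key]
```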
  • a second hit detection method comprises the following steps.
  • In a first hit detection step, the still or video image frame, which is in the form of a bitmap image, is obtained from the camera 4 (process 49), and is submitted to the program 22, which executes a procedure that implements a mapping algorithm of the image, such as the Canny contour mapping algorithm (process 51).
  • the algorithm provides a mapping of all detectable contours on the target 12 in the form of line segments and bounds boxes (process 53 ). By mapping these contours, a representation of the target 12 is constructed by the program 22 and shown on the display 20 .
  • In a second hit detection step, with the contours mapped out as a series of line segments and bounds boxes, the image consisting of these contours is stored to be used as a baseline value (process 63).
  • the baseline value is used later by the program 22 as a reference point to mask out all contours that existed at the time the baseline value was created. If the program 22 is being calibrated as previously discussed above (as shown by decision 55 to process 57 , or decision 55 to process 59 to process 61 ), no further processing of the image is done by the program 22 (as shown by process 59 to process 61 ).
  • In a third hit detection step, monitoring of the target 12 for changes caused by a new hit 18 on the target 12 can be initiated.
  • the first hit detection step of the second hit detection method is performed by the program 22 .
  • the baseline value is aligned with the incoming bitmap image using an image registration algorithm or other alignment process (process 63). Once aligned, the baseline value is used to eliminate all old contours by AND-ing the inverted baseline value with the incoming bitmap image (wherein AND-ing is an operation executed by the program 22 in which a bit of an incoming image is set to zero if the matching baseline value bit is zero) (process 65).
  • any remaining contours (decision 67 ) of an appropriate size are considered to be either an actual new hit or a possible new hit (process 69 ).
  • a new hit 18 is added to the baseline value.
  • the visual and audio feedback of the program 22 as previously described above is presented to the user or shooter indicating the position of the new hit 18 .
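  • A minimal sketch of the masking step at the core of this second method, assuming 8-bit binary edge images that have already been aligned by an image registration step (OpenCV and the function name are assumptions):

```python
# Hypothetical sketch of method 2's masking: AND the inverted baseline edge
# image with the incoming edge image so that only contours created after the
# baseline (i.e. new hits) survive.
import cv2


def new_contours_only(baseline_edges, incoming_edges):
    """Both inputs: same-size 8-bit edge images (0 or 255), already registered."""
    # Optionally dilate the baseline first so slight misalignment does not
    # leave halo edges around old contours:
    # baseline_edges = cv2.dilate(baseline_edges, None)
    inverted_baseline = cv2.bitwise_not(baseline_edges)
    remaining = cv2.bitwise_and(inverted_baseline, incoming_edges)
    contours, _ = cv2.findContours(remaining, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours]
```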
  • a third hit detection method comprises the following steps.
  • In this third hit detection method, no baseline value or storage of image contours by the program 22 is necessary. Instead, any graphics on the target 12 must consist of only non-black colors and must exclude the use of black in the target 12 area.
  • Each incoming image frame from the camera 4 is filtered by the program 22 before the image is analyzed by an edge detection algorithm of the program 22 .
  • the image frame is taken directly from the camera 4 (process 79 ) and passed to the program 22 .
  • any pixel that represents a non-black color, such as red or blue, is set to the value of white, effectively blanking out the pixel (process 81 ).
  • Any pixels that equal black or near-black colors are passed through the filtering process unchanged. As any new hits are presumed to appear as black, this filtering process effectively allows only hits to be passed through to the contour detection process of the program 22 (as shown by process 83 to process 85 ).
  • the image frame is passed to the edge detection algorithm of the program 22 to find any new contours (process 93). Any remaining contours of an appropriate size that do not represent an old hit 16 (decision 95 to process 97) are considered to be an actual new hit 18 or a possible new hit (as shown by process 99 to decision 101 to process 103). After all new contours are processed by the program 22 and marked either as an actual new hit 18 contour or a possible new hit contour, the visual and audio feedback of the program 22 as previously described above is presented to the user or shooter indicating the position of the new hit 18.
  • This third hit detection method is advantageous, because no image alignment or target 12 tracking is required other than to ensure old hits 16 are adjusted for by the program 22 as based on any target 12 movement. This adjustment is necessary to prevent an old hit 16 from being flagged as a new hit 18 due to target 12 movement.
  • For this method, it is necessary that a target 12 not include black or near-black colors in the target 12 graphics. It is also necessary that the target 12 be depicted with only appropriate colors within the set hit area and that a new hit 18 contour appear to the program 22 as black.
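  • A minimal sketch of the pre-filter for this third method. Here near-black is approximated with a grayscale threshold; the real criterion (and the threshold value) would depend on lighting and is an assumption:

```python
# Hypothetical sketch of method 3's colour filter: any pixel that is not
# black or near-black is blanked to white, so only the (presumed black)
# bullet holes survive into edge/contour detection.
import cv2


def keep_only_near_black(frame_bgr, black_threshold=60):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    filtered = frame_bgr.copy()
    filtered[gray > black_threshold] = (255, 255, 255)  # blank out non-black pixels
    return filtered
```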
  • a fourth hit detection method comprises the following steps.
  • This fourth hit detection method involves using a combination of steps from both the second hit detection method and the third hit detection method.
  • the contour detection process steps of the third method are used as a pre-processing step before image frames are passed into the contour detection process steps of the second method (see FIG. 4 ).
  • This fourth hit detection method is advantageous, because the color black may be used in the target 12 area, while at the same time eliminating colored graphics from the target 12 , and thereby reducing false positives and improving the accuracy of the hit detection process of the system 2 .
  • Each incoming image frame from the camera 4 is processed by the program 22 before the image frame is analyzed by an edge detection algorithm of the program 22 .
  • An incoming image frame is taken directly from the camera 4 (process 107 ) and submitted to the program 22 .
  • the individual pixel values that make up the image frame are analyzed by the program 22 . Any pixel that represents a non-black color, such as red or blue, is set to the value of white, effectively blanking out the pixel (process 109 ). Any pixels that equal black or near-black colors are passed through the filtering process unchanged.
  • the image frame is passed on to the edge detection algorithm of the program 22 to find any new contours (process 111 ).
  • a baseline value created during a calibration process (as previously described above) is aligned with an incoming bitmap image using an image registration algorithm or other alignment process (process 113 ). Once aligned, the baseline value is used to eliminate all old contours by AND-ing the inverted baseline value with the incoming bitmap image. Any remaining contours are stored as bounds boxes of the contours (process 115 ).
  • the calibration mark 24 (if being used) is detected and a new position of the calibration mark 24 is compared to an old position of the calibration mark 24 .
  • the program 22 can use either a change between the new and old calibration mark 24 positions, a change among all hits, a change within the set hit area, a change of the bullseye 14 , or a change with regard to any other relevant indicator, to adjust the respective positions of the indicators accordingly to account for any extraneous movement of the target 12 or camera 4 (process 117 ).
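  • The position adjustment can be as simple as translating every stored indicator by the calibration mark's apparent displacement. A hypothetical sketch (pure Python; names are assumed):

```python
# Hypothetical sketch of movement compensation: shift stored indicators (old
# hits, bullseye, hit area) by the offset between the calibration mark's new
# and old positions, so drift does not turn an old hit into a false new hit.
def compensate_for_drift(old_mark_xy, new_mark_xy, indicators):
    """indicators: list of (x, y) positions in display pixels."""
    dx = new_mark_xy[0] - old_mark_xy[0]
    dy = new_mark_xy[1] - old_mark_xy[1]
    return [(x + dx, y + dy) for (x, y) in indicators]


# Example: the mark drifted 3 px right and 1 px down, so stored hits shift with it.
shifted = compensate_for_drift((100, 200), (103, 201), [(150, 180), (160, 210)])
assert shifted == [(153, 181), (163, 211)]
```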
  • the contour sizes are examined by the program 22 and any contours not within a desired contour size range are discarded (process 119 ).
  • the remaining contours are compared to the list of possible hit contours. If a remaining contour is on the list, then the detection count of the contour is incremented accordingly (process 121 ). If a hit contour has been detected a sufficient number of times, then it is added to the new hit list (process 123 ). If a hit contour has not been detected a sufficient number of times over N frames to register as a new hit, then the hit contour is discarded (process 125 ). Any new hit 18 is added to the baseline value.
  • the visual and audio feedback of the program 22 as previously described above is presented to the user or shooter indicating the position of the new hit 18 (process 127 ).
  • the hit detection process continues with each new image frame from the camera 4 submitted to the program 22 until the monitoring of the target 12 is suspended or ceased by the user or shooter.
  • the program 22 is able to detect a single hit, as well as multiple, simultaneous hits, on the target 12 .
  • the ability of the program 22 to detect multiple, simultaneous hits on the target 12 is beneficial in an instance in which a user fires at the target 12 with a shotgun loaded with shot, because the program 22 can detect and report multiple hits from a shot pattern on the target 12 .
  • Monitoring of the target 12 can be started without first performing the target 12 calibration process previously described above. However, if this is done, then any contour that falls into the specified hit size range immediately will be flagged by the program 22 as a new hit 18 on the target 12 .
  • the user activates the monitor feature from the main control panel of the program 22 .
  • the hits will be reported by the program 22 both visually and audibly, and recorded in the output hits pane of the program 22 display 20 .
  • Hit indicators will appear on the display overlaying the old hits 16 and the new hit 18 on the target 12 .
  • the exact hit location, relative to a bullseye 14 is listed in the output hits pane of the program 22 display 20 . If the audio feature of the program 22 is active and the computer 10 has sound, then the new hit 18 location also will be audibly reported by the program 22 .
  • the bullseye 14 is used as a reference point to report a position of a new hit 18 .
  • When a new hit 18 is reported, its position is provided by the program 22 in a format such as: hit X inches left or right of, and Y inches above or below, the bullseye 14.
  • the distances given are the distance from the new hit 18 to the center of the bullseye 14 . Using these distances, the user or shooter can determine how close a new hit 18 is to the bullseye 14 .
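  • A hypothetical sketch of this reporting step: pixel coordinates are converted to inches relative to the bullseye using the pixels-per-inch measurement value and then formatted for the feedback announcement (names and formats are assumptions; the y sign is flipped because image rows grow downward):

```python
def hit_offset_inches(hit_xy, bullseye_xy, ppi):
    """Return (x_in, y_in): positive x is right of, positive y is above, the bullseye."""
    x_in = (hit_xy[0] - bullseye_xy[0]) / ppi
    y_in = (bullseye_xy[1] - hit_xy[1]) / ppi
    return x_in, y_in


def announce(x_in, y_in):
    vertical = "high" if y_in >= 0 else "low"
    horizontal = "right" if x_in >= 0 else "left"
    return f"Hit {abs(y_in):.1f} inch {vertical}, {abs(x_in):.1f} inch {horizontal}."


# Example: a hit 1 inch above and 0.5 inch right of the bullseye (ppi = 20).
print(announce(*hit_offset_inches(hit_xy=(210, 80), bullseye_xy=(200, 100), ppi=20)))
# -> Hit 1.0 inch high, 0.5 inch right.
```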
  • the user deactivates the monitor feature from the main control panel of the program 22 to suspend monitoring of the target 12 .
  • the bullseye 14 may be reset and the hit information registered by the program 22 may be cleared. By doing so, the target 12 display area is less cluttered with information and the cleared hits will not be registered again by the system 2 .
  • the user can manually add or remove hits from the target 12 area. By doing so, the user can remove false-positive or invalid hits, or add hits that the program 22 may have missed during processing of the target 12 .
  • the process of calibrating and monitoring the target 12 may be repeated as many times as the user wants.
  • a hit data set may be exported from the program 22 in a CSV file format.
  • the CSV file format is widely supported and once exported the hit data set can be analyzed or displayed in any program that accepts CSV files, such as Excel by Microsoft and Grapher by Golden Software.
  • the hit data in the exported CSV file includes the location of all hits 16 , 18 on the target 12 and the hit data enable the user or shooter to track his or her target shooting performance progress.
  • the hit data may be provided in two formats.
  • the first format for the hit data is in X, Y data pairs as standard axis coordinates in a two-dimensional plane graph, wherein X data are on a horizontal axis and Y data are on a vertical axis. These data pairs indicate the location of a hit 16 , 18 in the form of X inches to the left or right of the bullseye 14 , and Y inches above or below the bullseye 14 , wherein the bullseye 14 is at an intersection of the X and Y axes, or 0, 0.
  • a negative X value indicates a hit to the left of the bullseye 14 and a positive X value indicates a hit to the right of the bullseye 14 .
  • a negative Y value indicates a hit below the bullseye 14 and a positive Y value indicates a hit above the bullseye 14 .
  • the second format for the hit data is in R, A data pairs as standard radius and angle polar coordinates in a two-dimensional polar graph.
  • R is a straight line distance value from a hit 16 , 18 to the bullseye 14 (in inches, centimeters or other unit of measurement per the user's preference), as if a line was drawn from the hit 16 , 18 to the center of the bullseye 14 .
  • A is an angle value (in degrees) along an imaginary circle drawn around the bullseye 14 . On this circle, a zero angle value is at the top of the circle (the 12 o'clock position). Angle values increase by one degree clockwise along the circle with a total of 360 degrees around the circle.
  • Graphing a hit 16, 18 using polar coordinates may be easier than, and may be preferred by a user to, the more conventional X, Y coordinate system.
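  • The two export formats can be produced from the same X, Y data; the following is a hedged sketch of the conversion and CSV export (column names are assumptions):

```python
# Hypothetical sketch of hit-data export: each hit is written both as X, Y
# (inches relative to the bullseye) and as R, A polar coordinates (radius in
# inches; angle in degrees, 0 at the 12 o'clock position, increasing clockwise).
import csv
import math


def to_polar(x_in, y_in):
    r = math.hypot(x_in, y_in)
    a = math.degrees(math.atan2(x_in, y_in)) % 360  # 0 deg straight up, clockwise
    return r, a


def export_hits_csv(path, hits):
    """hits: iterable of (x_in, y_in) pairs, oldest first."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["hit", "x_inches", "y_inches", "r_inches", "angle_degrees"])
        for i, (x, y) in enumerate(hits, start=1):
            r, a = to_polar(x, y)
            writer.writerow([i, x, y, round(r, 2), round(a, 1)])


# Example: a hit 1 inch right of the bullseye exports as R = 1.0, A = 90 (3 o'clock).
export_hits_csv("hits.csv", [(1.0, 0.0), (-0.5, 1.0)])
```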
  • additional data associated with a hit 16 , 18 on the target 12 can be time-stamp data.
  • This feature of the program 22 allows the shooter to determine how accurate the hits are within a set time frame.
  • a time window can also be set, wherein only hits that occur within a specific, fixed-time period are scored by the program 22 . This feature is useful for a shooting competition or qualification test in which the shooter has only a limited time to fire at the target 12 .
  • If the time-stamp feature of the program 22 is enabled, then included in the data set for each hit will be a time-stamp showing when the hit occurred.
  • a first hit made during a target 12 monitoring session is set to a start time of zero seconds.
  • Time-stamp data for a hit allows a user or shooter to know how quickly each new hit 18 on the target 12 is made in relation to the other, old hits 16 .
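  • A minimal sketch of the time-stamp and time-window behaviour described above (class and function names are assumptions):

```python
# Hypothetical sketch: stamp each hit relative to the first hit of the
# session (first hit = 0.0 s) and keep only hits inside a fixed scoring window.
import time


class HitTimer:
    def __init__(self):
        self.start = None

    def stamp(self):
        """Seconds since the first hit of the monitoring session."""
        now = time.monotonic()
        if self.start is None:
            self.start = now
        return now - self.start


def hits_in_window(stamped_hits, window_seconds):
    """stamped_hits: list of (timestamp_seconds, hit) pairs."""
    return [hit for t, hit in stamped_hits if t <= window_seconds]
```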
  • the program 22 can score the hits 16 , 18 on target 12 . Because the position of each hit relative to the bullseye 14 is known, each hit can be scored. For example, a hit within a one-inch circle around the bullseye 14 could be worth 10 points, a hit from one inch to two inches from the bullseye 14 could be worth nine points, etc. The scoring could be cumulative and could be used in scoring shooting competitions.
  • a simulated target 12 can be overlaid on the target view of the display 20 to graphically represent the scoring process. The simulated target may match that of a typical paper target with concentric rings around a bullseye, with each ring representing a different point value based on the distance of a hit from the bullseye.
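  • A sketch of the ring-scoring rule given as the example above (10 points inside one inch, 9 points from one to two inches, and so on). The point scale comes from the patent's own example; the code itself is an assumed illustration:

```python
# Hypothetical sketch of ring scoring based on distance from the bullseye.
import math


def score_hit(x_in, y_in, max_points=10):
    """Score a hit from its X, Y offset (in inches) from the bullseye."""
    distance = math.hypot(x_in, y_in)
    ring = max(math.ceil(distance), 1)          # 1-inch-wide concentric rings
    return max(max_points - (ring - 1), 0)


assert score_hit(0.5, 0.5) == 10   # inside the one-inch ring
assert score_hit(1.2, 0.9) == 9    # between one and two inches from the bullseye
```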
  • all hit information is stored by the program 22 on a continuous basis into a database file (e.g., an MDB file type). As each hit is detected, the program 22 adds the hit to the database.
  • the information stored in the database may include shooter name, weapon name, ammunition caliber, hand being used to shoot with, range of target, hit location, time of hit, and other relevant data desired by a user. Storing this information in the program 22 allows a user to search the database based on the various data fields. For example, all hits by a particular shooter using a particular weapon could be extracted from the database.
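  • The patent describes an MDB database file; the sketch below uses SQLite purely to illustrate the continuous-storage and search idea (the table layout and field names are assumptions):

```python
# Hypothetical sketch of continuous hit storage with searchable fields.
import sqlite3


def open_hit_db(path="hits.db"):
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS hits (
                      shooter TEXT, weapon TEXT, caliber TEXT, hand TEXT,
                      range_yards REAL, x_inches REAL, y_inches REAL, hit_time REAL)""")
    return db


def add_hit(db, shooter, weapon, caliber, hand, range_yards, x_in, y_in, hit_time):
    db.execute("INSERT INTO hits VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
               (shooter, weapon, caliber, hand, range_yards, x_in, y_in, hit_time))
    db.commit()


# Example query: all hits by a particular shooter with a particular weapon.
db = open_hit_db()
rows = db.execute("SELECT x_inches, y_inches FROM hits WHERE shooter = ? AND weapon = ?",
                  ("Alice", "rifle")).fetchall()
```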
  • the program 22 may have other optional features. Such an optional feature is that the user may use target overlays from within the program 22 , so it is not necessary for the target 12 being viewed through the camera 4 to be displayed in the program 22 . Rather, any image can be used as an overlay and placed in the target 12 viewing area of the program 22 . Accordingly, the user can place any digital image in the target 12 viewing area, such as a bullseye 14 , silhouette, or any other image file the user has available.
  • the program 22 also supports drawing a ringed target overlay, as is typically used in target shooting, over the target 12 .
  • the user can use virtually blank paper for the target 12 while at the same time allowing the shooter and any spectators to see scoring rings over the target 12 from within the program 22 display. Further optionally, as shown in FIG. 1 and FIG. 2 , the user can use the program 22 to draw a grid over the target 12 viewing area for reference.
  • an estimated distance range from the user's shooting position to the target 12 can be displayed by the program 22 .
  • the estimated distance range is accurate to within +/-10% and can be determined in any applicable unit of measurement, such as feet, yards, meters and the like. The user can determine at what range he or she is shooting at the target 12 without the need for an additional, separate range finder unit to be used with the system 2, unless the greater accuracy obtained with a separate range finder is desired by the user.

Abstract

The target shooting system provides a target shooter with immediate feedback, both visual and auditory, on the shooter's performance. The system comprises a real-time imaging feed of a target being shot at by a target shooter; wherein the image is submitted to a computing device running a software program. The program processes the images in such a manner as to be able to present the shooter with both audio and visual feedback of where the shooter's shots are hitting the target. The audio feedback comprises real-time announcement of a hit location relative to a pre-set bullseye (for example: “Hit one inch high, point five inch right.”). The visual feedback comprises a real-time image of the target with any hits on the target highlighted by an indicator, such as an outline of concentric, colored circles, or any other suitable indicator.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The field of the present invention relates to target shooting and, in particular, to a target shooting system.
  • 2. Description of Related Art Including Information Disclosed Under 37 CFR 1.97 and 37 CFR 1.98
  • Various target shooting systems exist for analyzing the accuracy of a target shooter's shot on a target. The means by which these target shooting systems work vary widely as demonstrated by the following patents. In addition to standard target shooting equipment, namely a target and a means for projecting a projectile, such as a firearm, the systems disclosed in these patents require some sort of additional, special equipment.
  • U.S. Pat. No. 4,204,683 issued 27 May 1980, by Filippini et al. for Device and Method for Detection of the Shots on a Target from a Distance discloses a video system for capturing shots on a target based upon the point of penetration of a light field by a projectile. This system requires a specialized target.
  • U.S. Pat. No. 4,763,903 issued 16 Aug. 1988, by Goodwin et al. for Target Scoring and Display System and Method discloses a system for capturing shots on a target based upon the point of penetration of a light field by a projectile. This system requires a specialized target.
  • U.S. Pat. No. 4,949,972 issued 21 Aug. 1990, by Goodwin et al. for Target Scoring and Display System discloses a system for capturing shots on a target based upon the point of penetration of a light field by a projectile. This system requires a specialized target.
  • U.S. Pat. No. 5,577,733 issued 26 Nov. 1996, by Downing for Targeting System discloses a system for capturing shots on a target based upon the point of penetration of a light field by a projectile. This system requires a specialized target.
  • U.S. Pat. No. 5,775,699 issued 7 Jul. 1998, by Orito et al. for Apparatus with Shooting Target and Method of Scoring Target Shooting discloses an apparatus for capturing shots on a target based upon light reflected through the point of penetration of a target by a projectile. This method requires a specialized target.
  • U.S. Pat. No. 5,924,868 issued 20 Jul. 1999, by Rod for Method and Apparatus for Training a Shooter of a Firearm discloses a video camera mounted on eyewear worn by a shooter to produce a displayed image of the target to assist the shooter in aiming the firearm. This method requires specialized eyewear integrated with a camera.
  • U.S. Pat. No. 7,158,167 issued 2 Jan. 2007 by Yerazunis et al. for Video Recording Device for a Targetable Weapon discloses a video image recording device which is mounted on a gun to record video images before and after firing of the gun. This system requires a specialized camera mounted on and integrated with the gun.
  • Shooting Range discloses a system for capturing shots on a target based upon the point of penetration of a light field by a projectile. This system requires a specialized target.
  • US Patent Application 2002/0171924 published 21 Nov. 2002 by Varner et al. for Telescope Viewing System discloses a telescope viewing system with a camera attachable to an eyepiece of the telescope and a computer system in communication with the camera for displaying images, in particular, celestial images, recorded by the camera upon a display screen. This system provides only a telescope viewing system with a camera for capturing images.
  • US Patent Application 2003/0180038 published 25 Sep. 2003 by Gordon for Photographic Firearm Apparatus and Method discloses a telescopic firearm scope integrated with a camera to photograph a target at the instant the target is fired upon. This method requires a specialized scope integrated with a camera for use with a firearm.
  • US Patent Application 2004/0029642 published 12 Feb. 2004 by Akano for Target Practice Laser Transmitting/Receiving System, Target Practice Laser Transmitter, and Target Practice Laser Receiver discloses a target practice laser transmitting and receiving system to capture details of a shot fired upon a target including position, time, distance, ammunition type, weapon type and other variables of the fired shot. This system requires a specialized laser system to capture and analyze shots on a target.
  • US Patent Application 2005/0002668 published 6 Jan. 2005 by Gordon for Photographic Firearm Apparatus and Method discloses a telescopic firearm scope integrated with a camera to photograph a target at the instant the target is fired upon. This method requires a specialized scope integrated with a camera for use with a firearm.
  • US Patent Application 2006/0150468 published 13 Jul. 2006 by Zhao for A Method and System to Display Shooting-Target and Automatic-Identify Last Hitting Point by Digital Image Processing discloses a video-monitor system to capture and display the location of a shot fired on a target. This system requires a specialized camera.
  • US Patent Application 2006/0201046 published 14 Sep. 2006 by Gordon for Photographic Firearm Apparatus and Method discloses a telescopic firearm scope integrated with a camera to photograph a target at the instant the target is fired upon. This method requires a specialized scope integrated with a camera for use with a firearm.
  • US Patent Application 2008/0163536 published 10 Jul. 2008 by Koch et al. for Sighting Mechanism for Fire Arms discloses a sighting mechanism with cameras mounted on a firearm to capture shots fired on a target and to display the shots on a video screen. This system requires a specialized camera integrated with a firearm.
  • US Patent Application 2008/0233543 published 25 Sep. 2008 by Guissin for Video Capture, Recording and Scoring in Firearms and Surveillance discloses a video camera and recording device integrated with a weapon to record shots fired; wherein the camera may be mounted either on the firearm or within the bore of the firearm. This system requires a specialized camera integrated with a firearm.
  • BRIEF SUMMARY OF THE INVENTION
  • A target shooting system of the present invention comprises a target; wherein the target is viewed by a scope; wherein the scope is interfaced with a camera; wherein the camera is interfaced with a computer; and a program executed by the computer; further wherein the program determines a location of a hit on the target as a change between a baseline value (a map of the target created by the program) and a new hit value (determined by the program from data comprising a series of chronological images received from the camera).
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
  • FIG. 1 is a perspective view of a target shooting system of the present invention.
  • FIG. 2 is a screen capture from a program.
  • FIG. 3 is a flow chart diagramming a first method of hit detection.
  • FIG. 4 is a flow chart of the system diagramming a second method of hit detection.
  • FIG. 5 is a flow chart diagramming a third method of hit detection of the system.
  • FIG. 6 is a flow chart diagramming a hit detection process of a fourth method of hit detection of the system.
  • PARTIAL LIST OF REFERENCE NUMERALS
      • 2 system
      • 4 camera
      • 6 scope
      • 8 mounting apparatus
      • 10 computer
      • 12 target
      • 14 bullseye
      • 16 old hit
      • 18 new hit
      • 20 display
      • 22 program
      • 24 calibration mark
    DETAILED DESCRIPTION OF THE INVENTION
  • The target shooting system of the present invention is capable of providing a target shooter or user with feedback, both visual and auditory, on the shooter's performance. This feedback, which, preferably, may be immediate, or delayed, allows the shooter to adjust his shooting fundamentals with each individual shot and enables him rapidly to improve his shooting skills. The system comprises an imaging feed, preferably, real-time imaging (such as, preferably, video imaging) of a target being shot at by a target shooter; wherein the images are submitted to a computing device running a software program. The program processes the images in such a manner as to be able to present the shooter with both audio and visual feedback reporting of where the shooter's shots are hitting the target.
  • The audio feedback reporting of the system comprises an announcement of a hit location on the target relative to a bullseye on the target (for example, “Hit one inch high, point five inch right.”). The audio feedback reporting can be set to report different types of information concerning the hit, such as location on the target, a time at which the target was hit, a hit score and the like.
  • The visual feedback reporting of the system comprises an image of the target as displayed with a computer, preferably in real-time, wherein any hit on the target is highlighted on the display by an indicator, such as an outline of concentric, colored circles, or any other suitable indicator. A visual hit indicator may be set to any color desired by a user. Also, an identification of each hit may be displayed by the computer, such as a lettered or numbered hit identification to reflect the order of the hits on the target. An illustrative display 20 is shown in FIG. 2 wherein hits 16, 18 on the target 12 are lettered in alphabetical order from an oldest old hit 16, letter A, to a newest new hit 18, letter F. (For clarity, on the “Hits” tab of the display 20, the hit numbers thereon correspond to the hit letters on the target 12 shown on the display 20 as follows: 1=A, 2=B, 3=C, 4=D, 5=E, 6=F.) Hit identification enables the user to determine whether a grouping of hits on the target was made as a result of several consecutive hits, or was made as a result of non-consecutive hits.
  • The system is compact, simple and does not need a special target. With a portable computing device, the program, a camera (preferably, a web camera or webcam), and a spotting scope (or telescope), a user of the system can have a personalized target shooting system at his fingertips. The system may be easily transported and may be set up in any suitable location. The system takes only a few minutes to set up and does not require any specialized computing device or target shooting knowledge.
  • The system may be used with any device or means of projecting a projectile, which projectile is capable of making an indication on a target by penetrating, creating a hole in, or temporarily or permanently leaving a mark on, a target (wherein any of these indications are defined as a hit). These devices or means may include firearms of any type, BB guns and airsoft guns, as well as other shooting weapons or means capable of projecting a projectile, such as a bow and arrow, dart, slingshot, hand-thrown object (such as a knife, playing card, pebble, marble or any other object), and the like. The system also is capable of detecting an indication or a hit on a target made by a laser (or other concentrated light source) shining on the target, such as would occur while target shooting with a laser device attached to a firearm or other weapon. The laser device is used to indicate where a hit would have occurred on the target had the firearm been fired using live ammunition.
  • The system is designed to work over a wide range of distances: from a short range of approximately 5 yards, such as for a pistol, up to a long range of approximately 100 yards, such as for a rifle. In practice, the maximum range is limited only by the power and quality of the magnification means used with the system.
  • An advantage of the system is that it does not require a special target, or any special modification to a standard target setup, such as a paper target mounted by any suitable means. Accordingly, there is no need to modify a target stand, nor is a special target required. Blank paper, rather than specialized paper or targets, can be used as a practice target or on which to print a practice target. These advantages enable the system to be used readily at any existing target shooting or firing range, or other suitable location, and do not require any changes to the range or location, weapon or target equipment. These advantages also provide efficiencies to a user in terms of time and money savings.
  • As shown in FIG. 1, the target shooting system 2 comprises: (a) a target 12 upon which a user fires hits; (b) a computer 10, which may be any computing means or device (preferably, the computer is a portable computing device, such as a laptop, netbook, tablet, smart phone or other suitable computing device) capable of interfacing with a camera 4 and receiving images (preferably real-time images) from the camera 4; (c) a software program 22 executed by the computer 10, which program 22 is capable of receiving and analyzing images from the camera 4 to determine a location of a hit on the target 12 (illustrative hits 16, 18 on the target 12 are shown lettered in alphabetical order from the oldest old hit 16, letter A, to the newest new hit 18, letter F); (d) a camera 4, which may be any means or device capable of image acquisition (preferably, a webcam, or a still frame or video digital camera, smart phone or other suitable image acquisition means or device, preferably operating in real-time) and of interfacing with the computer 10; and (e) a scope 6 or other means or device capable of image magnification and of interfacing with the camera 4. The scope 6 may interface with the camera 4 either as a unitary device integral with the camera 4 and having both image acquisition and image magnification capabilities (such as a camera with a high-powered zoom lens), or as a separate device from the camera 4, such as a spotting scope or telescope.
  • The program 22 for use with the system 2 is input with data consisting of a chronologic series of still or video frame images, preferably, in real-time, of the target 12 from the camera 4. The images captured by the camera 4 may be magnified or enhanced by first processing an image of the target 12 being monitored through the scope 6 before being submitted to the camera 4.
  • In addition to real-time imaging (such as with video imaging) of the target 12, a still image or photograph of the target 12 may be taken by the camera 4 at any time during use of the system 2. An image acquired by the camera 4 is submitted to the program 22 and may be stored in jpeg or other suitable file format to allow for easy viewing and sharing of the images by a user. Storing images of the target 12 enables the user to capture and preserve a particular target shooting session for the user to examine later or share with others.
  • An image submitted by the camera 4 to the program 22 also may be zoomed and panned within the program 22, which enables the user to examine closely on a display 20 (as shown in FIG. 1 and FIG. 2) specific areas of the target 12 without needing to relocate physically to the actual location of the target 12. This in turn enhances the safety of the user at a target shooting range by keeping the user out of harm's way and adds convenience to the use of the system 2.
  • Any suitable type of target 12 may be used that enables detection by the system 2 of a hit on the target 12. The target 12 may be of any suitable material, such as paper.
  • To enable detection of a hit on the target 12 by the system 2, preferably, the target 12 has a calibration mark 24 on the target 12 which calibration mark 24 is visible within an area viewed by the scope 6. As shown on FIG. 1, the calibration mark 24 may be either an indicator which marks a known distance, such as the length of one actual inch on the target 12, or a symbol, such as a cross-hairs symbol. The calibration mark 24 may be added to the target 12 either when the target 12 is made, such as may be done with a pre-printed, paper target, or added to the target 12 post-manufacture by any suitable means such as a sticker, pen mark, or any other visible marker placed anywhere on the target 12. User confirmation of whether the calibration mark 24 is visible within an area viewed by the scope 6 either may be done by physical examination by the user looking through the scope 6, or may be done by means of the computer 10 display 20 after the program 22 is launched and operational.
  • For a target 12 bearing a calibration mark 24, it is preferred that the user not hit the calibration mark 24 during target shooting with the system 2. If the calibration mark 24 is hit, then the system 2 may lose calibration and its ability to detect a hit on the target 12 may be impaired.
  • To use the system 2, the system 2 is first set up. With reference to FIG. 1, a target 12 at which to shoot is set up at a location a suitable distance downrange from a user or shooter; and a scope 6, camera 4 and computer 10 are set up out of the way of, but near to, the user at the user's shooting position. The computer 10 preferably is pre-installed with the program 22 prior to system 2 setup on location. The camera 4 interfaces with the computer 10 by any suitable means or device, such as a USB cable or wireless connection.
  • If the system 2 uses a scope 6 which is a means or device separate from the camera 4, then the system 2 may further comprise a mounting apparatus 8 capable of interfacing the camera 4 and the scope 6, such that a lens of the camera 4 abuts an eyepiece of the scope 6. To interface the camera 4 and the scope 6, the user sets up the scope 6 by means of a tripod or the like. Next, the scope 6 is positioned to focus on the target 12, and the user looks through the eyepiece of the scope 6 and adjusts the scope 6 so that the target 12 is centered and in focus in the scope 6 at a desired magnification or zoom level. The magnification or zoom capability of the system 2 can range from approximately 1x to a maximum of approximately 75x (wherein 1x is understood to be an unmagnified image at 100% of normal viewing power).
  • After the scope 6, camera 4 and computer 10 are set up and interfaced, the program 22 is launched or activated. Once activated, the program 22 is calibrated with the target 12 information.
  • To calibrate the system 2, the user ensures that an image, preferably in focus, of the target 12 is visible on the computer 10 display 20. If a calibration mark 24 is on the target 12, then the calibration mark 24 also should be visible on the computer 10 display 20. The calibration mark 24 does not need to be visible within a magnified or zoomed viewing area of the computer 10 display 20 if a user is using the zoom feature of the program 22; however, the calibration mark 24 does need to be visible on the computer 10 display 20 when the magnification or zoom feature of the program 22 is set at 1x. As images, preferably real-time images, from the camera 4 are submitted to the program 22, a multi-step process to calibrate the target 12 is initiated by the user and performed by the program 22.
  • In a first calibration step, the user selects a caliber of ammunition or a projectile to be shot at the target 12 from the ammunition or projectile data parameters of the program 22. From the selected ammunition or projectile value, the program 22 can determine what hit size to detect to ensure accurate reporting of a hit on the target 12. As used herein, the term ‘ammunition’ refers to and encompasses any projectile.
  • In a second calibration step, the program 22 determines a measurement value; wherein the measurement value may be expressed as pixels-per-inch, which is the number of inline pixels on the display 20 that are equivalent to one linear inch on the target 12. Obviously, the measurement value may be expressed in any suitable distance data parameter, including metric, such as pixels-per-centimeter. The measurement value is used by the program 22 to determine a hit size and a location of a hit on the target 12.
  • The measurement value either (a) may be determined automatically by the program 22 through the use of a pre-existing, pre-defined calibration mark 24 on the target 12, such as a cross-hairs symbol of a known size, in which case no user action is required; or (b) may be determined manually by the user creating a calibration mark 24 on the target 12 and entering a definition of the calibration mark 24 into the program 22 from which the program 22 can determine the measurement value, wherein the definition may be a distance between at least two indicators on the target 12 marking a known distance, such as a length of one actual inch or centimeter on the target 12.
  • The program 22 automatically searches for the calibration mark 24 in an area of the target 12 viewable by the camera 4. Once the calibration mark 24 is found, then the program 22 is able to determine the measurement value, because the calibration mark 24 is defined. The program 22 determines the measurement value by comparing the number of pixels used to depict the calibration mark 24 on the display 20 to the definition of the calibration mark 24.
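  • As a minimal sketch of this calibration arithmetic (the function and parameter names below are illustrative and not taken from the patent), the measurement value can be obtained by dividing the pixel extent of the detected calibration mark 24 by its known physical size:

```python
def pixels_per_inch(mark_width_px: float, mark_width_inches: float) -> float:
    """Measurement value: the number of display pixels equivalent to one inch on the target.

    mark_width_px     -- width of the detected calibration mark, in pixels
    mark_width_inches -- the mark's known physical width on the target, in inches
    """
    if mark_width_inches <= 0:
        raise ValueError("calibration mark must have a positive physical size")
    return mark_width_px / mark_width_inches

# Example: a one-inch calibration bar spanning 42 pixels on the display
# yields a measurement value of 42 pixels-per-inch.
ppi = pixels_per_inch(42.0, 1.0)
```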
  • In a third calibration step, the program 22 determines a baseline value. After the measurement value is determined, the program 22 creates a map of the target 12 and stores the map for use as a baseline value. The baseline value is used by the program 22 to detect hits on the target 12. The program 22 detects a new hit 18 on the target 12 by comparing a change between the baseline value and a new hit value; wherein the new hit value is determined by the program 22 from data comprising a series of chronological images, such as still image frames or video image frames, received from the camera 4.
  • In a fourth calibration step, to complete calibration of program 22, the user activates a calibrate feature of the program 22. By means of an automatic calibration feature of the program 22, the program 22 can automatically calibrate itself. The program 22 calibrates itself by finding and locking on to a calibration mark 24 on the target 12. Once the calibration mark 24 is found and locked on, the program 22 scans the target 12 and stores the target 12 information to prepare the program 22 for detection of a new hit 18 fired on the target 12. Optionally, as discussed previously, the user may perform a manual calibration of the target 12 by means of the manual calibration feature of the program 22. When the calibration process is complete, the program 22 and the system 2 are ready to begin monitoring the target 12 for any new hit 18 fired on the target 12.
  • Alternatively, to calibrate the system 2, the calibration step order may be: a first step of determining a measurement value; a second step of determining a baseline value; a third step of selecting a caliber of ammunition; and a fourth step of activating a calibration feature of the system.
  • As a result of the calibration process, the target 12 viewing area on the display 20 is stabilized and the program 22 is able to compensate for any movement of the target 12. The compensation ability of the program 22 eliminates the detection of a false-positive hit, that is, an illusory hit that results not from a projectile but from extraneous movement or apparent movement of the target 12, such as from environmental vibration of the target 12 or scope 6.
  • Once a clear image of the target 12 is being displayed by the computer 10, optionally, the user may select a section of the display 20 to be monitored by the program 22. Preferably, the display 20 includes the actual target 12 area being shot at by the user and not any extraneous background around the target 12. To set the target 12 area within the display 20, the user activates a set-hit-area feature from the main control panel of the program 22 and then the user selects a desired target 12 area by means of the program 22 controls (for example, by using a mouse to click and drag a cursor to define a target 12 area). An outline of the target 12 area will remain visible on the display 20 for the user's reference. Optionally, the user may not use or may deactivate the set-hit-area feature of the program 22.
  • Use of the set-hit-area feature of the program 22 limits detection of hits on the target 12 to a specific area of the target 12 as set within the program 22 by the user. Rather than having the program 22 process the entire target 12 area viewable by the camera 4 for a new hit, the program 22 limits the target 12 monitoring to the set hit area. By setting a hit area, a sub-area of the displayed target 12 area will be monitored for hits. A set hit area may be changed, redefined, or removed at the user's preference by means of the program 22 controls. Additional benefits of setting a hit area are that the processing speed of the program 22 can be boosted and system 2 accuracy can be improved by eliminating false-positive hits and ignoring hits outside of the set hit area as invalid hits, either of which type of hit may occur as the result of undesired or interfering movement in the area of the target 12.
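  • The hit-area restriction described above can be sketched as a simple rectangle test applied to each candidate contour's bounding box before further processing; the names below are illustrative assumptions, not part of the program 22:

```python
from typing import Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height) in display pixels

def inside_hit_area(bounds: Rect, hit_area: Rect) -> bool:
    """Return True if a contour's bounding box lies entirely within the set hit area."""
    bx, by, bw, bh = bounds
    ax, ay, aw, ah = hit_area
    return ax <= bx and ay <= by and bx + bw <= ax + aw and by + bh <= ay + ah

# Candidate contours whose bounding boxes fall outside the set hit area would be
# discarded as invalid before hit detection continues.
```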
  • The user may set a bullseye 14 area or location within the target 12 area by activating the set-bullseye feature from the main control panel of the program 22 and then selecting a desired location and size for the bullseye 14 by means of the program 22 controls (for example, by using a mouse to click and drag a cursor to define a target 12 area to be the bullseye 14). The bullseye 14 may be located anywhere on the target 12, and the bullseye 14 size may be set by the user in accordance with the user's target shooting skill level. Selection of a bullseye 14 is optional, but preferable. The bullseye 14 is used by the program 22 to report the location of a hit on the target 12 relative to the bullseye 14. As one area of the target 12 becomes cluttered with multiple hits, the bullseye 14 may be relocated to a new location on the target 12, thus extending the useful life of and eliminating much of the need to replace the target 12.
  • FIG. 1 and FIG. 2 show a target 12 with old hits 16 and a new hit 18. A detection method executed by the program 22 is used by the system 2 to detect a new hit 18 on the target 12 and to locate the new hit 18 on the target 12 relative to the bullseye 14. The location of the new hit 18 on the target 12 is determined by the program 22 as a new hit value. To determine a new hit value, each image frame from the camera 4 received by the program 22 is processed to create a resultant value, which is a map of the target 12 with the new hit 18; the resultant value is then compared to the baseline value, and the difference between the resultant value and the baseline value is the new hit value.
  • As shown in FIG. 3, FIG. 4, FIG. 5 and FIG. 6, the program 22 is prepared or set up by process steps of selecting a caliber of ammunition (process 1), entering the scope zoom or magnification level (process 3), setting within the target 12 a hit area (process 5), setting within the target 12 a bullseye 14 (process 7), performing the program 22 calibration (process 9), and starting monitoring of the system 2 (process 11). However, also as shown in FIG. 3, FIG. 4, FIG. 5 and FIG. 6, with the system 2, the program 22 may detect a hit on the target 12 by various contour detection, calibration and monitoring processes. Hit detection by the system 2 may be accomplished by four different methods. Each of these hit detection methods is disclosed in turn.
  • As shown in FIG. 3, a first hit detection method comprises the following steps. In a first hit detection step, a still or video image frame, which is in the form of a bitmap image, is obtained from the camera 4 (process 13) and is submitted to the program 22 which executes a procedure that implements a mapping algorithm of the image, such as the Canny contour mapping algorithm (process 15). The algorithm provides a mapping of all detectable contours (a contour is defined as a border change, by color or intensity, between a series of adjacent pixels in the bitmap image) on the target 12 in the form of line segments and bounds boxes (a bounds box is defined as the smallest box that can be drawn around the line segments that constitute an individual contour) (process 17). By mapping these contours, a representation of the target 12 is constructed by the program 22 and shown on the display 20.
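  • A plausible realization of this first step, using OpenCV's Canny edge detector and contour extraction, is sketched below; it is an assumption about the implementation (the patent names only the Canny contour mapping algorithm), and it presumes OpenCV 4.x, where findContours returns two values:

```python
import cv2

def map_contours(frame_bgr, low_threshold=50, high_threshold=150):
    """Map all detectable contours in a camera frame as point lists plus bounds boxes.

    The threshold values are illustrative; they control which border changes
    (by intensity) the Canny algorithm treats as edges.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, low_threshold, high_threshold)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Pair each contour with the smallest upright box that encloses its line segments.
    return [(c, cv2.boundingRect(c)) for c in contours]
```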
  • In a second hit detection step, with the contours mapped out as a series of line segments and bounds boxes, each contour-bounds box is analyzed by the program 22 to determine if the contour-bounds box falls within a user-specified hit size range (process 19). The objective of the program 22 is to detect a hit on the target 12 and the contours which indicate features of the target 12 that are larger or smaller than the expected hit size are discarded by the program 22. The remaining contours are stored for further processing by the program 22.
  • In a third hit detection step, if the program 22 is being calibrated as previously discussed above (as shown by decision 21 to process 25, or decision 21 to process 23 to process 25), no further processing of the image is done by the program 22 (as shown by process 25 to decision 27 to process 31). Instead, the contours found in the second hit detection step are stored by the program 22 to be used as a baseline value for determining changes to the target 12 (such as the creation of a new hit 18 on the target 12).
  • These three hit detection steps are repeated (as shown by process 25 to decision 27 to process 29) until the target 12 calibration is complete (as shown by decision 27 to process 31). Target 12 calibration is complete either after a set number of iterations of these hit detection steps are performed, or until no new contours are detected by the program 22, as determined by matching incoming contours to stored contours.
  • In a fourth hit detection step, once the calibration process is complete, monitoring of the target 12 for changes caused by a new hit 18 on the target 12 can be initiated. When target 12 monitoring is active, the first and second hit detection steps are performed by the program 22. After the second hit detection step, a detected contour (process 33) is compared to a list of existing contours (decision 35) compiled during the target 12 calibration process. If a contour match is detected by the program 22 (as determined by a weighting process involving both a size and position of the new hit 18 contour as compared to an old hit 16 contour), then that contour match is discarded as being an existing, or old hit 16, contour. If no match is found for the contour, then the contour is compared to the list of possible hit contours for a match (decision 37). If a match is found, then a hit detection counter for the contour is incremented accordingly (process 41). If the detection count for a contour is high enough (decision 43), then that contour is flagged as a new hit 18 contour (process 45) and is considered by the program 22 to be a new hit 18 on the target 12. After all new contours are processed and either discarded as an old hit 16 contour, or marked as either a possible or an actual new hit 18 contour, then the visual and audio feedback of the program 22 as previously described above is presented to the user or shooter indicating the position of the new hit 18.
  • Once all incoming contours from the image are processed, all contours on the possible hit list are examined for the length of time the hits have been on the list (based on the image frame count starting from the initial detection of the contour). Any contour which has been on the list N (wherein ‘N’ is a variable) image frames without being marked as a new hit is discarded or flagged as being a transient, or non-hit, contour (process 47). The variable N can be any desired value determined and set either manually by the user or automatically by the program 22. The hit detection process continues with each new image frame from the camera 4 submitted to the program 22 until the monitoring of the target 12 is suspended or ceased by the user or shooter.
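  • The bookkeeping in this step, counting how many frames a candidate contour has been seen and expiring stale candidates after N frames, can be sketched as follows; the matching rule, the promotion threshold and all names are assumptions made for illustration:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    bounds: tuple        # (x, y, w, h) of the candidate contour
    first_frame: int     # frame index at which the contour was first seen
    detections: int = 1  # number of frames in which this candidate has matched

def update_candidates(candidates, incoming_bounds, frame_index,
                      promote_after=3, expire_after_n=30):
    """Match incoming contour bounds against the possible-hit list; return confirmed new hits."""
    new_hits = []
    for bounds in incoming_bounds:
        # Exact equality is used for brevity; a real matcher would weight small
        # differences in size and position between frames.
        match = next((c for c in candidates if c.bounds == bounds), None)
        if match is None:
            candidates.append(Candidate(bounds, frame_index))
        else:
            match.detections += 1
            if match.detections >= promote_after:
                new_hits.append(match)
                candidates.remove(match)
    # Discard transient candidates that lingered for N frames without confirming.
    candidates[:] = [c for c in candidates
                     if frame_index - c.first_frame < expire_after_n]
    return new_hits
```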
  • As shown in FIG. 4, a second hit detection method comprises the following steps. In a first hit detection step, the still or video image frame, which is in the form of a bitmap image, is obtained from the camera 4 (process 49), and is submitted to the program 22 which executes a procedure that implements a mapping algorithm of the image, such as the Canny contour mapping algorithm (process 51). The algorithm provides a mapping of all detectable contours on the target 12 in the form of line segments and bounds boxes (process 53). By mapping these contours, a representation of the target 12 is constructed by the program 22 and shown on the display 20.
  • In a second hit detection step, with the contours mapped out as a series of line segments and bounds boxes, the image consisting of these contours is stored to be used as a baseline value (process 63). The baseline value is used later by the program 22 as a reference point to mask out all contours that existed at the time the baseline value was created. If the program 22 is being calibrated as previously discussed above (as shown by decision 55 to process 57, or decision 55 to process 59 to process 61), no further processing of the image is done by the program 22 (as shown by process 59 to process 61).
  • In a third hit detection step, once the calibration process is complete, monitoring of the target 12 for changes caused by a new hit 18 on the target 12 can be initiated. When target 12 monitoring is active, the first hit detection step of the second hit detection method is performed by the program 22. After the first hit detection step, the baseline value is aligned with the incoming bitmap image using an image registration algorithm or other alignment process (process 63). Once aligned, the baseline value is used to eliminate all old contours by AND-ing the inverted baseline value with the incoming bitmap image (wherein AND-ing is an operation executed by the program 22 in which a bit of the incoming image is set to zero wherever the corresponding bit of the inverted baseline value is zero) (process 65). Any remaining contours (decision 67) of an appropriate size are considered to be either an actual new hit or a possible new hit (process 69). A new hit 18 is added to the baseline value. After all new contours are processed and marked as either an actual new hit 18 contour or a possible new hit contour (as shown by decision 67 to process 69, or decision 67 to process 71 to decision 73 to process 75), the visual and audio feedback of the program 22 as previously described above is presented to the user or shooter indicating the position of the new hit 18.
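  • The masking operation described in this step, zeroing every incoming pixel wherever the inverted baseline bit is zero, can be sketched with OpenCV's bitwise operations; this is illustrative only and assumes the registration step has already aligned the two images:

```python
import cv2

def mask_old_contours(baseline_edges, incoming_edges):
    """Suppress contours that already existed in the baseline value.

    Both inputs are single-channel binary edge images of identical size,
    where 255 marks an edge pixel and 0 marks background.
    """
    # Invert the baseline so existing edges become 0, then AND with the incoming
    # frame: any edge already present at calibration time is zeroed, leaving only
    # edges that are new relative to the baseline value.
    return cv2.bitwise_and(cv2.bitwise_not(baseline_edges), incoming_edges)
```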
  • Once all incoming contours from the image frame are processed, all contours on the possible hit list are analyzed by the program 22 for the length of time the hits have been on the hit list, as based on an image frame count starting from the initial detection of a contour. Any contour which has been on the list N image frames without being marked as a new hit 18 contour is discarded or flagged as being a transient, or non-hit, contour (process 77). The second hit detection method continues with each new image frame from the camera 4 submitted to the program 22 until the monitoring of the target 12 is suspended or ceased by the user or shooter.
  • As shown in FIG. 5, a third hit detection method comprises the following steps. In this third method, no baseline value or storage of image contours by the program 22 is necessary. Instead, any graphics on the target 12 must consist of only non-black colors and must exclude the use of black in the target 12 area. Each incoming image frame from the camera 4 is filtered by the program 22 before the image is analyzed by an edge detection algorithm of the program 22. To filter an incoming image frame, the image frame is taken directly from the camera 4 (process 79) and passed to the program 22.
  • Next, the individual pixel values that make up the image frame are analyzed by the program 22. Any pixel that represents a non-black color, such as red or blue, is set to the value of white, effectively blanking out the pixel (process 81). Any pixels that equal black or near-black colors are passed through the filtering process unchanged. As any new hits are presumed to appear as black, this filtering process effectively allows only hits to be passed through to the contour detection process of the program 22 (as shown by process 83 to process 85).
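  • A pixel filter of this kind can be sketched in a few lines of NumPy; the brightness threshold is an assumption, since the patent does not define how near-black is to be judged:

```python
import numpy as np

def blank_non_black(frame_bgr, black_threshold=60):
    """Set every non-black pixel to white; pass black and near-black pixels through unchanged.

    frame_bgr       -- H x W x 3 uint8 image frame from the camera
    black_threshold -- per-channel value at or below which a pixel counts as near-black
    """
    out = frame_bgr.copy()
    near_black = np.all(frame_bgr <= black_threshold, axis=2)
    out[~near_black] = 255  # blank out colored and light pixels
    return out
```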
  • With the colored pixels set to white, the image frame is passed to the edge detection algorithm of the program 22 to find any new contours (process 93). Any remaining contours of an appropriate size that do not represent an old hit 16 (decision 95 to process 97) are considered to be an actual new hit 18 or a possible new hit (as shown by process 99 to decision 101 to process 103). After all new contours are processed by the program 22 and marked either as an actual new hit 18 contour or a possible new hit contour, the visual and audio feedback of the program 22 as previously described above is presented to the user or shooter indicating the position of the new hit 18.
  • Once all incoming contours from the image frame are processed, all contours on the possible hit list are examined for the length of time they have been on the list (based on an image frame count starting from the initial detection of the contour). Any contour that has been on the list N image frames without being marked as a new hit contour is discarded or flagged as being a transient, or non-hit, contour (process 105). The hit detection process continues with each new image frame from the camera 4 submitted to the program 22 until the monitoring of the target 12 is suspended or ceased by the user or shooter.
  • This third hit detection method is advantageous because no image alignment or target 12 tracking is required, other than to ensure old hits 16 are adjusted by the program 22 for any target 12 movement. This adjustment is necessary to prevent an old hit 16 from being flagged as a new hit 18 due to target 12 movement. However, with this third hit detection method it is necessary that the target 12 not include black or near-black colors in the target 12 graphics. It is also necessary that the target 12 be depicted with only appropriate colors within the set hit area and that a new hit 18 contour appear to the program 22 as black.
  • As shown in FIG. 6, a fourth hit detection method comprises the following steps. This fourth hit detection method involves using a combination of steps from both the second hit detection method and the third hit detection method. In the fourth hit detection method, the contour detection process steps of the third method (see FIG. 5) are used as a pre-processing step before image frames are passed into the contour detection process steps of the second method (see FIG. 4). This fourth hit detection method is advantageous, because the color black may be used in the target 12 area, while at the same time eliminating colored graphics from the target 12, and thereby reducing false positives and improving the accuracy of the hit detection process of the system 2.
  • Each incoming image frame from the camera 4 is processed by the program 22 before the image frame is analyzed by an edge detection algorithm of the program 22. An incoming image frame is taken directly from the camera 4 (process 107) and submitted to the program 22. Next, the individual pixel values that make up the image frame are analyzed by the program 22. Any pixel that represents a non-black color, such as red or blue, is set to the value of white, effectively blanking out the pixel (process 109). Any pixels that equal black or near-black colors are passed through the filtering process unchanged.
  • With the colored pixels set to white, the image frame is passed on to the edge detection algorithm of the program 22 to find any new contours (process 111). A baseline value created during a calibration process (as previously described above) is aligned with an incoming bitmap image using an image registration algorithm or other alignment process (process 113). Once aligned, the baseline value is used to eliminate all old contours by AND-ing the inverted baseline value with the incoming bitmap image. Any remaining contours are stored as bounds boxes of the contours (process 115).
  • Next, the calibration mark 24 (if being used) is detected and a new position of the calibration mark 24 is compared to an old position of the calibration mark 24. The program 22 can use either a change between the new and old calibration mark 24 positions, a change among all hits, a change within the set hit area, a change of the bullseye 14, or a change with regard to any other relevant indicator, to adjust the respective positions of the indicators accordingly to account for any extraneous movement of the target 12 or camera 4 (process 117).
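  • One way to realize the compensation described above is to translate every stored indicator by the observed displacement of the calibration mark 24 between frames; the purely translational model and the names below are assumptions, as the patent does not state which motion model the program 22 applies:

```python
def compensate_drift(old_mark_xy, new_mark_xy, indicators):
    """Shift stored indicator positions by the calibration mark's apparent movement.

    old_mark_xy, new_mark_xy -- (x, y) pixel positions of the calibration mark
    indicators               -- list of (x, y) pixel positions (old hits, bullseye, hit-area corners)
    """
    dx = new_mark_xy[0] - old_mark_xy[0]
    dy = new_mark_xy[1] - old_mark_xy[1]
    return [(x + dx, y + dy) for (x, y) in indicators]
```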
  • Next, the contour sizes are examined by the program 22 and any contours not within a desired contour size range are discarded (process 119). The remaining contours are compared to the list of possible hit contours. If a remaining contour is on the list, then the detection count of the contour is incremented accordingly (process 121). If a hit contour has been detected a sufficient number of times, then it is added to the new hit list (process 123). If a hit contour has not been detected a sufficient number of times over N frames to register as a new hit, then the hit contour is discarded (process 125). Any new hit 18 is added to the baseline value. After all new contours are processed and marked as either an actual new hit 18, a possible new hit or a discarded hit, then the visual and audio feedback of the program 22 as previously described above is presented to the user or shooter indicating the position of the new hit 18 (process 127). The hit detection process continues with each new image frame from the camera 4 submitted to the program 22 until the monitoring of the target 12 is suspended or ceased by the user or shooter.
  • The program 22 is able to detect a single hit, as well as multiple, simultaneous hits, on the target 12. The ability of the program 22 to detect multiple, simultaneous hits on the target 12 is beneficial in an instance in which a user fires at the target 12 with a shotgun loaded with shot, because the program 22 can detect and report multiple hits from a shot pattern on the target 12.
  • Monitoring of the target 12 can be started without first performing the target 12 calibration process previously described above. However, if this is done, then any contour that falls into the specified hit size range immediately will be flagged by the program 22 as a new hit 18 on the target 12.
  • To begin target shooting and monitoring of hits, the user activates the monitor feature from the main control panel of the program 22. With reference to FIG. 1 and FIG. 2, as the user shoots at and projectiles hit the target 12, the hits will be reported by the program 22 both visually and audibly, and recorded in the output hits pane of the program 22 display 20. Hit indicators will appear on the display overlaying the old hits 16 and the new hit 18 on the target 12. The exact hit location, relative to a bullseye 14, is listed in the output hits pane of the program 22 display 20. If the audio feature of the program 22 is active and the computer 10 has sound, then the new hit 18 location also will be audibly reported by the program 22.
  • The bullseye 14 is used as a reference point to report a position of a new hit 18. When a new hit 18 is reported, its position is provided by the program 22 in a format such as: hit X inches left or right of, and Y inches above or below, the bullseye 14. The distances given are the distance from the new hit 18 to the center of the bullseye 14. Using these distances, the user or shooter can determine how close a new hit 18 is to the bullseye 14.
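  • Converting a detected hit's pixel coordinates into the reported inches-from-the-bullseye format needs only the measurement value from calibration. A minimal sketch is shown below; the names are illustrative, and image rows are assumed to grow downward, hence the sign flip on the vertical axis:

```python
def report_hit(hit_px, bullseye_px, pixels_per_inch):
    """Return a spoken-style report of the hit offset from the bullseye, in inches."""
    dx_in = (hit_px[0] - bullseye_px[0]) / pixels_per_inch   # positive = right of bullseye
    dy_in = (bullseye_px[1] - hit_px[1]) / pixels_per_inch   # positive = above bullseye
    horiz = "right" if dx_in >= 0 else "left"
    vert = "high" if dy_in >= 0 else "low"
    return f"Hit {abs(dy_in):.1f} inch {vert}, {abs(dx_in):.1f} inch {horiz}."

# Example: report_hit((412, 198), (400, 240), 42.0) -> "Hit 1.0 inch high, 0.3 inch right."
```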
  • When a target shooting session is complete, the user deactivates the monitor feature from the main control panel of the program 22 to suspend monitoring of the target 12. At the user's option, while monitoring is suspended, the bullseye 14 may be reset and the hit information registered by the program 22 may be cleared. By doing so, the target 12 display area is less cluttered with information and the cleared hits will not be registered again by the system 2. With the program 22, the user can manually add or remove hits from the target 12 area. By doing so, the user can remove false-positive or invalid hits, or add hits that the program 22 may have missed during processing of the target 12. The process of calibrating and monitoring the target 12 may be repeated as many times as the user wants.
  • With the system 2, the user can create graphs of hit data and can statistically analyze the hit data. A hit data set may be exported from the program 22 in a CSV file format. The CSV file format is widely supported and, once exported, the hit data set can be analyzed or displayed in any program that accepts CSV files, such as Excel by Microsoft and Grapher by Golden Software. The hit data in the exported CSV file include the location of all hits 16, 18 on the target 12, and the hit data enable the user or shooter to track his or her target shooting performance progress.
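  • A CSV export of this kind can be sketched with Python's standard csv module; the column names are assumptions, since the patent does not fix a header layout:

```python
import csv

def export_hits_csv(path, hits):
    """Write hit records to a CSV file readable by Excel, Grapher and similar tools.

    hits -- iterable of dicts with keys matching the field names below
    """
    fields = ["hit", "x_in", "y_in", "time_s"]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerows(hits)

# Example: export_hits_csv("session.csv", [{"hit": 1, "x_in": 0.3, "y_in": 1.0, "time_s": 0.0}])
```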
  • The hit data may be provided in two formats. The first format for the hit data is in X, Y data pairs as standard axis coordinates in a two-dimensional plane graph, wherein X data are on a horizontal axis and Y data are on a vertical axis. These data pairs indicate the location of a hit 16, 18 in the form of X inches to the left or right of the bullseye 14, and Y inches above or below the bullseye 14, wherein the bullseye 14 is at an intersection of the X and Y axes, or 0, 0. A negative X value indicates a hit to the left of the bullseye 14 and a positive X value indicates a hit to the right of the bullseye 14. A negative Y value indicates a hit below the bullseye 14 and a positive Y value indicates a hit above the bullseye 14.
  • The second format for the hit data is in R, A data pairs as standard radius and angle polar coordinates in a two-dimensional polar graph. In this second format, R is a straight line distance value from a hit 16, 18 to the bullseye 14 (in inches, centimeters or other unit of measurement per the user's preference), as if a line was drawn from the hit 16, 18 to the center of the bullseye 14. In this second format, A is an angle value (in degrees) along an imaginary circle drawn around the bullseye 14. On this circle, a zero angle value is at the top of the circle (the 12 o'clock position). Angle values increase by one degree clockwise along the circle with a total of 360 degrees around the circle. Graphing a hit 16, 18 using polar coordinates may be easier than, and may be preferred by a user, to the more conventional X, Y coordinate system.
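  • The R, A representation described above, with zero degrees at the 12 o'clock position and angles increasing clockwise, can be derived from the X, Y pairs as sketched below (illustrative names only):

```python
import math

def to_polar(x_in, y_in):
    """Convert an X, Y hit offset (inches from the bullseye) into R, A polar form.

    R is the straight-line distance to the bullseye centre; A is in degrees from
    the 12 o'clock position, increasing clockwise from 0 up to 360.
    """
    r = math.hypot(x_in, y_in)
    a = math.degrees(math.atan2(x_in, y_in)) % 360.0
    return r, a

# Example: a hit 1 inch right of and level with the bullseye gives (1.0, 90.0),
# i.e. one inch away at the 3 o'clock position.
```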
  • Optionally, additional data associated with a hit 16, 18 on the target 12 can include time-stamp data. This feature of the program 22 allows the shooter to determine how accurate the hits are within a set time frame. A time window can also be set, wherein only hits that occur within a specific, fixed-time period are scored by the program 22. This feature is useful for a shooting competition or qualification test in which the shooter has only a limited time to fire at the target 12.
  • If the time-stamp feature of the program 22 is enabled, then included in the data set for each hit will be a time-stamp showing when the hit occurred. A first hit made during a target 12 monitoring session is set to a start time of zero seconds. Time-stamp data for a hit allows a user or shooter to know how quickly each new hit 18 on the target 12 is made in relation to the other, old hits 16.
  • Optionally, the program 22 can score the hits 16, 18 on the target 12. Because the position of each hit relative to the bullseye 14 is known, each hit can be scored. For example, a hit within a one-inch circle around the bullseye 14 could be worth 10 points, a hit from one inch to two inches from the bullseye 14 could be worth nine points, and so on. The scoring could be cumulative and could be used in scoring shooting competitions. A simulated target 12 can be overlaid on the target view of the display 20 to graphically represent the scoring process. The simulated target may match that of a typical paper target with concentric rings around a bullseye, with each ring representing a different point value based on the distance of a hit from the bullseye.
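  • Using the example point values from the preceding paragraph, a ring-based scorer might look like the sketch below; the one-inch ring spacing and ten-point maximum are taken from that example rather than from any defined scoring standard:

```python
def ring_score(distance_in):
    """Score a hit by its distance (in inches) from the bullseye centre.

    A hit within 1 inch scores 10, within 2 inches scores 9, and so on,
    down to 0 points for hits 10 inches or more from the bullseye.
    """
    ring = int(distance_in)  # 0 for under 1 inch, 1 for 1-2 inches, ...
    return max(10 - ring, 0)

# Example: ring_score(0.4) == 10, ring_score(1.5) == 9, ring_score(12.0) == 0
```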
  • In addition to being able to export hit data, all hit information is stored by the program 22 on a continuous basis into a database file (e.g., an MDB file type). As each hit is detected, the program 22 adds the hit to the database. The information stored in the database may include shooter name, weapon name, ammunition caliber, hand being used to shoot with, range of target, hit location, time of hit, and other relevant data desired by a user. Storing this information in the program 22 allows a user to search the database based on the various data fields. For example, all hits by a particular shooter using a particular weapon could be extracted from the database.
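  • The patent stores hit records in an MDB (Microsoft Access) database file; as a stand-in sketch of the same record layout, the example below uses Python's built-in sqlite3 module, and the table and column names are assumptions:

```python
import sqlite3

def store_hit(db_path, record):
    """Append one hit record to the hit database, creating the table if needed.

    record -- dict with keys matching the column names below
    """
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS hits (
               shooter TEXT, weapon TEXT, caliber TEXT, hand TEXT,
               range_yd REAL, x_in REAL, y_in REAL, time_s REAL)"""
    )
    conn.execute(
        "INSERT INTO hits VALUES (:shooter, :weapon, :caliber, :hand,"
        " :range_yd, :x_in, :y_in, :time_s)",
        record,
    )
    conn.commit()
    conn.close()
```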
  • The program 22 may have other optional features. Such an optional feature is that the user may use target overlays from within the program 22, so it is not necessary for the target 12 being viewed through the camera 4 to be displayed in the program 22. Rather, any image can be used as an overlay and placed in the target 12 viewing area of the program 22. Accordingly, the user can place any digital image in the target 12 viewing area, such as a bullseye 14, silhouette, or any other image file the user has available. The program 22 also supports drawing a ringed target overlay, as is typically used in target shooting, over the target 12. With the use of a target overlay, the user can use virtually blank paper for the target 12 while at the same time allowing the shooter and any spectators to see scoring rings over the target 12 from within the program 22 display. Further optionally, as shown in FIG. 1 and FIG. 2, the user can use the program 22 to draw a grid over the target 12 viewing area for reference.
  • Another optional feature of the program 22 is that an estimated distance range from the user's shooting position to the target 12 can be displayed by the program 22. The estimated distance range is accurate to within +/−10% and can be determined in any applicable unit of measurement, such as feet, yards, meters and the like. The user can determine at what range he or she is shooting at the target 12 without the need for an additional, separate range finder unit to be used with the system 2, unless the greater accuracy obtained with a separate range finder is desired by the user.
  • Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Although the present invention has been described with reference to specific embodiments, it is understood that modifications and variations of the present invention are possible without departing from the scope of the invention, which is defined by the claims.

Claims (15)

1. A target shooting system comprising:
a. A target; wherein the target is viewed by;
b. A scope; wherein the scope is interfaced with;
c. A camera; wherein the camera is interfaced with;
d. A computer; and
e. A program executed by the computer; wherein the program determines a location of a hit on the target as a change between a baseline value and a new hit value.
2. The target shooting system of claim 1, further wherein the scope is integral with the camera as a unitary device.
3. The target shooting system of claim 1, further wherein the scope is a separate device interfaced with the camera.
4. The target shooting system of claim 1, further wherein the system is calibrated and a baseline value is determined by means of the program comprising the steps of:
a. Capturing an image of the target by means of the camera;
b. Displaying the image of the target by means of the computer;
c. Selecting within the program a caliber of ammunition;
d. Determining by means of the program a measurement value;
e. Using the measurement value to create by means of the program a map of the target;
f. Storing within the program the map of the target as a baseline value; and
g. Activating by means of the program a calibration feature.
5. The target shooting system of claim 4, further wherein by means of the program a user selects an area of the target to be monitored by the system.
6. The target shooting system of claim 5, further wherein by means of the program a user selects a bullseye within the target area.
7. The target shooting system of claim 4, further wherein the new hit value is determined by means of the program comprising the steps of:
a. Obtaining an image of the target from the camera;
b. Mapping all detectable contours on the target as line segments and bounds boxes;
c. Analyzing each detected contour-bounds box to determine if the contour-bounds box is within a specified range;
d. Comparing each detected contour-bounds box to existing contour-bounds boxes compiled during calibration of the program; and
e. Determining whether each detected contour-bounds box is a new hit value.
8. The target shooting system of claim 4, further wherein the new hit value is determined by means of the program comprising the steps of:
a. Obtaining an image of the target from the camera;
b. Mapping all detectable contours on the target as line segments;
c. Analyzing each detected line segment to determine if the line segment is part of a baseline value;
d. Aligning the baseline value with the image; and
e. Determining whether each detected line segment is a new hit value.
9. The target shooting system of claim 4, further wherein the new hit value is determined by means of the program comprising the steps of:
a. Obtaining an image of the target from the camera;
b. Filtering the image;
c. Detecting any new contour on the target; and
d. Determining whether each new contour is a new hit value.
10. The target shooting system of claim 4, further wherein the new hit value is determined by means of the program comprising the steps of:
a. Obtaining an image of the target from the camera;
b. Filtering the image;
c. Detecting any new contour on the target;
d. Analyzing each detected contour to determine if the detected contour is part of a baseline value;
e. Aligning the baseline value with the image; and
f. Determining whether each detected contour is a new hit value.
11. A target shooting system comprising:
a. A target; wherein the target is viewed by;
b. A scope; wherein the scope is interfaced with;
c. A camera; wherein the camera is interfaced with;
d. A computer; and
e. A program executed by the computer; wherein the program determines a location of a hit on the target as a change between a baseline value and a new hit value; and
further wherein the system is calibrated and a baseline value is determined by means of the program comprising the steps of:
i. Capturing an image of the target by means of the camera;
ii. Displaying the image of the target by means of the computer;
iii. Selecting an area of the target to be monitored by the system;
iv. Selecting a bullseye within the target area;
v. Selecting within the program a caliber of ammunition;
vi. Determining by means of the program a measurement value;
vii. Using the measurement value to create by means of the program a map of the target;
viii. Storing within the program the map of the target as a baseline value; and
ix. Activating by means of the program a calibration feature.
12. The target shooting system of claim 11, further wherein the new hit value is determined by means of the program comprising the steps of:
a. Obtaining an image of the target from the camera;
b. Mapping all detectable contours on the target as line segments and bounds boxes;
c. Analyzing each detected contour-bounds box to determine if the contour-bounds box is within a specified range;
d. Comparing each detected contour-bounds box to existing contour-bounds boxes compiled during calibration of the program; and
e. Determining whether each detected contour-bounds box is a new hit value.
13. The target shooting system of claim 11, further wherein the new hit value is determined by means of the program comprising the steps of:
a. Obtaining an image of the target from the camera;
b. Mapping all detectable contours on the target as line segments;
c. Analyzing each detected line segment to determine if the line segment is part of a baseline value;
d. Aligning the baseline value with the image; and
e. Determining whether each detected line segment is a new hit value.
14. The target shooting system of claim 11, further wherein the new hit value is determined by means of the program comprising the steps of:
a. Obtaining an image of the target from the camera;
b. Filtering the image;
c. Detecting any new contour on the target; and
d. Determining whether each new contour is a new hit value.
15. The target shooting system of claim 11, further wherein the new hit value is determined by means of the program comprising the steps of:
a. Obtaining an image of the target from the camera;
b. Filtering the image;
c. Detecting any new contour on the target;
d. Analyzing each detected contour to determine if the detected contour is part of a baseline value;
e. Aligning the baseline value with the image; and
f. Determining whether each detected contour is a new hit value.
US13/082,316 2011-04-07 2011-04-07 Target Shooting System Abandoned US20120258432A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/082,316 US20120258432A1 (en) 2011-04-07 2011-04-07 Target Shooting System

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/082,316 US20120258432A1 (en) 2011-04-07 2011-04-07 Target Shooting System

Publications (1)

Publication Number Publication Date
US20120258432A1 true US20120258432A1 (en) 2012-10-11

Family

ID=46966385

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/082,316 Abandoned US20120258432A1 (en) 2011-04-07 2011-04-07 Target Shooting System

Country Status (1)

Country Link
US (1) US20120258432A1 (en)

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8613619B1 (en) * 2006-12-05 2013-12-24 Bryan S. Couet Hunter training system
US20130341869A1 (en) * 2012-01-18 2013-12-26 Jonathan D. Lenoff Target Shot Placement Apparatus and Method
US20140106311A1 (en) * 2012-10-16 2014-04-17 Nicholas Chris Skrepetos System, Method, and Device for electronically displaying one shot at a time from multiple target shots using one physical target
US20140182186A1 (en) * 2011-03-15 2014-07-03 David A. Stewart Video camera gun barrel mounting and programming system
US20140367918A1 (en) * 2013-05-21 2014-12-18 Gregory T. Mason Mason Target System
US20150123346A1 (en) * 2013-05-21 2015-05-07 Gregory T Mason Mason Target System
US20150140518A1 (en) * 2013-11-15 2015-05-21 Teresa Ann Horning Training device
WO2015054705A3 (en) * 2013-10-07 2015-05-28 Mil-Spec Designs Llc Optical relay system and method
EP3034987A1 (en) * 2014-12-18 2016-06-22 Cosmonio Ltd System for identifying a position of impact of a weapon shot on a target
US9435617B2 (en) 2014-10-29 2016-09-06 Valentin M. Gamerman Audible targeting system
US20160313097A1 (en) * 2015-01-20 2016-10-27 Brian D. Miller Electronic audible feedback bullet targeting system
RU2614204C1 (en) * 2016-01-19 2017-03-23 Федеральное государственное унитарное предприятие "Центральный институт авиационного моторостроения имени П.И. Баранова" Method of thrower angle aiming for projectile throwing
RU2614344C1 (en) * 2016-01-19 2017-03-24 Федеральное государственное унитарное предприятие "Центральный институт авиационного моторостроения имени П.И. Баранова" Method of thrower aiming for projectile throwing and system for its implementation
US9759530B2 (en) 2014-03-06 2017-09-12 Brian D. Miller Target impact sensor transmitter receiver system
US20180031353A1 (en) * 2012-10-16 2018-02-01 Nicholas Chris Skrepetos Systems, methods, and devices for electronically displaying individual shots from multiple shots on one physical target
US9891028B2 (en) 2016-06-22 2018-02-13 Rod Ghani Shooting game for multiple players with dynamic shot position recognition on a paper target
US20180353864A1 (en) * 2017-06-08 2018-12-13 Visual Shot Recognition Gaming, LLC Live Fire Gaming System
JP6445647B1 (en) * 2017-09-27 2018-12-26 株式会社Vision&Works Landing determination apparatus, landing determination method, program, and shooting training apparatus
US10260839B1 (en) 2018-09-03 2019-04-16 Rod Ghani Multiview display for aiming a weapon in accuracy training
US20190154401A1 (en) * 2016-07-17 2019-05-23 Erange Corporation Shooting training system
US20190226807A1 (en) * 2016-06-06 2019-07-25 Thomas R. Boyer System, method and app for automatically zeroing a firearm
US20200088500A1 (en) * 2018-09-19 2020-03-19 Andrew J. Lebbing Target verification and scoring system
US10648781B1 (en) * 2017-02-02 2020-05-12 Arthur J. Behiel Systems and methods for automatically scoring shooting sports
US10670373B2 (en) 2017-11-28 2020-06-02 Modular High-End Ltd. Firearm training system
US20200269117A1 (en) * 2020-05-07 2020-08-27 Eugene Mallory Golf Swing Improvement Aid
EP3034983B1 (en) 2014-12-19 2020-11-18 Diehl Defence GmbH & Co. KG Automatic gun
US10876819B2 (en) 2018-09-03 2020-12-29 Rod Ghani Multiview display for hand positioning in weapon accuracy training
US10982934B2 (en) 2017-01-27 2021-04-20 Robert Dewey Ostovich Firearms marksmanship improvement product and related system and methods
CN113048844A (en) * 2021-03-09 2021-06-29 山东大学 Low-power consumption intelligent target shooting identification method and system based on audio signal control
US20210396499A1 (en) * 2020-02-03 2021-12-23 Focaltron Corporation Smart shooting system based on image subtraction and knowledge-based analysis engine and method therefor
US11433313B2 (en) 2017-06-08 2022-09-06 Visual Shot Recognition Gaming, LLC Live fire gaming system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5991043A (en) * 1996-01-08 1999-11-23 Tommy Anderson Impact position marker for ordinary or simulated shooting
US20020012898A1 (en) * 2000-01-13 2002-01-31 Motti Shechter Firearm simulation and gaming system and method for operatively interconnecting a firearm peripheral to a computer system
US7329127B2 (en) * 2001-06-08 2008-02-12 L-3 Communications Corporation Firearm laser training system and method facilitating firearm training for extended range targets with feedback of firearm control
US20030082502A1 (en) * 2001-10-29 2003-05-01 Stender H. Robert Digital target spotting system
US7292262B2 (en) * 2003-07-21 2007-11-06 Raytheon Company Electronic firearm sight, and method of operating same
US20080163536A1 (en) * 2005-03-18 2008-07-10 Rudolf Koch Sighting Mechansim For Fire Arms

Cited By (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8613619B1 (en) * 2006-12-05 2013-12-24 Bryan S. Couet Hunter training system
US9267761B2 (en) * 2011-03-15 2016-02-23 David A. Stewart Video camera gun barrel mounting and programming system
US9546846B2 (en) 2011-03-15 2017-01-17 David A. Stewart Video camera gun barrel mounting system
US20140182186A1 (en) * 2011-03-15 2014-07-03 David A. Stewart Video camera gun barrel mounting and programming system
US20130341869A1 (en) * 2012-01-18 2013-12-26 Jonathan D. Lenoff Target Shot Placement Apparatus and Method
US10247517B2 (en) * 2012-10-16 2019-04-02 Nicholas Chris Skrepetos Systems, methods, and devices for electronically displaying individual shots from multiple shots on one physical target
US9829286B2 (en) * 2012-10-16 2017-11-28 Nicholas Chris Skrepetos System, method, and device for electronically displaying one shot at a time from multiple target shots using one physical target
US20180031353A1 (en) * 2012-10-16 2018-02-01 Nicholas Chris Skrepetos Systems, methods, and devices for electronically displaying individual shots from multiple shots on one physical target
US20140106311A1 (en) * 2012-10-16 2014-04-17 Nicholas Chris Skrepetos System, Method, and Device for electronically displaying one shot at a time from multiple target shots using one physical target
US20150123346A1 (en) * 2013-05-21 2015-05-07 Gregory T Mason Mason Target System
US20140367918A1 (en) * 2013-05-21 2014-12-18 Gregory T. Mason Mason Target System
WO2015054705A3 (en) * 2013-10-07 2015-05-28 Mil-Spec Designs Llc Optical relay system and method
US20150140518A1 (en) * 2013-11-15 2015-05-21 Teresa Ann Horning Training device
US9759530B2 (en) 2014-03-06 2017-09-12 Brian D. Miller Target impact sensor transmitter receiver system
US9435617B2 (en) 2014-10-29 2016-09-06 Valentin M. Gamerman Audible targeting system
EP3034987A1 (en) * 2014-12-18 2016-06-22 Cosmonio Ltd System for identifying a position of impact of a weapon shot on a target
EP3034983B2 (en) 2014-12-19 2024-01-24 Diehl Defence GmbH & Co. KG Automatic gun
EP3034983B1 (en) 2014-12-19 2020-11-18 Diehl Defence GmbH & Co. KG Automatic gun
US20160313097A1 (en) * 2015-01-20 2016-10-27 Brian D. Miller Electronic audible feedback bullet targeting system
US10458758B2 (en) * 2015-01-20 2019-10-29 Brian D. Miller Electronic audible feedback bullet targeting system
RU2614344C1 (en) * 2016-01-19 2017-03-24 Федеральное государственное унитарное предприятие "Центральный институт авиационного моторостроения имени П.И. Баранова" Method of thrower aiming for projectile throwing and system for its implementation
RU2614204C1 (en) * 2016-01-19 2017-03-23 Федеральное государственное унитарное предприятие "Центральный институт авиационного моторостроения имени П.И. Баранова" Method of thrower angle aiming for projectile throwing
US20190226807A1 (en) * 2016-06-06 2019-07-25 Thomas R. Boyer System, method and app for automatically zeroing a firearm
US9891028B2 (en) 2016-06-22 2018-02-13 Rod Ghani Shooting game for multiple players with dynamic shot position recognition on a paper target
US10060713B2 (en) 2016-06-22 2018-08-28 Rod Ghani Shooting game for multiple players with dynamic shot position recognition on a paper target
US10502531B2 (en) * 2016-07-17 2019-12-10 Erange Corporation Shooting training system
US20190154401A1 (en) * 2016-07-17 2019-05-23 Erange Corporation Shooting training system
US10982934B2 (en) 2017-01-27 2021-04-20 Robert Dewey Ostovich Firearms marksmanship improvement product and related system and methods
US10648781B1 (en) * 2017-02-02 2020-05-12 Arthur J. Behiel Systems and methods for automatically scoring shooting sports
US20180353864A1 (en) * 2017-06-08 2018-12-13 Visual Shot Recognition Gaming, LLC Live Fire Gaming System
US11433313B2 (en) 2017-06-08 2022-09-06 Visual Shot Recognition Gaming, LLC Live fire gaming system
JP6445647B1 (en) * 2017-09-27 2018-12-26 株式会社Vision&Works Landing determination apparatus, landing determination method, program, and shooting training apparatus
US10670373B2 (en) 2017-11-28 2020-06-02 Modular High-End Ltd. Firearm training system
US10876819B2 (en) 2018-09-03 2020-12-29 Rod Ghani Multiview display for hand positioning in weapon accuracy training
US10260839B1 (en) 2018-09-03 2019-04-16 Rod Ghani Multiview display for aiming a weapon in accuracy training
US20200088500A1 (en) * 2018-09-19 2020-03-19 Andrew J. Lebbing Target verification and scoring system
US20210396499A1 (en) * 2020-02-03 2021-12-23 Focaltron Corporation Smart shooting system based on image subtraction and knowledge-based analysis engine and method therefor
US20200269117A1 (en) * 2020-05-07 2020-08-27 Eugene Mallory Golf Swing Improvement Aid
CN113048844A (en) * 2021-03-09 2021-06-29 山东大学 Low-power consumption intelligent target shooting identification method and system based on audio signal control

Similar Documents

Publication Title
US20120258432A1 (en) Target Shooting System
US11391542B2 (en) Apparatus and method for calculating aiming point information
US9829286B2 (en) System, method, and device for electronically displaying one shot at a time from multiple target shots using one physical target
US8651381B2 (en) Firearm sight having an ultra high definition video camera
US20190063884A1 (en) Systems and methods for automated shooting evaluation
US10247517B2 (en) Systems, methods, and devices for electronically displaying individual shots from multiple shots on one physical target
EP3034987A1 (en) System for identifying a position of impact of a weapon shot on a target
SE506468C2 (en) Hit position marker for shotgun shooting
EA031066B1 (en) Firearm aiming system (embodiments) and method of operating the firearm
US20160298930A1 (en) Target practice system
US20070254266A1 (en) Marksmanship training device
US20180202775A1 (en) Shooting Game for Multiple Players with Dynamic Shot Position Recognition and Remote Sensors
CN109839035B (en) Accurate positioning method of target scoring system and target scoring system
ATE9514T1 (en) TARGETING AND SHOOTING TRAINING DEVICE.
US20190226807A1 (en) System, method and app for automatically zeroing a firearm
US10876819B2 (en) Multiview display for hand positioning in weapon accuracy training
CN213599936U (en) Day and night intelligent sighting device
RU88790U1 (en) MULTIMEDIA INTERACTIVE RUNNING DASH
EP2746716A1 (en) Optical device including a mode for grouping shots for use with precision guided firearms
KR102011765B1 (en) Method and apparatus for aiming target
KR20210155931A (en) Method for Aiming Moving Target and Apparatus for Aiming them
WO2024049898A1 (en) Camera detection of point of impact of a projectile with a physical target
GB2622946A (en) Method of and apparatus for adding digital functionality to a scope
WO2023042195A1 (en) Smart aiming device with built-in training system for marksmanship and firearm operation
KR20200084077A (en) impact point detection method of shooting system with bullet ball pellet

Legal Events

Date Code Title Description
AS Assignment

Owner name: OUTWEST SYSTEMS, INC., COLORADO

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WEISSLER, PAUL;REEL/FRAME:026094/0169

Effective date: 20110404

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION