GB2617532A - Targets - Google Patents

Targets

Info

Publication number
GB2617532A
Authority
GB
United Kingdom
Prior art keywords
target
state
sensors
domain
moves
Prior art date
Legal status
Pending
Application number
GB2112974.7A
Other versions
GB202112974D0 (en)
Inventor
Harrison Matthew
Crowley James
Megawarne Justin
Stasik Bartosz
Jacobs Timothy
Current Assignee
4GD Ltd
Original Assignee
4GD Ltd
Priority date
Filing date
Publication date
Application filed by 4GD Ltd filed Critical 4GD Ltd
Priority to GB2112974.7A priority Critical patent/GB2617532A/en
Publication of GB202112974D0 publication Critical patent/GB202112974D0/en
Priority to PCT/IB2022/058517 priority patent/WO2023037312A1/en
Publication of GB2617532A publication Critical patent/GB2617532A/en


Classifications

    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 - WEAPONS
    • F41J - TARGETS; TARGET RANGES; BULLET CATCHERS
    • F41J9/00 - Moving targets, i.e. moving when fired at
    • F41J9/02 - Land-based targets, e.g. inflatable targets supported by fluid pressure
    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 - WEAPONS
    • F41J - TARGETS; TARGET RANGES; BULLET CATCHERS
    • F41J5/00 - Target indicating systems; Target-hit or score detecting systems
    • F41J5/06 - Acoustic hit-indicating systems, i.e. detecting of shock waves
    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 - WEAPONS
    • F41J - TARGETS; TARGET RANGES; BULLET CATCHERS
    • F41J7/00 - Movable targets which are stationary when fired at
    • F41J7/04 - Movable targets which are stationary when fired at disappearing or moving when hit
    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 - WEAPONS
    • F41J - TARGETS; TARGET RANGES; BULLET CATCHERS
    • F41J7/00 - Movable targets which are stationary when fired at
    • F41J7/06 - Bobbing targets, i.e. targets intermittently or unexpectedly appearing

Abstract

A target comprises one or more domain sensors and one or more domain actuators 202. The domain actuators 202 control actions of the target. The target has a current behavioural state selected from a plurality of possible behavioural states. The one or more domain actuators 202 control the actions of the target at least in part based on the current behavioural state of the target. The current behavioural state may be selected based at least in part on one or more outputs from one or more of the domain sensors.

Description

TARGETS
Field of Invention
[0001] The present invention relates to targets for training, in particular, although not necessarily exclusively, to targets for projectiles fired by non-ballistic ammunition such as paint marker rounds.
Some embodiments of the invention are particularly suited for use in simulation of military activities, including for example for use in urban warfare training facilities.
Background
[0002] Armed forces use training facilities to train their units for combat in the various environments in which they are likely to encounter action. For urban warfare training or other close quarters combat training, facilities are known that are designed to simulate urban and other close quarters environments. These training facilities make use of targets, positioned in the training environment, to represent enemy personnel.
[0003] It is desirable to simulate events in training facilities in a manner that is as realistic as possible, to provide an immersive experience for users of the facility.
[0004] The targets used in existing training facilities are typically very basic. Generally, they will either be static, or 'twist' or 'flip up' to engage a user of the training facility and respond to an impact by a projectile (e.g. from live or non-ballistic ammunition) by twisting back to side-on or flipping back down. The specific location of hits on the target will often be determined after the event by marks on the target, for example holes in or paint marks on a paper target. More realistic targets and targets which are capable of automatically collecting information about the fall of shot would be beneficial.
Summary
[0005] In general, embodiments of the invention are concerned with addressing at least some of the shortcomings of known targets by providing targets that behave and respond in a more realistic manner and automatically collect information about the point of impact of a projectile or projectiles.
[0006] In some embodiments, more realistic target behaviour is achieved using a system of sensors on the target that can be used to determine, in 'real-time' (i.e. within a few milliseconds), where on the target a projectile has impacted. This location determination can, for example, be used to control subsequent behaviour of the target.
[0007] In the same and/or some other embodiments, more realistic target behaviour is achieved by controlling actions of the target based on a current behavioural state of the target, where the behavioural state is itself determined based on inputs from one or more sensors and/or other external inputs.
[0008] In a first aspect, the invention provides a physical target for detecting the impact of a projectile, the target comprising: a body; a plurality of sensors mounted on the body for detecting an impact of a projectile on the body, each sensor providing an output representing a detected impact; and a processor configured to receive the sensor outputs and to determine from the sensor outputs a location at which the projectile impacts the body.
[0009] In some embodiments, at least some of the plurality of sensors are acoustic wave sensors to detect acoustic waves caused by impact of the projectile on the body. The sensors may, for example, be piezoelectric sensors and/or microphones.
[0010] The target body may take any of a number of appropriate shapes and forms. In some embodiments the target body may be flat. In some other embodiments, the target body may have a shape that approximates the shape of the whole or one or more portions of a human body.
[0011] In some embodiments a surface of the body includes two or more defined zones and the determination of the location of the impact is a determination of which one of the two or more zones the projectile has impacted. The target may, for example, include two or more discrete panels, each of which corresponds to one of the two or more defined zones (with each panel having one or more of the sensors mounted thereon). The processor may, for example, determine the zone that the projectile has impacted based on the location of the sensor or sensors that have the highest magnitude response to the impact of the projectile.
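By way of illustration only, the zone-selection rule described above might be sketched as follows in Python. The zone names and the shape of the readings structure are assumptions made for the example, not details taken from the embodiments.

```python
# Minimal sketch of zone selection by peak sensor response. The zone
# names and the `readings` layout are illustrative assumptions.

def impacted_zone(readings: dict[str, list[float]]) -> str:
    """Return the zone whose sensors report the largest peak magnitude."""
    return max(readings, key=lambda zone: max(readings[zone]))

# Example: a hit near the head produces the strongest response there.
readings = {
    "head": [0.91, 0.78],
    "torso-centre": [0.22, 0.31],
    "abdomen": [0.05, 0.09],
}
assert impacted_zone(readings) == "head"
```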
[0012] In some other embodiments, the determination of the location of the impact is a determination of a specific location on the body. For example, the processor may calculate the specific location on the body from the sensor outputs using techniques such as multilateration. This approach can also be used to provide a determination of which zone on a body the projectile impacted, for example by first determining the location of the impact and then determining which of a plurality of zones that location falls within.
[0013] In some embodiments in which the projectile is determined to have impacted the body of the target in a predefined zone, the zones are categorised as one of a plurality of defined categories and an output from the processor includes the category of the zone that the projectile has impacted. The categories may, for example, include one or more of a kill zone category, a maim zone category and a pain zone category. In some embodiments, a behaviour of the target subsequent to impact by a projectile is determined at least in part by the category of the zone impacted by the projectile.
[0014] In some embodiments, the processor may also determine a magnitude of the projectile impact based on the sensor outputs. In this case, a behaviour of the target subsequent to impact may take into account the magnitude of the impact.
[0015] In some embodiments, the target further comprises one or more environmental sensors, which may, for example, be mounted on the target body. The sensors may include, for example, one or more of sensors for detecting sound in the proximity of the target; sensors for detecting motion in the proximity of the target; sensors for detecting objects in the proximity of the target; and sensors for capturing images of the target surroundings.
[0016] In some embodiments, the environmental sensors include one or more microphones. These may, for example, be used to detect sounds in the environment around the target, including sounds such as foot fall of an approaching person, talking, shouting or gunfire. A plurality of microphones may be used, for example, to determine a direction from which a sound is received.
[0017] In some embodiments, the environmental sensors include one or more cameras, which can be used, for example, to detect motion in the vicinity of the target. In some cases, two (or more) cameras and/or a depth-sensing camera can be used to determine the distance of perceived objects from the target as well as their direction of movement.
[0018] In some embodiments, the target includes one or more actuators to control actions of the target. The actions controlled by the actuators may, for example, include one or more of movement of the target; sound emission from the target; and firing ammunition from the target.
[0019] The actuators can, for example, cause movement of the target body in any of a number of different directions, for instance rotation, yaw, pitch, roll or translation along or relative to one or more axes. In some embodiments, the movements include one or more of rotation of the body about a vertical axis of the target body, pitch of the target relative to the vertical axis and change in vertical position of the body along the vertical axis. The term vertical axis used here is intended to be taken as an axis that extends vertically through the body in its normal orientation, typically from a base of the target to a head of the target, and not necessarily vertical in space. A combination of movements can be employed, for example, to move the target in a more realistic manner, in response to inputs, for instance turning towards a sound or moving object, 'peeking' around a corner (by pitching forwards or sidewards), as well as dropping vertically, for example when hit by a projectile in a kill zone.
[0020] In some embodiments, the actuators are controlled to cause actions of the target in response to outputs from one or more of the sensors on the target. The target can, for example, be assigned one of a plurality of target states, with the actions of the target in response to outputs from the sensors controlled based at least in part on the current state of the target. The current state of the target can itself be determined at least in part based on earlier outputs from one or more of the sensors on the target.
[0021] Whilst embodiments of the invention can be used with a great variety of projectiles, some embodiments are particularly suited for use with projectiles fired by non-ballistic ammunition, for example paint marker rounds, non-marking rounds, plastic airsoft rounds, air rifle pellets, rubber bullets and bean bag rounds.
[0022] In a second aspect the invention provides a target comprising one or more domain sensors and one or more domain actuators, the domain actuators controlling actions of the target; the target having a current behavioural state, selected from a plurality of possible behavioural states; and the one or more domain actuators controlling the actions of the target at least in part based on the current behavioural state of the target.
[0023] In some embodiments, the current behavioural state of the target is selected based at least in part on one or more outputs from one or more of the domain sensors.
[0024] In some embodiments, the target moves from one state to another state based at least in part on one or more of the outputs from the one or more domain sensors.
[0025] In some embodiments, the target moves from one state to another based at least in part on an elapsed period of time.
[0026] In some embodiments, the target moves from one state to another based on an external input, for example from a control device or another target.
[0027] In some embodiments, in addition to the control of the actuators being based (in part) on the behavioural state of the target, it is also based (at least in part) on one or more outputs from the one or more domain sensors. In some cases, the specific response to the sensor outputs may be modified based on the current state of the target.
[0028] In some embodiments, the possible target states include any one or more of idle; alert; attacking; stunned; weakly attacking; subdued (injured or surrendered); and dead (killed), as described further below.
[0029] In some embodiments, the target moves from an idle state to an alert state based on the domain sensors having detected sound or movement.
[0030] In some embodiments, the target moves from the alert state back to the idle state after a predetermined period of time has elapsed in the alert state.
[0031] In some embodiments, the target moves from an alert state to an attacking state based on the domain sensors having detected further movement or further sound.
[0032] In some embodiments, the target moves to a stunned state based on the domain sensors having detected a flashbang detonation in the proximity of the target. The flashbang detonation may, for example, be detected based on a combination of outputs from the domain sensors, including one or more domain sensors detecting light (e.g. image intensity) above a predetermined intensity at the same time as one or more domain sensors detecting sound in a predetermined frequency range associated with the detonation of a flashbang.
[0033] In some embodiments, the target moves from the stunned state to a weakly attacking state after a predefined period of time has elapsed from detection of the flashbang condition.
[0034] In some embodiments, the target moves to a subdued state based at least in part on one or more of the domain sensors detecting an impact from a projectile to a maim zone on the target.
[0035] In some embodiments, the target moves to a subdued state based at least in part on a surrender decision. The surrender decision may, in some cases, be made by the target autonomously. Alternatively, the surrender decision may be provided by an external input or based on predefined criteria, such as time.
[0036] In some embodiments, the target moves to a killed state based at least in part on one or more of the domain sensors detecting an impact or impacts from a projectile or projectiles to a kill zone on the target.
[0037] Similarly to the first aspect above, the domain sensors may include any one or more of: sensors for detecting an impact of a projectile on the target; sensors for detecting sound in the proximity of the target; sensors for detecting motion in the proximity of the target; sensors for detecting objects in the proximity of the target; and sensors for capturing images of the target surroundings.
[0038] Also similarly to the first aspect above, the actions of the target controlled by the domain actuators may include any one or more of: movement of the target; sound emission from the target; and firing ammunition from the target.
[0039] With regards to both of the aspects discussed above, in some embodiments it will be desirable to record data from the sensors and/or the actuators. The data could be recorded locally at the target, for later upload to a server or for later direct interrogation from the target. Alternatively or additionally, the data could be uploaded to a server as it is collected, e.g. in real time. In some embodiments, some or all of the data is timestamped, for example so that the specific times of the projectile impacts, other sensor triggers and target actions are known and can be synchronised in time with data collected elsewhere in the training facility.
[0040] In this way, targets in accordance with some embodiments of the invention can serve as data collection devices, collecting a rich, time-stamped, set of data either for providing instant feedback and/or triggering of one or more events, or for later analysis, for example to monitor the performance of subjects using the training environment in which the targets are installed or to monitor the operation of the training facility itself.
[0041] Targets in accordance with some embodiments of one or both aspects of the present invention can be used in facilities such as those described in our co-pending UK patent application no. GB20008971.0.
[0042] Targets in accordance with some embodiments of one or both aspects of the present invention can also be used in conjunction with systems such as those described in our co-pending UK patent applications nos. GB2101083.0 and GB2101084.8 in the context of monitoring the behaviour of subjects and allowing virtual and real events to take place contemporaneously and to influence each other.
[0043] The skilled person will appreciate that the features described and defined in connection with the aspects of the invention and the embodiments thereof may be combined in any combination, regardless of whether the specific combination is expressly mentioned herein. Thus, all such combinations are considered to be made available to the skilled person.
Brief Description of the Drawings
[0044] Embodiments of the invention will be described, by way of example only and with reference to the following drawings, in which:
[0045] Figure 1 illustrates a smart target in accordance with an embodiment of the present invention;
[0046] Figures 2a and 2b respectively show a section through the smart target of figure 1 and a view of the target with its outer shell removed;
[0047] Figure 3 schematically illustrates components of the smart target of figure 1;
[0048] Figure 4 schematically illustrates components for fall of shot detection in the smart target of figure 1;
[0049] Figure 5 is a flowchart showing how fall of shot is detected by the smart target of figure 1;
[0050] Figure 6 is a flowchart showing an alternate approach to fall of shot detection for the smart target of figure 1; and
[0051] Figures 7a and 7b show behavioural states and end states for the smart target of figure 1.
Detailed Description
[0052] Embodiments of the present invention are described below by way of example only.
[0053] In the exemplified embodiments, a "smart" target is provided that is able to respond in a more realistic manner, both to the impact of projectiles and to inputs sensed from the target's surrounding environment, including sounds and movements that indicate, for example, approaching combatants. This is achieved using a sensor system, comprising a plurality of domain sensors, in combination with an actuator system, comprising multiple actuators to control actions of the target, as explained below.
[0054] To further improve the realism of the target response, a series of behavioural states are employed. The targets are capable at any given time of being assigned one of a number of different behavioural states, based for example on previously received sensor inputs and other parameters. The actions of the target in response to further inputs are determined, at least in part, by the current behavioural state of the target, as explained further below.
[0055] Figures 1, 2a and 2b illustrate one exemplar target in accordance with an embodiment of the invention.
[0056] The target 101 includes a base 102 and a body 103, which is mounted on the base by way of a support post 104. The body 103 comprises a support frame 201 and an outer shell 105 mounted on the frame 201. As seen in the figures, in this example the shell 105 is shaped to mimic the shape of a human torso and head.
[0057] In this example, the body 103 is able to rotate about its vertical axis, lower and raise along the same axis and also flip down (and back up).
[0058] More specifically, the support post 104 is mounted for rotation (about its longitudinal axis, corresponding in this case to a vertical axis of the target) relative to the base 102 on which it is mounted. An actuator (not shown) is housed in the base to drive rotation of the support post 104, which in turn causes the target body 103 to rotate. In addition, a further actuator (also not shown) is operable to flip the base relative to the ground, about pivot 106, to lay the target flat on the floor (and flip it back to a standing position) as is known with existing targets. Vertical movement of the target body 103 on the support post, to lower and raise the body 103 relative to the base 102, is controlled by an actuator 202 (shown in figs. 2a and 2b) mounted within the body 103.
[0059] In other embodiments, additional or alternative movements may be implemented using further actuators.
[0060] In addition to movement, the target includes further domain actuators to provide for other actions including sound generation and firing of (non-ballistic) projectiles from the target. More specifically, a speaker 203 mounted on the support frame 201 within the body 103 is driven to project sounds (e.g. shouting or screeching) from the target and a modified BB or Airsoft gun 204 can be mounted within the target body 103 and actuated to fire projectiles outwardly from the target in the direction in which the target is facing.
[0061] The target also incorporates a plurality of domain sensors for: detecting an impact of a projectile on the target; detecting sound in the proximity of the target; detecting motion in the proximity of the target; detecting objects in the proximity of the target; and capturing images of the target surroundings (including simple detection of light intensity).
[0062] More specifically, the exemplified target includes an array of piezoelectric sensors (not shown) mounted in the outer shell 105 of the body to detect the impact of projectiles on the outer shell 105, more specifically the location of the impact, as explained in more detail below.
[0063] A camera 205 is mounted facing forwards in the head portion of the target body 103, which can be used to detect motion, detect specific objects (using appropriate image processing techniques, for instance suitable image processing for person detection) and to capture images (including detecting light intensity, e.g. above a predetermined intensity threshold). An array of microphones (not shown) is also incorporated into the body, for example mounted at the top of the head portion of the outer shell 105, to detect sound in the proximity of the target. By using an array of microphones, it is also possible to determine the direction from which the sound is arriving at the target.
[0064] The outputs from the sensors can be used to control actions of the target in response to external stimulus. For example, the target could be controlled to turn towards sounds that are detected by the microphones, to track movement of a target using images captured by the camera, and to screech, speak, scream and/or fire projectiles at a detected object. The target can also be controlled to react to projectile impacts in different ways depending on the location of the impact, as explained below.
[0065] The exemplified target is intended for use with non-ballistic ammunition, so in this case the outer shell 105 is formed from a material that is resistant to impact from projectiles such as paint marker rounds, non-marking rounds, plastic airsoft rounds, air rifle pellets, rubber bullets and bean bag rounds. The material is selected, however, to provide adequate transmission of acoustic waves, caused by impact of a projectile, through the shell 105 to the piezoelectric sensors mounted on the shell 105. One suitable material is a polycarbonate.
[0066] Figure 3 schematically illustrates main components of the smart target.
[0067] The base unit, as already discussed above, includes two actuators for respectively rotating and flipping the support post (and the target body connected to it), referred to as a twist motor and flip motor. The base also includes a battery for providing power to the motors and for powering the other electrical and electronic components of the target. Alternatively or additionally, the base unit may be connected to an external source of power. The base unit also includes a control unit that provides connections between various system components as well as wireless communication with the facility in which the target is located. This enables, for example, communication with a central controller, e.g. sending time-stamped telemetry data enabling later analysis of the operation of the target.
[0068] As noted above, the target body is mounted on the support post and the vertical position of the body on the post can be controlled to fall (i.e. drop), for example to simulate the target crouching or when the target is "killed", and to raise again. The actuator for this motion may, for example, include an electromagnetic 'clamp' that holds the body towards the top of the post and which can be released to allow the body to drop down the post.
[0069] The base unit also provides power to and routes control signals to an electrically actuated BB or Airsoft gun mounted within the body (referred to in figure 3 as the "SimStriker Distractor").
[0070] The outer shell of the target body (referred to in figure 3 as the "SimStriker Panel") is, in this example, configured with five distinct hit zones: one in the head area (Zone 1), one on either side of the torso (Zones 2), one in the centre of the torso (Zone 3) and one in the abdomen area (Zone 4). These zones may be provided by discrete panels on the outer shell or as delineated zones within a single panel. In other embodiments there may be more zones or fewer zones than illustrated here.
[0071] Each zone includes at least one sensor for detecting impacts on the outer shell of the target, in this example piezoelectric sensors. In practice, it will normally be desirable for each zone to include multiple piezoelectric sensors. Output signals from the sensors are processed by a microcontroller, in the manner described further below, to determine the location of a projectile impact on the outer shell (either the zone or a more precise location on the shell).
[0072] The head region of the target body includes environmental sensors for detecting sound and movement in the vicinity of the target, specifically a camera and an array of microphones. A speaker is also incorporated, as noted above, to output sounds, such as a simulated shout or screech.
[0073] The output signals from the environmental sensors are transmitted to a processing unit (labelled "SimStriker Detector" in figure 3). The processing unit is programmed to control the various actuators associated with the target (movement actuators, speaker and Distractor) based on sensor inputs, including the environmental sensors and the sensors detecting impacts of projectiles on the target.
[0074] Figure 4 shows in more detail the components that participate in detecting projectile impacts (sometimes referred to as "fall of shot") on the outer shell of the target body. In this example, each of the hit zones on the shell is shown to have four piezoelectric sensors. The signals from these sensors are provided as inputs to a microcontroller (the "Piezo µC" of figure 4, with its "Zone Disambiguator") that, in this example, operates to determine, from the sensor signals, which of the zones has been hit. In this example, a dedicated microcontroller is used exclusively for this function, which can provide benefits in terms of processing speed. In other embodiments, a more general microprocessor or other processing device may be used, for example a processor that performs other functions too.
[0075] The output from the microcontroller, identifying the hit zone, is transmitted to the base unit, from where it can be routed to the main processing unit (the "SimStriker Detector") or to elsewhere in the facility.
[0076] The process for identifying and reporting the zone impacted by a projectile is explained in more detail with reference to figure 5. In this figure, the steps outlined in dashed lines are carried out by the microcontroller and the steps outlined in solid lines are carried out by the target's main processing unit.
[0077] As shown, the outputs from the sensors on the body shell are monitored until a voltage spike is detected on any one of the sensors. If the voltage spike is below a predetermined threshold, no further action is taken, unless and until there is another spike. If, on the other hand, the voltage spike is above the predetermined threshold, signifying a projectile impact, the output signal (sensor power) from each sensor is aggregated for a predetermined period of time (X microseconds) and then the output signal for a subsequent predetermined period of time (Y microseconds) for each sensor is discarded so as to ignore reflections of the acoustic waves following the impact. The aggregated sensor powers for each sensor are then ranked and, based on this ranking, the most likely hit zone is identified and reported to the target's main processing unit. The main processing unit then determines appropriate behaviour for the target, based on the reported hit zone and, where appropriate, instigates one or more target actions.
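By way of illustration, the trigger, aggregate, discard and rank flow described above might be sketched as follows in Python. The threshold, the window lengths and the read_sensors/zone_of helpers are assumptions for the sketch; the real logic would run on the dedicated microcontroller at far higher sampling rates than a Python loop can achieve.

```python
import time

SPIKE_THRESHOLD = 0.5  # volts; assumed trigger level
AGGREGATE_US = 500     # "X microseconds" aggregation window (assumed)
DISCARD_US = 2000      # "Y microseconds" reflection dead-time (assumed)

def disambiguate_zone(read_sensors, zone_of):
    """Poll the sensors; on a spike, aggregate power per sensor, discard
    the reflection window, rank the totals and return the likely zone.

    `read_sensors()` returns the instantaneous voltage of every sensor
    as a list; `zone_of(i)` maps sensor index i to its hit zone. Both
    are assumed helpers standing in for the real acquisition hardware.
    """
    samples = read_sensors()
    if max(samples) < SPIKE_THRESHOLD:
        return None  # below threshold: not treated as an impact

    totals = [0.0] * len(samples)
    deadline = time.monotonic() + AGGREGATE_US / 1e6
    while time.monotonic() < deadline:
        for i, v in enumerate(read_sensors()):
            totals[i] += v * v  # accumulate signal power

    time.sleep(DISCARD_US / 1e6)  # ignore acoustic reflections

    strongest = max(range(len(totals)), key=totals.__getitem__)
    return zone_of(strongest)
```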
[0078] As discussed further below, the actions may be determined in part by a current state of the target as dictated by a behavioural state machine that provides an input to the main processing unit. The hit may also be reported to the facility more generally, for example for later review.
[0079] Figure 6 shows an alternative approach to determine the zone on the body that has been hit by a projectile. As in figure 5, the steps outlined in dashed lines are carried out by the dedicated microcontroller and the steps outlined in solid lines are carried out by the target's main processing unit.
[0080] The process starts in the same way to detect a voltage spike on any detector above a threshold, signifying a projectile impact. However, rather than aggregate the sensor outputs to determine the most likely hit zone, this approach instead uses a multilateration technique to more precisely pinpoint the location of the hit from the sensor output signals.
[0081] This is achieved in the approach illustrated in figure 6 by, having identified a spike above the threshold, sampling all of the sensors at high frequency for a predetermined time period (X milliseconds). This fine-grained data is timestamped and sent to the target's main processing unit, where hit contours are calculated based on the sensor data and the known positions of the sensors on the outer shell of the target body.
[0082] In the next step, the calculated hit contours are combined with calibration data to determine a quantized location on the shell for the hit. The calibration data is retrieved from a calibration lookup table that is created in a calibration procedure carried out before the target is deployed.
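A minimal sketch of the multilateration step is given below, assuming known sensor positions, a uniform acoustic wave speed in the shell and a list of candidate quantized locations. In this sketch an analytic propagation model stands in for the measured calibration lookup table described above, so the numbers are illustrative only.

```python
import math

SENSORS = [(0.0, 0.0), (0.4, 0.0), (0.0, 0.8), (0.4, 0.8)]  # assumed shell coords, m
WAVE_SPEED = 1500.0  # assumed acoustic wave speed in the shell, m/s

def locate(arrival_times, grid):
    """Return the quantized grid location whose predicted arrival-time
    differences best match the measured ones (TDOA multilateration).

    `arrival_times[i]` is the timestamped first arrival at SENSORS[i];
    `grid` is a list of (x, y) candidate quantized locations.
    """
    t0 = min(arrival_times)
    measured = [t - t0 for t in arrival_times]

    def residual(point):
        delays = [math.dist(point, s) / WAVE_SPEED for s in SENSORS]
        base = min(delays)
        predicted = [d - base for d in delays]
        return sum((m - p) ** 2 for m, p in zip(measured, predicted))

    return min(grid, key=residual)
```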
[0083] The quantized location may, in some cases, be used directly by the main processing unit to determine subsequent actions. Alternatively, as shown in figure 6, there may be a subsequent step of determining the zone that has been hit, with the hit zone then being used to determine subsequent actions, as in the approach described above with reference to figure 5. In yet other cases, the hit zone may be derived directly from the calculated hit contours and appropriate calibration data.
[0084] To further enhance the realistic behaviour of the target, it can be assigned one of a number of distinct behavioural states, with the target actions being influenced, in part, by the current state of the target. The table below sets out a set of currently envisaged states that can be assigned to the target.
Table 1 - Exemplar Target Behavioural States

State              Definition
Idle               Target is doing nothing and unaware of any opposing force
Alert              Target is aware of movement or sound but not necessarily aware of opposing force
Stunned            Target has been flashbanged and is stunned for the next few seconds
Weakly Attacking   Target is partially recovered from flashbang and attempting to attack, poorly
Attacking          Target is acting aggressively (shouting, etc.) and/or firing Distractor
Subdued            Target has been injured (maimed) or has surrendered and is no longer attacking
Dead / Killed      Target has been killed

[0085] The target can transition between behavioural states based on domain sensor inputs or other factors, as will now be illustrated with reference to figure 7a.
[0086] The target will generally start in an 'idle' state. If the target's environmental sensors detect sound or movement above certain thresholds, the target's state will be changed from 'idle' to 'alert'.
The intention is to simulate a state in which a person has heard a sound or spotted some movement but is not yet sure whether any action is required in response. The target may also be controlled in this scenario to turn towards the direction of the movement or sound with appropriate operation of the movement actuators.
[0087] If no further sound or movement is detected for a period of time (for example 10 to 20 seconds), the target's state will be changed back to 'idle'. The target has, in effect, 'decided' that the previous noise or motion it detected does not represent a threat.
[0088] On the other hand, if a further noise or movement above a threshold (which may be the same or different from the initial threshold) is detected by the environmental sensors, the target's state is escalated to 'attacking'. In this state, as noted in the table above, the target may be controlled to perform actions including firing in the direction from which the sound has been detected or the movement seen, and shouting or screeching. The target will also continue to track the noise and movement, turning to face the direction they are coming from.
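The idle, alert and attacking transitions of paragraphs [0086] to [0088] can be summarised as a small state machine. The following Python sketch is illustrative; the stimulus threshold is an assumption and the timeout is simply picked from the 10 to 20 second range mentioned above.

```python
import enum

class State(enum.Enum):
    IDLE = "idle"
    ALERT = "alert"
    ATTACKING = "attacking"

STIMULUS_THRESHOLD = 0.6  # assumed normalised sound/movement level
ALERT_TIMEOUT_S = 15.0    # assumed; the text suggests 10 to 20 seconds

def step(state, stimulus, alerted_at, now):
    """Advance the escalation machine for one sensor reading; returns
    the new state and the time at which the alert state was entered."""
    if state is State.IDLE and stimulus > STIMULUS_THRESHOLD:
        return State.ALERT, now          # sound/movement: idle -> alert
    if state is State.ALERT:
        if stimulus > STIMULUS_THRESHOLD:
            return State.ATTACKING, now  # further stimulus: escalate
        if now - alerted_at > ALERT_TIMEOUT_S:
            return State.IDLE, now       # quiet period: stand down
    return state, alerted_at
```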
[0089] When in the 'idle' or 'alert' state (and also potentially when in the 'attacking' state), in the event of a flashbang detonation the target's state will be changed to 'stunned'. This simulates the effect of a flashbang on a human combatant. The target may detect the detonation of a flashbang, for example, based on the combination of the camera detecting light above a predetermined intensity at the same time as the microphones detecting sound in a predetermined frequency range associated with the detonation of a flashbang. In this 'stunned' state, the target will be non-responsive to further sensor inputs. It will not, for example, move to the 'attacking' state and may, for example, cease to track motion and sound in its vicinity.
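An illustrative sketch of the flashbang test described above follows: bright light coinciding with sound energy concentrated in a characteristic band. The thresholds and the frequency band are assumptions for the example, not values specified by the embodiments.

```python
import numpy as np

LIGHT_THRESHOLD = 0.9       # assumed normalised image intensity
BAND_HZ = (2000.0, 8000.0)  # assumed characteristic frequency band
BAND_ENERGY_FRACTION = 0.5  # assumed trigger level

def flashbang_detected(mean_intensity, audio, sample_rate):
    """True when bright light coincides with in-band sound energy."""
    if mean_intensity < LIGHT_THRESHOLD:
        return False
    spectrum = np.abs(np.fft.rfft(audio)) ** 2
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / sample_rate)
    in_band = (freqs >= BAND_HZ[0]) & (freqs <= BAND_HZ[1])
    fraction = spectrum[in_band].sum() / max(spectrum.sum(), 1e-12)
    return fraction > BAND_ENERGY_FRACTION
```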
[0090] After a predetermined initial recovery period following transition to the 'stunned' state, determined by an initial recovery timer, the target's state will be changed to 'weakly attacking'. In this state the target will start to respond to environmental sensor inputs but its actions will be compromised. For example, the ability of the target to turn towards the direction of a detected sound or movement may be limited, or the sound and movement levels the target will respond to may be higher than in the 'alert' or 'attacking' states. In this 'weakly attacking' state the target may fire projectiles, but the target's aim will be controlled to be less accurate than when in the 'attacking' state.
[0091] After a further period of time, determined by a full recovery timer, the target will be set to the 'attacking' state.
[0092] From any of the states discussed above, a projectile hit on the target, detected in the manner discussed above, may cause the target to move from its current state to a 'subdued' state or a 'killed' (or 'dead') state.
[0093] The appropriate change in state may be determined based on a categorisation of the hit zones on the target as, for example, a 'kill zone' or a 'maim zone'. A hit to a 'kill zone', for example a zone representing the target's head, will cause the target's state to be changed to 'killed', whereas a hit to a 'maim' zone would indicate that the target is badly injured but not dead, and it would be moved to the 'subdued' state.
[0094] Transition of the target to the 'killed' state can trigger the target to drop down the support post or to flip down, as described above.
[0095] In the 'subdued' state, the target would cease to attack and likely would play no further part in any ongoing exercise.
[0096] A target may also be moved to the 'subdued' state based on a surrender decision. This decision may, for example, be taken autonomously by the target (based on policies that determine the target's possible behaviours) or be triggered by an external input.
[0097] In some embodiments, the magnitude of the impact of the projectile on the outer shell of the target body may also be measured (based on the amplitude of the impact sensor signal, for example).
The actions of the target, including changes in target state, can then also be influenced by the magnitude of the impact. For example, an impact with a magnitude below a predefined threshold value could be determined to be "just a scratch" and not cause a change in state even if the hit is to a 'kill zone' or 'maim zone'. As another example, an impact to a 'kill zone' below a particular magnitude threshold could cause the target to move to a 'subdued' state or a 'weakly attacking' state, rather than a 'killed' state.
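A hedged sketch of these hit-handling rules is given below, combining the category of the hit zone with the measured impact magnitude. The zone-to-category mapping and both thresholds are assumptions made for the example.

```python
import enum

class HitState(enum.Enum):
    SUBDUED = "subdued"
    KILLED = "killed"

SCRATCH_THRESHOLD = 0.1    # assumed: weaker impacts are "just a scratch"
HARD_KILL_THRESHOLD = 0.4  # assumed: kill-zone hits need this magnitude

ZONE_CATEGORY = {1: "kill", 2: "maim", 3: "kill", 4: "maim"}  # assumed mapping

def state_after_hit(current, zone, magnitude):
    """Return the target's next state after a projectile hit."""
    if magnitude < SCRATCH_THRESHOLD:
        return current  # too weak to change state, even in a kill zone
    category = ZONE_CATEGORY.get(zone)
    if category == "kill":
        # a weak kill-zone hit only subdues; a solid one kills
        return HitState.KILLED if magnitude >= HARD_KILL_THRESHOLD else HitState.SUBDUED
    if category == "maim":
        return HitState.SUBDUED
    return current
```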
[0098] As the skilled person will appreciate, this combination of detection of the location (or zone) at which a projectile impacts a target, along with the ability to use behavioural states to influence the actions of a target, provides for much more realistic target behaviour than has been previously possible.
[0099] In some embodiments, the target may be configured to implement behaviours/actions autonomously, for example based on one or more policies determining how it should behave in different scenarios. The policies can determine the autonomous behaviour of the target based, for example, on the current behavioural state of the target, as well as on a combination of sensor signals. For example, instead of receiving an instruction over the network from an external controller (e.g. a network computer / server), a target may "decide" on an action itself in response to a potential threat, for example to shout for help or to crouch and remain quiet. Alternatively, where the target is not autonomous, or in addition to certain autonomous behaviour, actions of the target may be initiated by input received from an external controller. In some cases, targets may be semi-autonomous, where a policy is provisioned by the external controller but decisions are taken by the target. In other words, the target may be configured based on instructions from the server, or it could be entirely server driven, with the target behaving like a "dumb" set of sensors and actuators.
[00100] Targets in accordance with embodiments of the invention may be deployed, for instance, in training facilities that simulate urban and other close quarters environments for use in close quarters combat training.
[00101] The targets may include geolocation technology so that their placement, e.g. within a training facility, can be tracked.
[00102] Typically, multiple targets will be used in a single facility and may be controlled together from a control device (e.g. a network computer) that communicates with the targets via a communications network (which may be wireless or wired or both).
[00103] The targets described and exemplified above, as well as other targets in accordance with embodiments of the present invention, can capture the data generated by the sensors and actuators, including for example recording the times and locations of projectile impacts, recording images captured by the camera(s), recording sounds detected by the microphone(s), recording detected motion, recording the presence of specific objects in the vicinity of the target and recording the actions of the target. The captured data can be stored locally by the target and/or transmitted to another device, for example over a wired or wireless network. The captured data is also time-stamped, so that it can, for example, be synchronised with data collected by other devices in the training facility.
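As a sketch of the local recording option, the following assumes a simple JSON-lines log and a minimal event record; neither the fields nor the format is prescribed by the embodiments, only that events carry timestamps for later synchronisation.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class TargetEvent:
    timestamp: float   # seconds since the epoch, for cross-device sync
    kind: str          # e.g. "hit", "sound", "motion", "action" (assumed)
    detail: dict       # free-form payload, e.g. zone and magnitude

def record(event: TargetEvent, path: str = "target_log.jsonl") -> None:
    """Append one event to a local JSON-lines log for later upload."""
    with open(path, "a") as log:
        log.write(json.dumps(asdict(event)) + "\n")

# Example: log a projectile hit for later synchronised analysis.
record(TargetEvent(time.time(), "hit", {"zone": 1, "magnitude": 0.7}))
```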
[00104] Data collected in this way can be used for a number of different purposes, both in real time and later in time. Uses can include, for example, simulation, performance analysis and the provision of inputs for the control of other targets (or other devices) within the training environment or facility.
[00105] For example, in the case where a smart target in accordance with an embodiment of the invention is deployed in a training facility in which virtual and real events take place contemporaneously, the collected data can be used in real time to update the synthetic (virtual) representation of the live environment. In other words, when the target moves, shoots or shouts, its avatar (or 'virtual twin') in the virtual environment will behave in a corresponding way. Similarly, when the target records a hit, the virtual twin will be hit in the relevant place and an observer in the virtual environment will see a wound appear on the virtual twin.
[00106] The recorded data can be later used, for example, to generate virtual replays of the live action as well as in other ways for performance analysis.
[00107] For instance, the performance of personnel (e.g. military troops) using the facility may be analysed using the data collected by the target in conjunction with other data, for example data identifying the location of troops and the timing of the discharge of their weapons. As just one example, projectile impacts on the target could be assigned to a specific person based on the weapon discharge timing and location data in conjunction with the time recorded by the target for the impact of the projectile. This would enable the individual's accuracy to be assessed both for a specific scenario and over time.
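The attribution idea might be sketched as follows, pairing each impact timestamp with the weapon discharge whose time, plus an assumed projectile flight time, best matches it. The data shapes and both timing parameters are assumptions for the example.

```python
def attribute_hits(impacts, discharges, flight_time_s=0.15, tol_s=0.05):
    """Map impact timestamps to the most plausible shooter.

    `impacts` is a list of impact timestamps recorded by the target;
    `discharges` is a list of (shooter_id, discharge_timestamp) pairs.
    Both the nominal flight time and the matching tolerance are assumed.
    """
    assignments = {}
    for t_hit in impacts:
        best = min(
            discharges,
            key=lambda d: abs(t_hit - (d[1] + flight_time_s)),
            default=None,
        )
        if best and abs(t_hit - (best[1] + flight_time_s)) <= tol_s:
            assignments[t_hit] = best[0]  # credit the nearest discharge
    return assignments
```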
[00108] The skilled person will appreciate that the targets described herein could be used in other types of facility, for example gaming facilities.
[00109] It will be understood that the above description of preferred embodiments is given by way of example only and that various modifications may be made by those skilled in the art. What has been described above includes examples of one or more embodiments. It is, of course, not possible to describe every conceivable modification and alteration of the above devices or methods for purposes of describing the aforementioned aspects, but one of ordinary skill in the art can recognize that many further modifications and permutations of various aspects are possible. Accordingly, the described aspects are intended to embrace all such alterations, modifications, and variations that fall within the spirit and scope of the appended claims.

Claims (21)

  1. A target comprising one or more domain sensors and one or more domain actuators, the domain actuators controlling actions of the target; the target having a current behavioural state, selected from a plurality of possible behavioural states; and the one or more domain actuators controlling the actions of the target at least in part based on the current behavioural state of the target.
  2. A target according to claim 1, wherein the current behavioural state of the target is selected based at least in part on one or more outputs from one or more of the domain sensors.
  3. A target according to claim 1 or claim 2, wherein the target moves from one state to another state based at least in part on one or more of the outputs from the one or more domain sensors.
  4. A target according to any one of the preceding claims, wherein the target moves from one state to another based at least in part on an elapsed period of time.
  5. A target according to any one of the preceding claims, wherein the target moves from one state to another based on an external input.
  6. A target according to any one of the preceding claims, wherein the domain actuators are controlled based at least in part on one or more outputs from the one or more domain sensors.
  7. A target according to claim 6, wherein the control of the domain actuators in response to the domain sensor outputs is modified based on the current state of the target.
  8. A target according to any one of the preceding claims, wherein the possible target states include any one or more of idle; alert; attacking; stunned; weakly attacking; subdued; and dead.
  9. A target according to claim 8, wherein the target moves from an idle state to an alert state based on the domain sensors having detected sound or movement.
  10. A target according to claim 9, wherein the target moves from the alert state back to the idle state after a predetermined period of time has elapsed in the alert state.
  11. A target according to claim 9, wherein the target moves from an alert state to an attacking state based on the domain sensors having detected further movement or further sound.
  12. A target according to any one of claims 8 to 11, wherein the target moves to a stunned state based on the domain sensors having detected a flashbang detonation in the proximity of the target.
  13. A target according to claim 12, wherein the flashbang detonation is detected based on a combination of outputs from the domain sensors, including one or more domain sensors detecting light above a predetermined intensity at the same time as one or more domain sensors detecting sound in a predetermined frequency range associated with the detonation of a flashbang.
  14. A target according to claim 12 or claim 13, wherein the target moves from the stunned state to a weakly attacking state after a predefined period of time has elapsed from detection of the flashbang condition.
  15. A target according to any one of claims 8 to 14, wherein the target moves to a subdued state based at least in part on one or more of the domain sensors detecting an impact from a projectile to a maim zone on the target.
  16. A target according to any one of claims 8 to 15, wherein the target moves to a subdued state based at least in part on a surrender decision.
  17. A target according to claim 16, wherein the surrender decision is made by the target autonomously.
  18. A target according to any one of claims 8 to 17, wherein the target moves to a killed state based at least in part on one or more of the domain sensors detecting an impact from a projectile to a kill zone on the target.
  19. A target according to any one of the preceding claims, wherein the domain sensors include any one or more of: sensors for detecting an impact of a projectile on the target; sensors for detecting sound in the proximity of the target; sensors for detecting motion in the proximity of the target; sensors for detecting objects in the proximity of the target; and sensors for capturing images of the target surroundings.
  20. A target according to any one of the preceding claims, wherein the actions of the target controlled by the domain actuators include any one or more of: movement of the target; sound emission from the target; and firing ammunition from the target.
  21. A target according to any one of the preceding claims, wherein sensor outputs and/or actuator actions are recorded in the target for later retrieval and/or sent to an external server.
GB2112974.7A 2021-09-10 2021-09-10 Targets Pending GB2617532A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB2112974.7A GB2617532A (en) 2021-09-10 2021-09-10 Targets
PCT/IB2022/058517 WO2023037312A1 (en) 2021-09-10 2022-09-09 Targets

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2112974.7A GB2617532A (en) 2021-09-10 2021-09-10 Targets

Publications (2)

Publication Number Publication Date
GB202112974D0 GB202112974D0 (en) 2021-10-27
GB2617532A true GB2617532A (en) 2023-10-18

Family

ID=78149395

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2112974.7A Pending GB2617532A (en) 2021-09-10 2021-09-10 Targets

Country Status (1)

Country Link
GB (1) GB2617532A (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120171644A1 (en) * 2009-09-23 2012-07-05 Marathon Robotics Pty Ltd Methods and systems for use in training armed personnel
US20120091660A1 (en) * 2010-10-18 2012-04-19 Lockheed Martin Corporation Target impact-point sensing system
US10527392B2 (en) * 2016-09-27 2020-01-07 Tactical Trim E.K. Target

Also Published As

Publication number Publication date
GB202112974D0 (en) 2021-10-27

Similar Documents

Publication Publication Date Title
US11112204B2 (en) Firearm simulators
JP6646792B2 (en) target
US5823779A (en) Electronically controlled weapons range with return fire
US4934937A (en) Combat training system and apparatus
CN107121019A (en) A kind of group's confrontation fire training system
US10234247B2 (en) Projectile weapon training apparatus using visual display to determine targeting, accuracy, and/or reaction timing
CN106225556A (en) A kind of many people shot strategy training system followed the tracks of based on exact position
WO2005026643A2 (en) Archery laser training system and method of simulating weapon operation
US8777226B1 (en) Proxy target system
GB2617532A (en) Targets
GB2610615A (en) Targets
WO2023037312A1 (en) Targets
CN206989824U (en) A kind of intelligent mobile target assembly
US20020173940A1 (en) Method and apparatus for a simulated stalking system
JP2007247939A (en) Target system
US8894412B1 (en) System and method for mechanically activated laser
JP2001124497A (en) Shooting training device
CN110145959A (en) A kind of supersonic image fire training system
US20240087469A1 (en) System for behaviour monitoring
US20240054260A1 (en) System for behaviour monitoring
US20210372729A1 (en) Apparatus and Method for Measuring and Training Movement for Combat Firearms Performance
KR200325705Y1 (en) Digital movable target assembly for battle practice
CN116798288A (en) Sentry terminal simulator and military duty training assessment simulation equipment
WO2023154027A2 (en) Shooting range system having blank cartridge and blank trigger with laser image processing
CN105403099A (en) Actual combat shooting training system