WO2020167530A1 - Vehicle-mounted devices with network-connected scopes for allowing a target to be simultaneously tracked by multiple other devices - Google Patents

Vehicle-mounted devices with network-connected scopes for allowing a target to be simultaneously tracked by multiple other devices

Info

Publication number
WO2020167530A1
WO2020167530A1 (PCT/US2020/016619, US2020016619W)
Authority
WO
WIPO (PCT)
Prior art keywords
scope
target
target position
presumed
location
Prior art date
Application number
PCT/US2020/016619
Other languages
English (en)
Inventor
Douglas FOUGNIES
Robert A. PRESSMAN
Larry L. DAY
Taylor J. CARPENTER
Mark J. Howell
Original Assignee
Fougnies Douglas
Pressman Robert A
Day Larry L
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US16/272,733 (US10408573B1)
Application filed by Fougnies Douglas, Pressman Robert A, Day Larry L
Priority to KR1020217028143A (publication KR20210133972A)
Priority to EP20755165.6A (publication EP3924683A4)
Priority to CN202080013865.5A (publication CN113424012B)
Publication of WO2020167530A1

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41: WEAPONS
    • F41G: WEAPON SIGHTS; AIMING
    • F41G1/00: Sighting devices
    • F41G1/38: Telescopic sights specially adapted for smallarms or ordnance; Supports or mountings therefor
    • F41G3/00: Aiming or laying means
    • F41G3/02: Aiming or laying means using an independent line of sight
    • F41G3/04: Aiming or laying means for dispersing fire from a battery; for controlling spread of shots; for coordinating fire from spaced weapons
    • F41G3/06: Aiming or laying means with rangefinder
    • F41G3/14: Indirect aiming means
    • F41G3/145: Indirect aiming means using a target illuminator
    • F41G3/16: Sighting devices adapted for indirect laying of fire
    • F41G3/165: Sighting devices adapted for indirect laying of fire using a TV-monitor
    • F41G9/00: Systems for controlling missiles or projectiles, not provided for elsewhere
    • F41H: ARMOUR; ARMOURED TURRETS; ARMOURED OR ARMED VEHICLES; MEANS OF ATTACK OR DEFENCE, e.g. CAMOUFLAGE, IN GENERAL
    • F41H7/00: Armoured or armed vehicles
    • F41H7/005: Unmanned ground vehicles, i.e. robotic, remote controlled or autonomous, mobile platforms carrying equipment for performing a military or police role, e.g. weapon systems or reconnaissance sensors
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00: Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39: Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system, the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42: Determining position
    • G01S19/48: Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/49: Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an inertial position system, e.g. loosely-coupled
    • G01S19/53: Determining attitude
    • G01S5/00: Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/0009: Transmission of position information to remote stations
    • G01S5/0018: Transmission from mobile station to base station
    • G01S5/0027: Transmission from mobile station to base station of actual mobile position, i.e. position determined on mobile
    • G01S5/0045: Transmission from base station to mobile station
    • G01S5/0054: Transmission from base station to mobile station of actual mobile position, i.e. position calculation on base station

Definitions

  • A sight or optical viewer, which incorporates lenses to magnify an image or simply passes light through without magnification, also referred to as a "scope," is a sighting device that is based on an optical refracting telescope or other optical viewing device. It includes some form of graphic image pattern (a reticle or cross-hairs) mounted in an optically appropriate position in its optical system to give an accurate aiming point.
  • Telescopic sights are used with all types of systems that require accurate aiming but are most commonly found on firearms, particularly rifles.
  • a telescopic sight may include an integrated rangefinder (typically, a laser rangefinder) that measures distance from the observer’s sighting device to a target.
  • A compass is an instrument used for navigation and orientation that shows direction relative to the geographic "cardinal directions," or "points."
  • A“compass rose” diagram shows the directions north, south, east, and west as abbreviated initials marked on the compass.
  • the rose can be aligned with the corresponding geographic directions, so, for example, the "N" mark on the rose really points to the north.
  • angle markings in degrees may be shown on the compass. North corresponds to zero degrees, and the angles increase clockwise, so east is 90 degrees, south is 180, and west is 270. These numbers allow the compass to show azimuths or bearings, which are commonly stated in this notation.
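By way of illustration only (this snippet is not from the patent), converting locally measured east/north displacement components into this clockwise-from-north azimuth convention is a one-liner in Python:

```python
import math

def azimuth_deg(d_east, d_north):
    """Compass azimuth in degrees from displacement components:
    0 = north, 90 = east, 180 = south, 270 = west (clockwise)."""
    return math.degrees(math.atan2(d_east, d_north)) % 360.0
```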
  • GPS data typically provides a three-dimensional location (latitude, longitude, and altitude (elevation)). For example, a representative GPS reading for a location in Philadelphia is: latitude 39.9526° N, longitude 75.1652° W, altitude 12 meters.
  • Miniaturized GPS devices are known that include a GPS receiver for providing GPS location data and an orientation sensor for providing attitude data.
  • the orientation sensor may derive its data from an accelerometer and a geomagnetic field sensor, or another combination of sensors.
  • One such miniaturized GPS device that is suitable for use in the present invention is a device that is commercially available from Inertial Sense, LLC, located in Salem, Utah. This device is marketed as the "µINS" and "µINS-2" ("INS" is an industry abbreviation for "Inertial Navigation System").
  • The µINS and µINS-2 are GPS-aided Inertial Navigation Systems (GPS/INS).
  • GPS/INS uses GPS satellite signals to correct or calibrate a solution from an inertial navigation system (INS).
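As a rough illustration of that correction (a deliberately simplified, one-dimensional complementary filter; a real GPS/INS product typically runs a full Kalman-filter solution):

```python
def gps_ins_fuse(ins_pos, gps_pos, gps_gain=0.02):
    """One update step of a 1-D complementary filter: the drifting
    dead-reckoned INS position is pulled gently toward the GPS fix,
    cancelling accumulated drift without injecting all of the GPS noise.
    gps_gain is an illustrative tuning value, not a recommended one."""
    return ins_pos + gps_gain * (gps_pos - ins_pos)
```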
  • nodes can be formed into a network using a variety of network topologies including hub and spoke and mesh.
  • nodes communicate through one or more base stations which in turn are directly or indirectly connected to a mobile switching center (MSC).
  • MSCs are interconnected based on industry standards which enable nodes in a cellular network to communicate with other nodes that are connected to different base stations.
  • GSM: Global System for Mobile communications
  • LTE: Long Term Evolution
  • CDMA: Code Division Multiple Access
  • a common feature in cellular networks is the capability of allowing nodes to connect to the Internet.
  • Wireless Local Area Network (WLAN) technology allows nodes to establish a network.
  • WLAN standards include 802.11a, b, g and n.
  • 802.11s is a Wi-Fi-based mesh networking standard.
  • Bluetooth® is another standard for connecting nodes in a network, and mesh networking capability has recently been added to the Bluetooth LE standard by the Bluetooth Special Interest Group. Accordingly, through various standards, it is possible to implement point-to-point, point-to-multipoint and mesh WLANs, all of which are suitable for use with the present invention.
  • Mesh network topology has significant advantages for mobile devices, particularly in remote areas where there is limited cellular service since each node can be connected to multiple other nodes and there is no required path from any node in the network to any other node.
  • a further advantage of a mesh network is that as long as any one node in the mesh network has access to the Internet such as by way of a cellular or satellite connection, all of the nodes in the mesh network have access.
  • A representative wireless mesh networking chipset that is suitable for use with the present invention is the RC17xx(HP)™ (Tinymesh™ RF Transceiver Module), which is commercially available from Radiocrafts AS and Tinymesh, both located in Norway.
  • the chipset incorporates the Tinymesh application for the creation of mesh networks.
  • The ideal mesh network chipset for the present invention is small, has high power and long range, and operates in unlicensed spectrum.
  • a network of scopes including one or more lead scopes and one or more follower scopes, is provided to allow scope operators of the respective scopes to track the same presumed target.
  • a lead scope locates a target and communicates target position data of the presumed target to the follower scope.
  • the follower scope uses the target position data and its own position data to electronically generate indicators that prompt the operator of the follower scope to make position movements, so as to re-position the follower scope from its current target position toward the target position defined by the target position data received from the lead scope.
  • a network of scopes including one or more lead scopes and one or more follower scopes, is provided to allow the respective scopes to track the same presumed target.
  • a lead scope locates a target and communicates target position data of the presumed target to the follower scope.
  • the follower scope uses the target position data and its own position data to electronically generate indicators that allow the follower scope to make position movements, so as to re-position the follower scope from its current target position toward the target position defined by the target position data received from the lead scope (a sketch of such indicator generation appears after this list).
  • At least the second scope is mounted to, or integrated into, a vehicle, which uses the target position data to move to a new position so as to allow the second scope to better view the target.
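As a sketch only (function names and the half-degree tolerance are invented; the patent does not specify this logic), the follower scope's indicator generation might look like:

```python
def aiming_indicators(cur_az, cur_el, tgt_az, tgt_el, tol_deg=0.5):
    """Turn the difference between the follower scope's current
    azimuth/elevation and the bearing/elevation toward the received
    target position (all in degrees) into directional prompts."""
    prompts = []
    d_az = (tgt_az - cur_az + 180.0) % 360.0 - 180.0  # shortest turn
    if d_az > tol_deg:
        prompts.append("MOVE RIGHT")
    elif d_az < -tol_deg:
        prompts.append("MOVE LEFT")
    d_el = tgt_el - cur_el
    if d_el > tol_deg:
        prompts.append("MOVE UP")
    elif d_el < -tol_deg:
        prompts.append("MOVE DOWN")
    return prompts or ["ON TARGET"]
```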
  • Figures 1A, 1B, 2 and 3 are schematic diagrams of system components in accordance with preferred embodiments of the present invention.
  • Figures 4A-4C are optical sights in accordance with preferred embodiments of the present invention.
  • Figure 5 shows a sample preset list that may be displayed on a display of a scope in accordance with one preferred embodiment of the present invention.
  • Figure 9A is a schematic diagram of a surveillance environment having a plurality of scopes, some of which are vehicle-based.
  • Figure 9B is a schematic diagram of a vehicle having vehicle-based devices in the surveillance environment of Figure 9A.
  • Figure 10 is a flowchart in accordance with another preferred embodiment of the present invention.
  • Figures 11A-11D show surveillance environments having a plurality of scopes and a presumed target in accordance with preferred embodiments of the present invention.
  • Preferred embodiments of the present invention provide for devices having network-connected scopes which are designed to hone in on the same target, which may be a still or moving target.
  • A "lead scope" identifies a target and communicates location data regarding the target to a "follower scope," which uses the location data from the lead scope and its own location and orientation data to hone in on the target.
  • the lead scope and the follower scope communicate through any available wireless data communication technology including cellular, satellite or one or more WLAN technologies.
  • a first scope identifies a target and communicates location data regarding the target to a plurality of other scopes which use the location data from the first scope and their respective location and orientation data to hone in on the target.
  • As additional scopes locate the target, they communicate their location data regarding the target to a network server, which amalgamates the location data accumulated from each scope that identified the target to define successively more precise location data of the target (i.e., more data points increase the precision of the location), which is then communicated to the scopes that have not yet located the target.
  • the scopes that have previously reported the location of the target may also receive the latest location data of the target to assist in tracking the target.
  • the scopes in this embodiment can be connected using any available WLAN technology but in the preferred embodiment, a mesh networking technology is used to enable the plurality of scopes to communicate with each other. It is understood that any one of the scopes can perform the functions of the network server or the functions of the network server can be distributed among the plurality of scopes for redundancy in case one of the scopes loses connectivity to the WLAN. Ideally, at least one of the scopes is connected to the Internet and the other scopes in the network are thus able to access the Internet through the connected scope by way of the mesh network.
  • the target may be a moving object
  • the location data of the target from the scopes that have identified the target is continuously streamed to the scopes that have not yet located the target.
  • the location of the target is only sent when the lead scope activates a switch that designates the target.
  • the scope and/or the network server will predict the future location of the target, assuming continued movement in the same direction, using known techniques (one such technique, straight-line extrapolation, is sketched below).
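A minimal sketch of straight-line extrapolation from the two most recent observations (names and coordinate frame are illustrative assumptions, not the patent's):

```python
def predict_position(p1, t1, p2, t2, t_future):
    """Predict a moving target's (x, y) at t_future from positions p1
    at time t1 and p2 at time t2 (t2 > t1), assuming the target keeps
    the same speed and heading. Coordinates are in a local frame,
    e.g. metres east/north of a reference point."""
    vx = (p2[0] - p1[0]) / (t2 - t1)
    vy = (p2[1] - p1[1]) / (t2 - t1)
    dt = t_future - t2
    return (p2[0] + vx * dt, p2[1] + vy * dt)
```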
  • device: The device is the object that a scope is integrated into. Examples of such devices include a rifle, gun, binoculars, smart eyeglasses or goggles, a helmet visor and a drone. Certain types of devices are themselves "scopes," such as binoculars, telescopes and spotting scopes.
  • the device may be handheld or may be mounted on a land, aerial or water based vehicle.
  • target: The target is the object of interest. It may be a person, animal or object, and may either be stationary or moving.
  • lead scope: The lead scope is the first scope that identifies a target. In the first embodiment, there is only one lead scope. In the second embodiment, the lead scope is only the first scope that located the target.
  • any scope in the network can function as a lead scope.
  • follower scope: The follower scope is a scope that attempts to hone in on the same target that the lead scope identified. In the first embodiment, there may be one or more follower scopes. In the second embodiment, the follower scopes include all of the scopes that have not yet honed in on the target that the previous set of scopes (including the lead scope) has identified. In one preferred embodiment, any scope in the network can function as a follower scope.
  • scopes can function as either a lead or follower scope.
  • certain scopes may be dedicated to a lead or follower role, and certain scopes may have more or less functionality than other scopes.
  • a device having a scope includes each of the following measurement devices (or their equivalents):
  • GPS/INS device: provides location data of the device (could be implemented as two or more distinct devices, such as a GPS receiver, gyroscope and accelerometer)
  • compass: provides the direction of the target relative to the position of the scope (north, south, east, and west)
  • The compass may be a standalone device, or may be incorporated into the GPS/INS and determine direction using GPS compassing. GPS compasses often have two antennas, and if the device is a pair of binoculars, one option would be to place an antenna on each barrel. Accuracy can be increased by increasing the separation of the antennas used by the GPS compass, such as through the use of one or more fold-out arms, booms, lighter-than-air balloons or other mechanical means to obtain separation, or by connecting a second antenna through an RF or optical connection.
  • Data from these measurement devices are used to calculate the position of the target, which may be expressed in GPS coordinates or the like (see the sketch below).
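For illustration (this is a standard flat-earth approximation, not text from the patent): given the scope's GPS fix, the compass bearing to the target, the inclination above the horizon from the orientation sensor, and the rangefinder distance, the target's coordinates can be estimated as follows.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def target_position(lat, lon, alt, bearing_deg, incl_deg, range_m):
    """Estimate target (lat, lon, alt) from the scope's own GPS fix.

    bearing_deg: compass bearing to the target, clockwise from north
    incl_deg   : inclination of the sight line above the horizon
    range_m    : rangefinder distance to the target
    Flat-earth approximation; adequate at rangefinder distances."""
    horiz = range_m * math.cos(math.radians(incl_deg))   # ground distance
    d_north = horiz * math.cos(math.radians(bearing_deg))
    d_east = horiz * math.sin(math.radians(bearing_deg))
    d_up = range_m * math.sin(math.radians(incl_deg))
    tgt_lat = lat + math.degrees(d_north / EARTH_RADIUS_M)
    tgt_lon = lon + math.degrees(
        d_east / (EARTH_RADIUS_M * math.cos(math.radians(lat))))
    return tgt_lat, tgt_lon, alt + d_up
```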
  • the operator of a device that contains the lead scope identifies a presumed target.
  • the rangefinder is activated and the data from the measurement devices is stored in memory.
  • If the system includes a network server and the network server receives the raw data from the measurement devices transmitted by the lead scope, it calculates the target position and stores the data. If the network server instead receives the calculated target position, it stores this data and forwards it to other scopes. It is understood that the system can be operated without a network server, and that the features described as being performed by the network server could be performed by any scope or device in the network, or by a remote network server to which the scopes are connected via the Internet.
  • the device containing the follower scope also includes the same set of measurement devices (or their equivalents).
  • the follower scope uses its own location data and the target position to calculate the bearing and attitude where the follower scope should aim so as to be pointing at the same target position as the lead scope.
  • the follower scope could include a reduced set of measurement devices and operate with reduced functionality. For example, if the rangefinder was not included in the follower scope, it would have limited functionality as a lead scope.
  • Additional options include the ability of the lead scope to capture a digital image of the target using a digital image sensor incorporated into or attached onto the scope and to transmit the digital image to the follower scope, so that the operator of the follower scope knows what to look for.
  • a further option would be for the follower scope to signal back to the lead scope that it sees the target and to transmit a digital image of its view of the target. Capturing a digital image of the target could have unique applications in military and law enforcement. For example, if at least one of the scopes is connected to the Internet and the digital image is a human face, the digital image could be transmitted through the Internet to a database that would attempt to match the face using facial recognition. If a match is identified, additional information about the target could be provided to each of the scopes.
  • Other biometric measures can be captured and transmitted, such as gait and facial blood-vessel patterns, which, when used with a thermal imager, can form a digital fingerprint of a human face.
  • Figure 1A shows a system view wherein a plurality of devices 10 (device1-devicen) and non-device/non-scope nodes 12 (node1-noden) are in communication with a network server 16 via wireless communication and an electronic network 18.
  • the electronic network 18 is represented by the solid lines connecting the devices 10 to the network server 16.
  • the electronic network 18 may be implemented via any suitable type of wireless electronic network (e.g., local area network, wide area network (the Internet)).
  • The functions of the one or more non-device/non-scope nodes 12 (node1-noden) are described below.
  • at least the network server 16 is connected to the Internet 20.
  • Figure 1B shows the topology of a mesh network 22 that is suitable for use in preferred embodiments of the present invention.
  • The plurality of devices 10 and the network server 16 are nodes 24 in the mesh network 22, and thus these elements are labeled as nodes 24 in Figure 1B.
  • The nodes 24 are each capable of communicating with one another via the mesh network 22.
  • either the network server 16 becomes another node 24 in the mesh network 22, or there is no network server 16, or one or more of the device scopes perform functions herein described as being performed by the network server 16.
  • At least one of the nodes 24 is connected to the Internet 20. Additionally, there may be one or more nodes 26 that are outside of the mesh network 22, but which can communicate with the nodes 24 of the mesh network 22, such as via the Internet 20.
  • devices/nodes may connect to the network in different fashions.
  • five of the nodes could be in range of the mesh network 22.
  • the sixth node could be out of range and connected to the network by a cellular or network signal via the Internet 20.
  • orientation sensor 38 (attitude)
  • scope 42 (the structure of the scope will depend upon the type of device)
  • audiovisual display device 44 (which can be either standalone or integrated into the scope)
  • network interface 46 in communication with a wired or wireless communication transceiver 48
  • the audiovisual display device 44 is the element that provides prompts/messages and indicators to the user. In follower scopes, information provided by the audiovisual display device 44 assists the user in honing in on the target. Depending upon the type of device 10 and the environment that the device 10 is used in, there may be only video, only audio, or both audio and video provided by the audiovisual display device 44.
  • Figure 3 shows elements of the network server 16, including a processor 52, memory 54, image analysis and manipulation software (IAMS) 56, which can be implemented using artificial intelligence software, and a network interface 58 in communication with a wired or wireless communication transceiver 60.
  • IAMS: image analysis and manipulation software
  • The processor functions of the individual devices 10 and the network server 16 depend upon the system architecture and the distribution of computing functions. As described herein, some of these functions can be performed at either processor 30 or 52, whereas other functions may be performed by the network server's processor 52.
  • Figures 4A-4C each show an optical sight (scope) for a rifle having an integrated audiovisual display device.
  • the display device is located at the zero-degree position and presently reads "MOVE LEFT."
  • the display device has four separate areas, at zero, 90, 180 and 270 degrees.
  • The display device in Figure 4B is currently indicating to move left (the left arrow at 270 degrees is on, indicated by a solid line, whereas the other three arrows, for up, right and down, are off, as indicated by dashed lines).
  • Figure 4C is similar to Figure 4A, except that it includes an additional display element that shows the image that the user should be trying to locate. The direction prompts in these figures indicate that this rifle is presently functioning as a follower scope.

III. ADDITIONAL CONSIDERATIONS
  • When calculating a presumed target position from GPS data and the other measurement devices, there are known, quantifiable errors introduced by the lead scope and follower scope(s), which can be represented by discrete values (e.g., +/- 20 cm). Certain types of errors will be consistent from scope to scope based on inherent limitations of the measurement devices. Other types of errors may depend upon signal strength, such as the strength of a GPS signal or the number of satellites used to calculate the position of the lead scope. For each calculated target position, the lead scope, follower scope and/or network server identifies the error value. When amalgamating and accumulating target positions from multiple scopes to calculate an updated target position, the error values may be used to weight the strength given to each target position.
  • target positions with the lowest error values may be more highly weighted.
  • A target position with a very high error value compared to other target position error values may be deleted from the calculation.
  • One way to use the additional data to more accurately predict the position of the target would be to place points representing each estimated target position on a three-dimensional grid and estimate the center point or average location of the data representing the estimated targets. The center point can be adjusted based on weighting as discussed above.
  • a temporal factor may be used. For example, the most recently observed target positions may be given greater weighting. Certain target positions may be eliminated entirely from the weighting after a predetermined period of time has passed from the observation time.
  • the temporal factor may also be affected by the nature of the target for embodiments where the type of target is determined (e.g., car, person, deer) by the IAMS and/or by the scope.
  • The temporal factor is likely to be more important for fast-moving targets compared to slow-moving targets.
  • For a fast-moving target (e.g., a car), the most recently observed target positions may be given significantly greater weighting, and older target positions would likely be eliminated more quickly from the weighting compared to slower-moving targets.
  • the IAMS may also use various algorithms to determine if the target is actually moving, and if so, at what speed. This calculation may then be used for the temporal factor. For example, if a target appears to be stationary, then no temporal factor will be applied to the weightings. The algorithm may look at multiple observed target positions and if they are relatively similar after factoring in their respective error values, and were observed at significantly different time intervals (i.e., not very close in time), it can be concluded that the target is stationary.
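The patent does not prescribe a formula, but one plausible sketch of the weighting described above combines inverse-square error weighting with an exponential age decay, and drops stale observations entirely (all constants are illustrative assumptions):

```python
import math
import time

def amalgamate(observations, now=None, max_age_s=300.0, decay_s=60.0):
    """Weighted centroid of target positions reported by multiple scopes.

    observations: list of dicts with keys
        'pos'  : (x, y, z) estimated target position (local metres)
        'err'  : error value in metres (e.g. 0.2 for +/- 20 cm)
        'time' : observation timestamp in seconds
    Lower-error positions weigh more (1/err^2); older positions decay
    exponentially (the temporal factor) and are eliminated entirely
    after max_age_s."""
    now = time.time() if now is None else now
    wx = wy = wz = wsum = 0.0
    for ob in observations:
        age = now - ob['time']
        if age > max_age_s:
            continue                    # dropped after the time limit
        w = math.exp(-age / decay_s) / ob['err'] ** 2
        x, y, z = ob['pos']
        wx, wy, wz, wsum = wx + w * x, wy + w * y, wz + w * z, wsum + w
    if wsum == 0.0:
        raise ValueError("no usable observations")
    return (wx / wsum, wy / wsum, wz / wsum)
```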
  • the visual indicator visually communicates the error information in a form that is useful to the device operator.
  • An error box may be overlaid around the dot so that the operator of the device knows that the target may be in any of the areas within the error box, and is not necessarily exactly where the dot is showing.
  • the error box presumably becomes smaller as more target positions are identified by a succession of follower scopes.
  • the exact manner in which the error information is communicated depends upon how the presumed target position is displayed on a follower device.
  • the target is represented by a one-dimensional object on a display screen, such as a dot.
  • the target is represented by a simulated two-dimensional or three-dimensional image on the display screen. If a digital image is captured and transmitted, the actual image of the target may be displayed on the screen.
  • IAMS: image analysis and manipulation software
  • AI: artificial intelligence
  • the simulation process allows for the target to be rotated so that it appears properly positioned with respect to the follower scope.
  • a lead scope identifies a deer (target) that is a quarter-mile away and is facing the device head-on.
  • the target position of the deer and a physical image of the deer is captured by the scope and communicated to the network server.
  • The IAMS, in the network server or remotely accessed via the Internet, identifies key visual features within the image, compares these features with known objects to categorize the target as a front view of a deer, and retrieves a simulated image of a deer from its database.
  • a follower scope receives target position data regarding the deer, and it is determined that the follower scope is also about a quarter-mile from the deer but is 90 degrees off compared to the lead scope.
  • The IAMS can then rotate the simulated deer by 90 degrees and communicate a side view of the deer for display on the follower scope, so that the operator of the follower scope knows what the deer is likely to look like (the relative viewing angle that drives this rotation is sketched below).
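The rotation angle itself follows from geometry; a minimal sketch (invented names; positions expressed in a shared local frame):

```python
import math

def relative_view_angle(lead_pos, follower_pos, target_pos):
    """Degrees by which the follower's view of the target is rotated
    relative to the lead scope's view. Each position is (x, y) in a
    common local frame, e.g. metres east/north."""
    a_lead = math.atan2(lead_pos[0] - target_pos[0],
                        lead_pos[1] - target_pos[1])
    a_fol = math.atan2(follower_pos[0] - target_pos[0],
                       follower_pos[1] - target_pos[1])
    deg = math.degrees(a_fol - a_lead)
    return (deg + 180.0) % 360.0 - 180.0    # normalise to -180..180
```

In the deer example above, two scopes a quarter-mile from the target on perpendicular sight lines yield +/-90 degrees, which is exactly the rotation the IAMS applies to the simulated image.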
  • the IAMS could attempt to match the target image to a person using facial recognition or other biometric techniques. If there is a match, information about the target could be returned to the scopes.
  • a further application of an image display system incorporated into the scopes would be the ability of the follower scope to retrieve a high-resolution aerial image or topographical map and display the aerial image or map on the display of the follower scope together with some indicia of the approximate location of the target. If error information is known, a box can be displayed on the aerial image or topographical map showing the area in which the target may be located.
  • one embodiment includes a transparent display overlay that is activated to highlight a target in a particular color or draw a box around the target. If the follower scope has a visual display, the matched target is designated as described above.
  • a lead scope identifies a deer (target) that is a quarter-mile away and is facing the device head-on.
  • the target position of the deer and a physical image of the deer is captured by the scope and communicated to the network server.
  • the IAMS in the network server or remotely accessed via the Internet uses computer vision techniques to segment the image, separating the target from the background image.
  • the IAMS generates a set of key identifiable features within the image segment, such as the points on the deer’s antlers and a white patch on its side.
  • The IAMS performs pattern matching on the incoming follower scope images, comparing key features within each image with the target feature-set generated from the lead scope and adjusted for the follower scope's viewing angle. If a pattern match occurs, the location of the target within the follower scope's field-of-view is transmitted to the follower scope. (One conventional way to realize such matching is sketched below.)
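The patent does not name a specific algorithm, so the following is an assumption: a sketch of the matching step using OpenCV's ORB features.

```python
import cv2

def target_in_view(target_segment, follower_frame, min_matches=25):
    """Rough check whether the lead scope's target image segment
    appears in the follower scope's current frame.

    target_segment, follower_frame: grayscale images (numpy arrays).
    Returns matched keypoint locations in the follower frame, or
    None if too few feature matches are found."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(target_segment, None)
    kp2, des2 = orb.detectAndCompute(follower_frame, None)
    if des1 is None or des2 is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    if len(matches) < min_matches:
        return None
    return [kp2[m.trainIdx].pt for m in matches[:min_matches]]
```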
  • The follower scope knows its GPS coordinates, and it has received the approximate GPS coordinates of the target from the lead scope or network server (or has calculated the target position based on directly or indirectly wirelessly receiving the raw measurement data from the lead scope). With this information, the follower scope (or the network server or another node in the network) calculates a route between the two GPS coordinates. Unlike a vehicle route, where one effectively determines only a two-dimensional direction from point A to point B, the follower scope also determines a precise vector and range from its position to the position of the target (sketched below). Since the follower scope also has a GPS/INS device, it uses the calculated vector to the target to direct the user to point the follower scope in alignment with that vector.
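A minimal sketch of that vector-and-range calculation, in the same flat-earth frame as the earlier target-position sketch (illustrative only):

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def vector_to_target(own, target):
    """Bearing, elevation and range from the follower scope to the target.

    own, target: (lat_deg, lon_deg, alt_m) triples.
    Returns (bearing_deg clockwise from north, elevation_deg, range_m)."""
    d_north = math.radians(target[0] - own[0]) * EARTH_RADIUS_M
    d_east = (math.radians(target[1] - own[1]) * EARTH_RADIUS_M
              * math.cos(math.radians(own[0])))
    d_up = target[2] - own[2]
    horiz = math.hypot(d_north, d_east)
    bearing = math.degrees(math.atan2(d_east, d_north)) % 360.0
    elevation = math.degrees(math.atan2(d_up, horiz))
    return bearing, elevation, math.hypot(horiz, d_up)
```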
  • the preferred embodiment transmits the image and/or heat signature to the other devices in the system
  • at least a portion of the devices may not have visual displays.
  • the follower scope may rely simply on directional arrows or other indicia to direct the user of the follower scope to the target.
  • A connection between the follower scope and a pair of headphones may be used, which directs the user to move the device (e.g., up, down, left, right).
  • Range information from the rangefinder is not used for identifying the target at the follower scope. Since optical scopes and binoculars focus for variable distances, the guidance-to-target information may also include indicia to allow the user to know the correct distance to look at or focus on. In the audio embodiment, commands may be provided to focus nearer or further, look closer, or the like. Stated another way, the user is already looking along a vector calculated based on the known target location and the known location of the follower scope. The rangefinder can be used to get an idea of whether the user is looking too far beyond or too short of the target. For example, the target may be one mile away, but the user is currently looking 1.5 miles away.
  • the lead scope may incorporate cross-hairs or other target selection indicia such as a reticle to mark the target.
  • the rangefinder detects the distance to the target and the system determines the coordinates of the target and notifies the follower scopes of the target position as described above or communicates with an available network server to store the coordinates of the target.
  • the lead scope may incorporate the switch to send the information to follower scopes into a sensor on or adjacent to the trigger.
  • a more complex follower scope may include a higher resolution display and utilize augmented reality techniques to overlay visual information received from the lead scope and indicia directing the follower scope to the target onto an optical field-of-view of the follower scope.
  • An overlay may be implemented by a heads-up display or equivalent or by switching to a complete digital display.
  • terrain features may be on the path of the vector between the follower scope and the target. For example, if the lead scope is one mile due north of the target and the follower scope is two miles due south, there could be a hill between the follower scope and the target.
  • Detailed topographic maps and navigational tools are readily available. For example, software products such as Terrain Navigator Pro, commercially available from Trimble® subsidiary MyTopo™, Billings, Montana, provide detailed topographical maps of the entire U.S. and Canada and incorporate U.S. Geological Survey maps at various scales.
  • Either a computer in the lead scope or a computer in an intelligent node in the network of connected scopes can overlay the vector between the follower scope and the target onto a topographical map of the area and determine if the vector passes through a terrain feature that would make it impossible for the follower scope to see the target. If an obstruction is present, indicia that the target is blocked from view may be presented to the user of the follower scope. (A sketch of such a line-of-sight check follows.)
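A sketch of such a terrain check (illustrative; `elevation_at` is a hypothetical callable standing in for a lookup into topographic map data like the products named above):

```python
def line_of_sight_clear(start, target, elevation_at, steps=200):
    """Return False if terrain blocks the sight line from start to target.

    start, target: (x, y, alt_m) in a local map frame.
    elevation_at : assumed callable (x, y) -> terrain elevation in metres,
                   backed by topographic map data.
    Samples intermediate points along the straight sight line and flags
    any point where the terrain rises above that line."""
    for i in range(1, steps):
        f = i / steps
        x = start[0] + f * (target[0] - start[0])
        y = start[1] + f * (target[1] - start[1])
        los_alt = start[2] + f * (target[2] - start[2])
        if elevation_at(x, y) > los_alt:
            return False
    return True
```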
  • the follower scope could support the simultaneous tracking of multiple targets of interest. Instead of selecting a single target of interest from a list of available targets, the user of a follower scope would have the ability to toggle each available target as shown or hidden. If an available target is set to show, indicia would be added to the follower scope overlay, annotated with a label indicating which target of interest it is guiding towards.
  • The user of a scope could make a mistake and improperly indicate that it has selected a target previously designated by a lead scope when in fact the scope is actually designating a different target. This could occur for a variety of reasons, one example being two animals of the same type within the error box.
  • The IAMS would have the capability of comparing the two images and determining that the target images have a low probability of being the same target, and that the scope is acting as a lead scope and sending data associated with a new target.
  • With an IAMS-enabled lead scope having object classification functionality, the operator can select the type of target they are looking for from a preset list (e.g., car, person, deer), at which point an image is captured from the lead scope and the IAMS highlights any objects within the view that match the specified object type, such as with a bounding box or highlighted image segment.
  • the lead scope can then be pointed at one of the highlighted potential targets and activated to designate the target.
  • the image processing can be continuous, such that as the lead scope is moved around, any objects that are found to match the specified object type are highlighted.
  • the automatic target detection is extended to one or more follower scopes using features described in the image simulation and display of section C above.
  • the IAMS calculates how the target image should appear based on the location of a specific follower scope with respect to the lead scope.
  • The appearance factors in the angle (e.g., same angle (head on), rotated +/-90 degrees (left or right side view), or rotated 180 degrees (butt view)) and the distance (e.g., same, bigger, or smaller in size, depending upon distance to the target).
  • An image is captured from the field-of-view of the follower scope and automated pattern identification is performed to determine if the expected target image from the lead scope, as it was calculated to appear by the follower scope, is actually in the field-of-view of the follower scope. For example, if a deer is supposed to appear rotated +90 degrees, a deer that is facing the follower scope head on, as determined from the automated pattern recognition, would not likely be the correct target. However, if the deer is supposed to appear rotated +90 degrees, and a deer is determined to be in the field-of-view of the follower scope and is also determined to be rotated +90 degrees, as determined from the automated pattern recognition, the deer is likely to be the correct target.
  • the IAMS receives data regarding the focal lengths of the respective scopes so that any such adjustments can be made.
  • Preferred embodiments of the present invention may be implemented as methods, of which examples have been provided.
  • the acts performed as part of the methods may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though such acts are shown as being sequentially performed in illustrative embodiments.
  • Figure 6 is a flowchart of a process for tracking a single presumed target by a first scope and a second scope located remotely from one another and being moved by separate scope operators, wherein each of the scopes includes a plurality of measurement devices configured to provide current target position data.
  • the process is implemented by at least the following steps:
  • the first scope electronically communicates to the second scope the current target position data regarding the presumed target identified by the operator of the first scope.
  • the second scope identifies its current target position data of the second scope’s current target position using its plurality of measurement devices.
  • the processor of the second scope outputs electronically generated indicators for use by the second scope to prompt the operator of the second scope to make the position movements.
  • the operator of the second scope uses the indicators to re-position the scope from its current target position so as to move towards the target position defined by the current target position data received from the first scope.
  • the first scope electronically communicates to the network server the current target position data regarding the presumed target identified by the operator of the first scope.
  • the network server communicates to the remaining scopes the current target position data regarding the presumed target identified by the operator of the first scope.
  • the network server electronically communicates the updated current target position data regarding the presumed target to the remaining scopes that have not yet located the presumed target.
  • Figure 8 is a flowchart of a process for tracking a plurality of presumed targets by a plurality of lead scopes and one or more follower scopes located remotely from one another and being moved by separate scope operators, wherein each of the scopes includes a plurality of measurement devices configured to provide current target position data, and each of the scopes is in electronic communication with a network server.
  • The process is implemented by at least the following steps:
  • the plurality of lead scopes identify current target position data regarding a presumed target that is located by an operator of the respective lead scope, using the plurality of measurement devices in the respective lead scope.
  • the plurality of lead scopes electronically communicate to the network server (i) the current target position data regarding the presumed target identified by the operator of the respective lead scope, and (ii) information regarding each of the presumed targets.
  • the network server communicates to the one or more follower scopes (i) the current target position data regarding the presumed targets identified by the operators of the lead scopes, and (ii) the information regarding each of the presumed targets.
  • additional voice and data information may be exchanged by the hunters, such as verification that the specific target of interest (here, the prey) is within the legal limits for hunting.
  • an error box may be overlaid around the presumed target position on a display screen of the device.
  • the error box is based on the combination of the errors introduced by the lead scope and further error introduced by the follower scope.
  • The error introduced by the lead scope and follower scope is a function of, among other things, the accuracy of the sensors for position, range and orientation, the range to the target, and the optical characteristics of each scope (a sketch of how these terms might combine follows).
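Purely as an illustration of that dependence (the formula below is invented, not the patent's): angular sensor errors grow with range roughly as arc length, while rangefinder and GPS position errors add directly.

```python
import math

def error_box_m(range_m, bearing_err_deg, incl_err_deg,
                range_err_m, pos_err_m):
    """Rough horizontal and vertical half-widths (metres) of the
    target-position error box: angular errors scale with range
    (arc length = range * angle in radians); range and GPS position
    errors contribute directly."""
    horiz = range_m * math.radians(bearing_err_deg) + range_err_m + pos_err_m
    vert = range_m * math.radians(incl_err_deg) + range_err_m + pos_err_m
    return horiz, vert
```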
  • a night vision goggle viewable laser may be used to mark the target. If the follower scope has night vision capability, once the follower scope is pointed at the correct area of interest, it would be able to verify that it was looking at the correct target by observing the laser on the target.
  • a second retractable telescoping structure 96 having a second set of surveillance equipment 97 mounted thereon, and being mounted completely inside of the vehicle when fully retracted, and extending partially through the sunroof 93 when in use.
  • the second set of surveillance equipment 97 may also include one of the devices 10.
  • In its fully extended, upright position, the second retractable telescoping structure 96 also effectively functions as a mast, and the second set of surveillance equipment 97 is preferably mounted at or near a top portion of the mast.
  • The first and/or second set of surveillance equipment 95, 97 may also include the plurality of measurement devices described above that are necessary to provide current target position data. Accordingly, in this embodiment, either or both sets of surveillance equipment 95, 97 may include one of the devices 10.
  • a fixed tower 101 may include its own fixed device 10 having a scope integrated therein.
  • a fixed tower 101 may receive data from one or more of the vehicle-mounted devices 10 and handheld devices 10 for subsequent relaying to a network server.
  • This type of fixed tower is a non-device/non-scope node 12, as described above with respect to Figures 1A and 1B.
  • each of the devices 10 may function as a node 24 in the wireless communication and electronic network 18 described above.
  • the GPS coordinates of any of the devices 10 may be shared.
  • the devices 10 are shown in close proximity to each other. However, this is just for illustration purposes so as to show a plurality of different types of devices in the same surveillance environment.
  • the devices 10 may actually be miles away from each other, such as 5-10 miles from each other.
  • the sensors on the devices 10 may have large ranges, such as up to 7.5 miles for target detection. Accordingly, Figure 9A is not to scale.
  • A first scope scans an area and identifies a stationary or moving target (i.e., an object of interest), and reports position data of the target either directly to a second scope, or to a network server that the second scope is in communication with so as to obtain the position data.
  • the second scope obtains the position data and is provided with position movement (re-positioning data) so as to locate the target.
  • the vehicle that the second scope is mounted to or integrated into is directed to move to a new and "better location" (improved location) for the second scope to view the target.
  • a better location may be defined by one or more factors, such as being closer to the target, having a less obstructed view of the target, being at a higher elevation to view the target, or being at the best position for capturing biometric data of the target (e.g., a face of a person or animal).
  • The improved location may be improved relative to the vehicle's current position, and/or improved relative to the current location of the first scope (a sketch of scoring candidate locations against these factors follows).
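As a sketch only (criteria and weights are invented for illustration), a routing component might score candidate vehicle positions against the factors above, reusing the `line_of_sight_clear` check sketched earlier:

```python
import math

def score_position(candidate, target, clear_view, min_standoff_m=50.0):
    """Score a candidate vehicle position for viewing the target; higher
    is better. candidate, target: (x, y, alt_m) in a local map frame.
    clear_view: callable (candidate, target) -> bool, e.g. the
    line_of_sight_clear sketch shown earlier."""
    dist = math.dist(candidate[:2], target[:2])
    if dist < min_standoff_m:
        return float("-inf")        # inside the covert standoff distance
    score = -dist                   # closer is better...
    score += 0.5 * (candidate[2] - target[2])   # ...as is higher ground
    if not clear_view(candidate, target):
        score -= 1e6                # an obstructed view disqualifies
    return score
```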
  • the second scope also reports back the target position data directly to the first scope, or to a network server that the first scope is in communication with so as to obtain the position data.
  • the first scope may then use this position data to assist in better identifying position data of the target.
  • the truck operator may receive directions (position movements) regarding where to move the truck, so that a mast-mounted scope can better see the target. Once the truck is in a better location, it may still be necessary for the scope to be re-oriented/repositioned.
  • The process for getting the second scope into the best position to view the target may involve two separate processes, namely (1) moving the vehicle (that the second scope is mounted to or integrated into) to a better location, and (2) re-orienting/repositioning the second scope.
  • This process may be iterative, in that the second scope may be continuously re-oriented/repositioned as the vehicle position changes.
  • A first scope scans an area and identifies a stationary or moving target (i.e., an object of interest), and reports position data of the target either directly to a vehicle which is remote from the first scope and that includes the second scope mounted to or integrated therein, or to a network server that the vehicle is in communication with so as to obtain the position data.
  • The vehicle that the second scope is mounted to or integrated into obtains the position data and is provided with position movement data so as to move the vehicle to a particular location (e.g., the "better location" described above) that would allow the second scope to view the target.
  • the second scope attempts to locate the target using the position data from the first scope.
  • the vehicle and/or the second scope may then be iteratively moved or re-positioned in the same manner as described above in Example 1.
  • Example 2 differs from Example 1 in that the second scope does not attempt to locate the target until the vehicle is first moved to a new location based on the position data of the target received from the first scope.
  • This example illustrates another embodiment that relies upon a network of scopes, as shown in Figures 1A and 1B.
  • The first scope or the network server has knowledge of the positions of the other scopes.
  • a first scope which initially acts as a lead scope, scans an area and identifies a stationary or moving target (i.e., object of interest), but the first scope has a poor view of the target.
  • the first scope or the network server directs the second scope to locate the target using the position data from the first scope.
  • the second scope then takes over as the lead scope, and sends its newly collected target position data to the other scopes (including the first scope) so that the other scopes can better locate and track the target.
  • The scope with the best view may be a scope within the network of scopes that is closest to the target, has the least obstructed view of the target, is at the best elevation to view the target, is in the best position for capturing biometric data of the target (e.g., a face of a person or animal), or is in the best position to shoot a projectile (e.g., a bullet) at the target or at a specific part of the target.
  • the operator of the second scope uses indicators to re-position the second scope from its current target position so as to move towards the target position defined by the current target position data received from the first scope.
  • the second scope uses electronic control signals to re-position the second scope from its current target position so as to move towards the target position defined by the current target position data received from the first scope. This may involve physically or electronically rotating and/or pivoting the second scope with respect to its mounting, such as by using a pan-tilt mechanism described below, and/or by changing optical parameters of the second scope. An operator may direct such re-positioning movements by viewing a display of the second scope, and causing appropriate electronic control signals to be generated.
  • GOOGLE Maps, APPLE Maps
  • conventional prompts may be given to the vehicle operator to move the vehicle to the improved location for allowing the second scope to view the target from the improved location.
  • topographical maps may be used and the vehicle is repositioned using the shortest path to the improved position that is feasible based on any determined terrain obstructions that are identified as being between the vehicle and the target location.
  • the improved (better) location of a vehicle having a second scope mounted to or integrated into the vehicle will meet one or more of the following conditions relative to the vehicle’s first position, or the position of the first scope:
  • The target is a person or animal (a "person" is used in the following description for convenience of explanation), and it is necessary for the second scope to see facial details of the person so as to track the person and/or perform facial recognition of the person.
  • the goal, or at least the initial goal, is not to come right up to the target, but instead the goal is to be positioned at a sufficiently close distance so that the target can be viewed, typically in a covert manner. Thus, there may be a minimum distance that should be kept between the scope and the target, such as 50 meters.
  • the first step in the process is to calculate how close the scope must be to the person so as to capture sufficient facial features that would allow the algorithm to obtain an accurate facial signature. This will depend on algorithm inputs since different algorithms use different facial features, and it will also depend upon scope optics such as lens quality, optical zoom, and the quality of any digital zoom. This distance may be determined experimentally before a scope is deployed in a surveillance environment. Consider an example wherein a scope containing very high quality optics can create accurate facial signatures at distances up to 150 meters. This means that the scope (and thereby the vehicle that has the scope mounted to or integrated therein) should be positioned 150 meters or less from the target.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Optics & Photonics (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

A network of scopes, including one or more lead scopes and one or more follower scopes, is provided to allow the respective scopes to track the same presumed target. A lead scope locates a target and communicates target position data of the presumed target to the follower scope. The follower scope uses the target position data and its own position data to electronically generate control signals for use by the follower scope to make position movements so as to re-position the follower scope from its current target position toward the target position defined by the target position data received from the lead scope. At least the second scope is mounted to, or integrated into, a vehicle, which uses the target position data to move to a new location so as to allow the second scope to better view the target.
PCT/US2020/016619 2019-02-11 2020-02-04 Vehicle-mounted devices with network-connected scopes for allowing a target to be simultaneously tracked by multiple other devices WO2020167530A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
KR1020217028143A KR20210133972A (ko) 2019-02-11 2020-02-04 Vehicle-mounted device with network-connected scopes for allowing a target to be simultaneously tracked by multiple other devices
EP20755165.6A EP3924683A4 (fr) 2019-02-11 2020-02-04 Vehicle-mounted devices with network-connected scopes for allowing a target to be simultaneously tracked by multiple other devices
CN202080013865.5A CN113424012B (zh) 2019-02-11 2020-02-04 Vehicle-mounted device with network-connected scopes for allowing a target to be simultaneously tracked by multiple other devices

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/272,733 2019-02-11
US16/272,733 US10408573B1 (en) 2017-08-11 2019-02-11 Vehicle-mounted device with network-connected scopes for allowing a target to be simultaneously tracked by multiple other devices

Publications (1)

Publication Number Publication Date
WO2020167530A1 (fr)

Family

ID=72045013

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/016619 WO2020167530A1 (fr) 2019-02-11 2020-02-04 Vehicle-mounted devices with network-connected scopes for allowing a target to be simultaneously tracked by multiple other devices

Country Status (4)

Country Link
EP (1) EP3924683A4 (fr)
KR (1) KR20210133972A (fr)
CN (1) CN113424012B (fr)
WO (1) WO2020167530A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI791313B * 2021-10-28 2023-02-01 為昇科科技股份有限公司 Radar self-calibration device and radar self-calibration method
CN114285998A * 2021-12-24 2022-04-05 申通庞巴迪(上海)轨道交通车辆维修有限公司 Video surveillance system for dynamic portrait capture and position tracking in train carriages

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080118104A1 (en) * 2006-11-22 2008-05-22 Honeywell International Inc. High fidelity target identification and acquisition through image stabilization and image size regulation
US8020769B2 (en) * 2007-05-21 2011-09-20 Raytheon Company Handheld automatic target acquisition system
EP2511658A1 * 2011-04-14 2012-10-17 Hexagon Technology Center GmbH Measuring system and method for determining a new point
EP2557392A1 * 2011-08-11 2013-02-13 Leica Geosystems AG Measuring device and method with a scalable targeting functionality based on the orientation of a remote control unit
DE102013008568A1 * 2013-05-17 2014-11-20 Diehl Bgt Defence Gmbh & Co. Kg Method for target designation for a missile launching system
WO2015199780A2 (fr) * 2014-04-01 2015-12-30 Baker Joe D Système mobile d'affichage pour la visée d'une cible et de traitement balistique
CN106643700B * 2017-01-13 2018-05-15 中国人民解放军防空兵学院 Positioning and orientation monitoring system and method
CN107014378A * 2017-05-22 2017-08-04 中国科学技术大学 Line-of-sight tracking, aiming and control system and method

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4255013A (en) * 1979-05-17 1981-03-10 John E. McNair Rifle scope having compensation for elevation and drift
US4949089A (en) 1989-08-24 1990-08-14 General Dynamics Corporation Portable target locator system
US5568152A (en) 1994-02-04 1996-10-22 Trimble Navigation Limited Integrated image transfer for remote target location
US20040134113A1 (en) * 2002-08-02 2004-07-15 Deros Mark A. Adjustable gun rest apparatus
US20070050139A1 (en) 2005-04-27 2007-03-01 Sidman Adam D Handheld platform stabilization system employing distributed rotation sensors
US9813618B2 (en) 2012-11-02 2017-11-07 Diversified Innovations Fund, Lllp Wide area imaging system and method
US9612088B2 (en) * 2014-05-06 2017-04-04 Raytheon Company Shooting system with aim assist
US20170302852A1 (en) 2016-04-13 2017-10-19 Jason Tze Wah Lam Three Axis Gimbals Stabilized Action Camera Lens Unit
US20190049219A1 (en) * 2017-08-11 2019-02-14 Douglas FOUGNIES Devices with network-connected scopes for allowing a target to be simultaneously tracked by multiple devices
US10267598B2 (en) * 2017-08-11 2019-04-23 Douglas FOUGNIES Devices with network-connected scopes for allowing a target to be simultaneously tracked by multiple devices

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3924683A4

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11378358B2 (en) * 2018-10-15 2022-07-05 Towarra Holdings Pty. Ltd. Target display device
US11821996B1 (en) * 2019-11-12 2023-11-21 Lockheed Martin Corporation Outdoor entity and weapon tracking and orientation
CN117237560A * 2023-11-10 2023-12-15 腾讯科技(深圳)有限公司 Data processing method and related apparatus
CN117237560B (zh) * 2023-11-10 2024-02-23 腾讯科技(深圳)有限公司 Data processing method and related apparatus

Also Published As

Publication number Publication date
CN113424012B (zh) 2023-04-25
CN113424012A (zh) 2021-09-21
EP3924683A4 (fr) 2022-11-16
EP3924683A1 (fr) 2021-12-22
KR20210133972A (ko) 2021-11-08

Similar Documents

Publication Publication Date Title
US11226175B2 (en) Devices with network-connected scopes for allowing a target to be simultaneously tracked by multiple devices
US11555671B2 (en) Devices with network-connected scopes for allowing a target to be simultaneously tracked by multiple other devices
US11423586B2 (en) Augmented reality vision system for tracking and geolocating objects of interest
CN113424012B (zh) 具有网络连接瞄准镜以允许多个其他装置同时跟踪目标的车载装置
US9335121B2 (en) System and method of locating prey
US10408574B2 (en) Compact laser and geolocating targeting system
US20070127008A1 (en) Passive-optical locator
US20190014225A1 (en) Method and system for integrated optical systems
Neuhöfer et al. Adaptive information design for outdoor augmented reality

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20755165

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020755165

Country of ref document: EP

Effective date: 20210913