WO2007074444A1 - Imaging system for viewing around an obstruction

Imaging system for viewing around an obstruction

Info

Publication number
WO2007074444A1
WO2007074444A1 (PCT application PCT/IL2006/001482)
Authority
WO
WIPO (PCT)
Prior art keywords
imaging unit
cameras
scene
viewing
camera
Prior art date
Application number
PCT/IL2006/001482
Other languages
French (fr)
Inventor
Shimon Simhony
Original Assignee
Israel Aerospace Industries Ltd.
Priority date
Filing date
Publication date
Application filed by Israel Aerospace Industries Ltd. filed Critical Israel Aerospace Industries Ltd.
Publication of WO2007074444A1 publication Critical patent/WO2007074444A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources


Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention provides an imaging system and method for obtaining images of a scene. Images of the scene are obtained by an imaging unit containing two or more cameras, where each camera has a different orientation in the imaging unit, and a transmitter transmitting signals indicative of images obtained by the cameras. Images obtained by the imaging unit are viewed on a viewing unit having a receiver receiving signals transmitted by the transmitter; and one or more viewing screens. The imaging unit is sent into a trajectory in the air and a camera in the imaging unit having a reference orientation is identified. One or more cameras viewing the scene are then determined.

Description

IMAGING SYSTEM FOR VIEWING AROUND AN OBSTRUCTION
FIELD OF THE INVENTION
This invention relates to imaging systems and more particularly to systems for imaging a scene behind an obstacle, especially imaging for reconnaissance purposes.
BACKGROUND OF THE INVENTION
There are instances when it is necessary or desirable to view or obtain an image of a scene under circumstances where the view of the scene is obstructed, for example, by a building, wall or hill. Infantry and other ground forces in hostile territory, for example, may require short-range, real-time visual reconnaissance of a scene whose view is obstructed. It is known in these cases to use a periscope for viewing around a corner of a building, over a building or wall, or beyond a hill. US Patent 4,123,151 to Aurin discloses a periscope allowing viewing along two 180°-opposed viewing axes.
It is also known to send a camera into a trajectory to obtain images of a scene. The camera may be shot from a firearm or a mortar.
SUMMARY OF THE INVENTION
The present invention provides an imaging system. The imaging system of the invention includes an imaging unit and a viewing unit. The imaging unit comprises two or more cameras having different, fixed orientations in the imaging unit. The cameras may be video or still cameras, and digital or analog. For example, the imaging unit may be cubical in shape and have six cameras, each viewing through an aperture in a different face of the cube. Images obtained by each camera are transmitted to the viewing unit for viewing by an operator.
The imaging unit may be placed in the scene. For example, the imaging unit may be fixed at the end of a pole, or suspended from a tree or other stationary object. In one preferred embodiment the imaging unit is adapted to be sent into a trajectory through the air in the vicinity of the scene to be imaged and viewed. The imaging unit may be thrown into the air by hand at the scene where images are to be obtained, or shot into the air by a device that may be, for example, a spring-loaded mortar, a sling, or a firearm such as a rifle or standard mortar. As the imaging unit travels in its trajectory, images of the scene are obtained simultaneously from different perspectives by the cameras. The system of the invention may be used, for example, to obtain images of a scene located on a second side of a wall, where a person's view of the scene is obstructed by the wall. In order to obtain images of the scene, the imaging unit is sent into a trajectory located on the person's side of the wall. When the imaging unit is located in its trajectory at a height above the ground that is greater than the height of the wall, one or more of the cameras associated with the imaging unit will have an orientation in space allowing the camera to obtain images of at least a portion of the scene located on the other side of the wall. While the imaging unit is traveling in its trajectory, images obtained by the cameras are transmitted to the viewing unit and displayed on one or more display screens on the viewing unit. Software provides the observer with spatial orientation.
Thus, in one of its aspects, the invention provides an imaging system for obtaining images of a scene, comprising: (a) an imaging unit comprising a housing containing two or more cameras, each camera having a different orientation inside the housing, and further including a transmitter transmitting signals indicative of images obtained by the cameras,
(b) a viewing unit comprising
(i) a receiver receiving signals transmitted by the transmitter; and (ii) one or more viewing screens displaying images obtained by one or more of the cameras; and
(c) a processor configured to: (i) receive an input indicating a camera having a reference orientation; and (ii) determine from the input one or more cameras viewing the scene. In another of its aspects, the invention provides a method for obtaining images of a scene comprising: (a) sending into a trajectory in the air an imaging unit comprising a housing containing two or more cameras, each camera having a different orientation inside the housing,
(b) identifying a camera having a reference orientation; and (c) determining from the camera having the reference orientation one or more cameras viewing the scene.
BRIEF DESCRIPTION OF THE DRAWINGS
In order to understand the invention and to see how it may be carried out in practice, a preferred embodiment will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:
Fig. 1 is a schematic illustration of an imaging system in accordance with one embodiment of the invention;
Fig. 2 shows an imaging unit for use in the system of Fig. 1 having a cubic housing (Fig. 2a), a tetrahedral housing (Fig. 2b) and an aerodynamically efficient housing (Fig. 2c); Fig. 2d shows catapulting devices that may be used with the imaging system of the invention; and
Fig. 3 shows use of the imaging system of Fig. 1 in imaging a scene.
DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
Fig. 1 shows schematically an imaging system 1 in accordance with one embodiment of the invention. The system 1 includes an imaging unit 2 and a viewing unit 4. The imaging unit 2 comprises two or more cameras 6 (four cameras in this case). Each camera 6 outputs a respective digital or analog signal 8 indicative of an image or images captured by the camera. The signal 8 is input to a signal conditioning and multiplexing unit 9 that sequentially feeds the images obtained by the cameras 6 to a transmitter 10 having an antenna 13. The transmitter 10 is configured to convert each signal 8 into a corresponding electromagnetic signal 11 that is transmitted to the viewing unit 4. The signals 11 include an identification of the camera 6 that obtained the image encoded by the signal 11. The viewing unit 4 includes a receiver 12 having an antenna 15 configured to receive the signals 11 transmitted from the transmitter 10 and convert them into digital signals 18. The receiver 12 is associated with a processing unit 16 that includes a memory 14 for storing images received by the receiver 12 from the transmitter 10. The processor 16 is configured to determine from a signal 18 the identity of the camera that obtained the image encoded in that signal, to mosaic the signals 18 into a single image, and to display the images as a mosaic on the display screen 20. Thus, for example, images obtained by the cameras 6a, 6b, 6c or 6d will be displayed on the fields 20a, 20b, 20c or 20d, respectively, on the viewing screen 20.
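The tag-then-route scheme described above (each transmitted image carries the identity of its originating camera, and the viewing unit routes it to the corresponding display field) can be sketched in Python. This is an illustrative sketch only, not part of the disclosure; the names `Frame`, `multiplex` and `demultiplex` are invented here, and real image payloads are replaced by placeholder bytes:

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Frame:
    camera_id: int   # identification of the camera that obtained the image
    pixels: bytes    # stand-in for the image payload

def multiplex(latest: Dict[int, bytes]) -> List[Frame]:
    # Signal conditioning and multiplexing unit: tag each camera's
    # latest image with its camera ID and emit the frames sequentially.
    return [Frame(cid, img) for cid, img in sorted(latest.items())]

def demultiplex(stream: List[Frame]) -> Dict[int, bytes]:
    # Viewing-unit side: route each received frame to the display
    # field assigned to its originating camera (field index == camera ID).
    fields: Dict[int, bytes] = {}
    for frame in stream:
        fields[frame.camera_id] = frame.pixels
    return fields

# Four cameras (as in Fig. 1) each contribute one image per cycle:
captured = {0: b"img-a", 1: b"img-b", 2: b"img-c", 3: b"img-d"}
fields = demultiplex(multiplex(captured))
print(fields[2])   # prints b'img-c' -- shown in the field for the third camera
```

The camera ID embedded in each frame plays the role of the identification carried by the signals 11, which is what lets the processor keep each display field bound to one fixed camera even though all images share one transmission channel.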
Fig. 2 shows various possible configurations of the imaging unit 2. Fig. 2a shows an imaging unit 2a that comprises a cubic housing 22 (shown in broken lines in Fig. 2a) containing six cameras 6. Each of the six surfaces 24 of the cubic housing 22 has an aperture 26 through which one of the cameras 6 obtains images of the scene through its respective lens 27 (only three of the six surfaces 24 and six apertures 26 are visible in the perspective of Fig. 2a). The six cameras 6 are thus oriented in six different directions, so that up to six different perspectives of the scene may be obtained simultaneously by the imaging unit 2a. The lenses 27 are selected so that a complete spherical view around the imaging unit 2 is obtained. For example, in the case of the cubic imaging unit of Fig. 2a, a lens on each camera having an angular field of view of over 90° would provide a spherical view with some overlap of the fields of view of neighboring cameras. The imaging unit 2a also includes a transmitter 10 and a signal conditioning and multiplexing unit 9.
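The 90° figure for the cubic unit can be checked numerically: a square field of view of angle f per axis subtends a solid angle of 4·arcsin(sin²(f/2)) steradians, and at exactly f = 90° this equals 4π/6, i.e. precisely one sixth of the sphere, so any field of view over 90° on all six cameras produces overlap. A short illustrative verification (not from the patent):

```python
import math

def square_fov_solid_angle(fov_deg: float) -> float:
    # Solid angle (steradians) of a square pyramid whose full apex
    # angle per axis is fov_deg: Omega = 4 * arcsin(sin^2(fov/2)).
    half = math.radians(fov_deg) / 2.0
    return 4.0 * math.asin(math.sin(half) ** 2)

per_face = 4.0 * math.pi / 6.0   # each of six cube faces must cover ~2.094 sr

# A 90-degree square FOV covers exactly one sixth of the sphere:
print(square_fov_solid_angle(90.0))   # ~2.0944, equal to 4*pi/6
# Anything above 90 degrees gives overlap between neighbouring cameras:
print(square_fov_solid_angle(100.0) > per_face)   # True
```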
Fig. 2b shows an imaging unit 2b that comprises a tetrahedral housing 28 (shown in broken lines in Fig. 2b) containing four cameras 6. Each of the four surfaces 30 of the tetrahedral housing 28 has an aperture 32 through which one of the cameras 6 obtains images of the scene (only three of the surfaces 30 and apertures 32 are visible in the perspective of Fig. 2b). The four cameras 6 are thus oriented in four different directions, so that up to four different perspectives of the scene may be obtained simultaneously by the imaging unit 2b. The imaging unit 2b also includes a transmitter 10 and a signal conditioning and multiplexing unit 9.
The imaging units 2a and 2b are configured to be thrown into the air at the scene where images are to be obtained. Fig. 2c shows an imaging unit 2c configured to be shot into the air by a catapulting device 34. The catapulting device may be a spring-loaded mortar 34a, a sling 34b, a crossbow 34c, or a firearm 34d. The imaging unit 2c includes an aerodynamically efficient housing 36 that allows the imaging unit 2c to be thrown or fired from the shooting or catapulting device 34 and to travel through the air in a desired orientation. The imaging unit 2c also has fins 39 to orient the head 41 in the direction of flight. The housing 36 contains four cameras 6. Each of the four cameras 6 obtains images of the scene through a respective aperture 37 in the housing 36. The four cameras 6 have different fixed orientations in the housing 36, so that images of the scene may be obtained simultaneously from up to four different perspectives with the imaging unit 2c. A fifth camera is mounted so that it points down during flight; this camera provides the spatial orientation. The imaging unit 2c also includes a transmitter 10 or a recording device.
Fig. 3 shows use of the imaging system 1 to obtain images of a scene 45. An operator 42 standing on a first side 44 of a wall 40 wishes to obtain images of a portion of the scene located on the second side 48 of the wall 40. The operator's view of the scene 45 is obstructed by the wall 40. In order to obtain images of the scene 45, the imaging unit 2 is sent into a trajectory 50 located on the operator's side 44 of the wall 40. The imaging unit 2 may be sent into the trajectory 50 by the operator 42 throwing the imaging unit 2 into the air, as shown in Fig. 3. Alternatively, the imaging unit 2 may be sent into the trajectory 50 by firing the imaging unit 2 from a catapulting device (not shown), that may be, for example, any one of the catapulting devices mentioned in reference to Fig. 2.
When the imaging unit 2 is located at a height above the ground that is greater than the height of the wall 40 (e.g. the position 52 of the imaging unit 2) one or more of the cameras 6 associated with the imaging unit 2 will have an orientation in space allowing the camera to obtain images of at least a portion of the scene 45. As the imaging unit 2 travels through the trajectory 50, it may tumble, so that the subset of cameras 6 associated with imaging unit 2 that have an orientation in space allowing them to obtain images of at least a portion of the scene 45 may change.
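The duration of the useful viewing interval follows from elementary ballistics: for a unit thrown straight up at speed v₀, its height is h(t) = v₀t − gt²/2, and it is above a wall of height H between the two roots of v₀t − gt²/2 = H. A minimal illustrative sketch (the numbers are assumptions for the example, not values from the patent, and drag is ignored):

```python
import math

def viewing_window(v0: float, wall_height: float, g: float = 9.81):
    # Interval (t1, t2) in seconds during which a unit thrown straight
    # up at v0 m/s is above wall_height metres, solving
    # v0*t - g*t^2/2 = wall_height; None if the apex never clears the wall.
    disc = v0 * v0 - 2.0 * g * wall_height
    if disc < 0:
        return None   # apex (v0^2 / 2g) is below the top of the wall
    root = math.sqrt(disc)
    return ((v0 - root) / g, (v0 + root) / g)

# An ~8 m/s vertical throw against a 3 m wall:
window = viewing_window(8.0, 3.0)   # roughly (0.58 s, 1.05 s)
```

A window of roughly half a second per throw illustrates why the unit transmits continuously and why several cameras with different fixed orientations are needed: there is no time to aim a single camera during the flight.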
While the imaging unit 2 is traveling in the trajectory 50, images obtained by the cameras 6 associated with the imaging unit 2 are processed, multiplexed and transmitted to the viewing unit 4, and then displayed in one or more of the fields 20a to 20f on the display screen 20 associated with the viewing unit 4. In this embodiment, the viewing unit 4 is a hand-held unit that allows the operator 42 to simultaneously view, in real time, the images obtained by the cameras 6. The viewing unit 4 may be provided with an input device, such as a touch screen or a keypad 54 containing a number of keys 56 (see also Fig. 1). When the operator identifies an image of interest from among the displayed images, a key 56 is pressed, or the corresponding field in the touch screen is touched, identifying the viewing field of the screen 20 on which the image is displayed; the selected image of interest is then displayed alone on the screen 20 and fills the entire screen. At any moment, as the imaging unit 2 tumbles in its trajectory 50, at least one of the cameras 6 will be oriented so as to obtain images of the ground below the imaging unit (i.e. on the first side 44 of the wall 40). These images may contain landmarks identifiable by the operator. When the operator identifies in an image a view of the first side 44 of the wall 40, an input is made identifying the field of the viewing screen 20 on which the image is displayed. In this image an identifiable feature can be observed, such as the operator with an extended hand pointing in the direction of the scene 45. The operator then touches the image in two places, for example, the image of his head and the tip of his outstretched arm, thus indicating the desired observation direction. The processor 16 is configured to determine, on the basis of the camera 6 that is currently oriented towards the ground on the first side 44 of the wall 40, which of the cameras 6 is currently oriented towards the scene, and to display images from that camera on the screen 20.
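The camera-selection step relies on the cameras' orientations being fixed relative to one another inside the housing: once one camera (the reference) is known to point at the ground, the processor knows how every other camera is oriented, and can pick the one whose axis lies closest to the desired observation direction. A minimal sketch of that final selection, assuming the cubic unit of Fig. 2a and a target direction already expressed in the unit's body frame (the function name and axis labels are invented for the sketch; recovering the body frame from the reference camera and the operator's two touch points is omitted):

```python
from typing import Dict, Tuple

Vec = Tuple[float, float, float]

# Fixed camera axes in the imaging unit's own (body) frame -- the six
# face normals of the cubic housing, labelled illustratively:
CAMERA_AXES: Dict[str, Vec] = {
    "+x": (1, 0, 0), "-x": (-1, 0, 0),
    "+y": (0, 1, 0), "-y": (0, -1, 0),
    "+z": (0, 0, 1), "-z": (0, 0, -1),
}

def dot(a: Vec, b: Vec) -> float:
    return sum(x * y for x, y in zip(a, b))

def scene_camera(target_dir_body: Vec) -> str:
    # Pick the camera whose fixed axis points most nearly along the
    # desired observation direction (largest dot product). Because the
    # axes are fixed in the housing, knowing which camera faces the
    # ground fixes how world directions map into this body frame.
    return max(CAMERA_AXES, key=lambda cid: dot(CAMERA_AXES[cid], target_dir_body))

# If the scene lies roughly along +x with a slight upward tilt:
print(scene_camera((0.9, 0.1, 0.2)))   # prints +x
```

As the unit tumbles, re-running this selection with the updated body-frame direction switches the full-screen display to whichever camera currently faces the scene.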
After the imaging unit 2 has landed on the ground (the position 53 in the trajectory), it may be retrieved by the operator and used again. The imaging unit 2 may be provided with a tether (not shown) that tethers the imaging unit to the operator or other object. The imaging unit may then be drawn to the operator by pulling on the tether.

Claims

CLAIMS:
1. An imaging system for obtaining images of a scene, comprising:
(a) an imaging unit comprising a housing containing two or more cameras, each camera having a different orientation inside the housing, and further including a transmitter transmitting signals indicative of images obtained by the cameras,
(b) a viewing unit comprising
(i) a receiver receiving signals transmitted by the transmitter; and
(ii) one or more viewing screens displaying images obtained by one or more of the cameras; and
(c) a processor configured to: (i) receive an input indicating a camera having a reference orientation; and (ii) determine from the input one or more cameras viewing the scene.
2. The system according to Claim 1 further comprising a signal conditioning and multiplexing unit.
3. The imaging system according to Claim 1 wherein the housing of the imaging unit is cubic or tetrahedral in shape.
4. The system according to Claim 1 or 2 wherein the viewing unit has a viewing screen having a number of viewing fields equal to the number of cameras.
5. The system according to Claim 1 wherein the input is input to the processor by touching a field displaying an image having the first orientation.
6. The system according to any one of the previous claims further comprising a device for sending the imaging unit into a trajectory in the air.
7. The system according to Claim 6 wherein the device for sending the imaging unit into a trajectory is a firearm, catapult or sling.
8. A method for obtaining images of a scene comprising:
(a) sending into a trajectory in the air an imaging unit comprising a housing containing two or more cameras, each camera having a different orientation inside the housing,
(b) identifying a camera having a reference orientation; and (c) determining from the camera having the reference orientation one or more cameras viewing the scene.
9. The method according to Claim 8 further comprising displaying one or more images obtained by one or more of the cameras observing the scene.
10. The method according to Claim 8 or 9 wherein the imaging unit is sent into a trajectory in the air by being thrown by an operator into the air.
11. The method according to Claim 8 or 9 wherein the imaging unit is sent into a trajectory in the air by being shot into the air by a firearm or catapult.
12. The method according to Claim 8 wherein the step of identifying a camera having a reference orientation includes touching a field in a viewing unit displaying an image obtained by a camera having the reference orientation.
13. The method according to any one of Claims 8 to 12 further comprising a step of inputting a direction from a user to the scene.
14. The method according to Claim 13 wherein the step of inputting a direction from a user to the scene includes indicating a first point and a second point in an image obtained by a camera having the reference orientation.
PCT/IL2006/001482 2005-12-29 2006-12-26 Imaging system for viewing around an obstruction WO2007074444A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IL172904 2005-12-29
IL172904A IL172904A (en) 2005-12-29 2005-12-29 Imaging system for viewing around an obstruction

Publications (1)

Publication Number Publication Date
WO2007074444A1 (en) 2007-07-05

Family

ID=37882265

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2006/001482 WO2007074444A1 (en) 2005-12-29 2006-12-26 Imaging system for viewing around an obstruction

Country Status (2)

Country Link
IL (1) IL172904A (en)
WO (1) WO2007074444A1 (en)



Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050207487A1 (en) * 2000-06-14 2005-09-22 Monroe David A Digital security multimedia sensor
US6924838B1 (en) * 2000-07-31 2005-08-02 Charlton Nieves Grenade cam
US20050206729A1 (en) * 2001-07-11 2005-09-22 Chang Industry, Inc. Deployable monitoring device having self-righting housing and associated method
WO2005048586A1 (en) * 2003-11-12 2005-05-26 Bae Hun Kim Camera device for 360-degree panorama shot and operation method thereof

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN109425867A (en) * 2017-08-28 2019-03-05 意法半导体(鲁塞)公司 For determining the device and method of presence and/or the movement of the intracorporal object of shell
CN109425867B (en) * 2017-08-28 2023-10-03 意法半导体(鲁塞)公司 Apparatus and method for determining the presence and/or movement of an object within a housing

Also Published As

Publication number Publication date
IL172904A (en) 2010-12-30
IL172904A0 (en) 2007-02-11

Similar Documents

Publication Publication Date Title
US10114127B2 (en) Augmented reality visualization system
US10341162B2 (en) Augmented reality gaming system
US6933965B2 (en) Panoramic aerial imaging device
US7787012B2 (en) System and method for video image registration in a heads up display
KR101572896B1 (en) Tank around the battlefield situational awareness system
US20150054826A1 (en) Augmented reality system for identifying force capability and occluded terrain
CN105300175B (en) The sniperscope that a kind of infrared and low-light two is blended
US20090292467A1 (en) System, method and computer program product for ranging based on pixel shift and velocity input
EP3098624A1 (en) A method and apparatus for determining a deviation between an actual direction of a launched projectile and a predetermined direction
US20110291918A1 (en) Enhancing Vision Using An Array Of Sensor Modules
US8279266B2 (en) Video system using camera modules to provide real-time composite video image
WO2016115619A1 (en) A sensor pack for firearm
KR102125299B1 (en) System and method for battlefield situation recognition for combat vehicle
RU2697047C2 (en) Method of external target designation with indication of targets for armament of armored force vehicles samples
GB2117609A (en) Field of view simulation for weapons training
WO2014111947A1 (en) Gesture control in augmented reality
KR101846993B1 (en) Naval gun zero point control system using drone
CN113848992A (en) Target detection location and automatic shooting system based on unmanned aerial vehicle and armed beating robot
WO2007074444A1 (en) Imaging system for viewing around an obstruction
RU2403526C2 (en) System for aiming firing from shelter
US11460270B1 (en) System and method utilizing a smart camera to locate enemy and friendly forces
CN112704875A (en) Virtual item control method, device, equipment and storage medium
CN110274522A (en) A kind of unmanned battlebus simulated testing system
KR101866218B1 (en) Naval gun zero point control method using drone
KR101779199B1 (en) Apparatus for recording security video

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06821664

Country of ref document: EP

Kind code of ref document: A1