EP2912835A1 - Remote surveillance sensor apparatus - Google Patents
Remote surveillance sensor apparatus
Info
- Publication number
- EP2912835A1 (application EP13849824.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- data
- remote sensor
- sensor apparatus
- portable receiver
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B37/00—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
- G03B37/04—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with cameras or projectors providing touching or overlapping fields of view
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19617—Surveillance camera constructional details
- G08B13/19621—Portable camera
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19654—Details concerning communication with a camera
- G08B13/1966—Wireless systems, other than telephone systems, used to communicate with a camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
Definitions
- the disclosed methods and apparatus relate to a surveillance system including a remote sensor apparatus for collecting and transmitting image and other sensor data of a remote location and a receiver for processing and displaying the collected image and sensor data.
- a method for collecting and processing data related to a surrounding environment of a remote sensor apparatus including image data for a scene external to the housing of the remote sensor apparatus for a surveillance system including the remote sensor apparatus for collecting data and a portable receiver for receiving data collected by at least one sensor installed in the remote sensor apparatus.
- the portable receiver is capable of running an application program for processing the received data and displaying the processed data on a display screen of the portable receiver.
- the remote sensor apparatus has a spherical housing for containing a processing unit, a plurality of image sensors coupled to the processor, at least one swappable sensor, a plurality of light sources, an inertial measurement unit and a wireless transceiver and each of the plurality of image sensors is fitted with a wide-angle lens.
- the method includes receiving, at the processing unit, raw image data simultaneously from the plurality of image sensors as the raw image data is being captured by the plurality of image sensors.
- the raw image data is captured by each of the plurality of image sensors through a wide-angle lens and at least one of the plurality of light sources is placed around the wide-angle lens.
- a method for a surveillance system including a plurality of remote sensor apparatus that each is configured for collecting data and a portable receiver for receiving data collected by one or more of the plurality of remote sensor apparatus.
- the portable receiver includes a radio-frequency identification (RFID) reader and is capable of running an application program for processing the received data and displaying the processed data on a display screen of the portable receiver.
- the remote sensor apparatus has a housing for containing a processing unit, a plurality of sensors coupled to the processor, an RFID unit and a wireless transceiver.
- the method includes reading, at the portable receiver, radio-frequency identification (RFID) stored in the RFID unit of one or more of the plurality of remote sensor apparatus using the RFID reader.
- the method also includes registering, at the portable receiver, the one or more of the plurality of remote sensor apparatus using the received RFID.
- the method further includes receiving, at the portable receiver, first data from a first active remote sensor apparatus.
- the received first data includes first sensor data collected from at least one of a first plurality of sensors installed in the first active remote sensor and RFID of the first active remote sensor apparatus.
- the method also includes processing, at the portable receiver, the first sensor data received from the first active remote sensor apparatus, if the RFID of the first active remote sensor apparatus matches the RFID of any of the one or more of the plurality of remote sensor apparatus.
- the method further includes registering, at the portable receiver, the one or more of the plurality of remote sensor apparatus using the ID code.
- the method also includes receiving, at the portable receiver, first data from a first active remote sensor apparatus.
- the received first data includes first sensor data collected from at least one of a first plurality of sensors installed in the first active remote sensor and ID code of the first active remote sensor apparatus.
- the method also includes processing, at the portable receiver, the first sensor data received from the first active remote sensor apparatus, if the ID code of the first active remote sensor apparatus matches the ID code of any of the one or more of the plurality of remote sensor apparatus.
- a method for a surveillance system including a plurality of remote sensor apparatus that each is configured for collecting data and a portable receiver for receiving data collected by one or more of the plurality of remote sensor apparatus.
- the portable receiver includes a keyboard for receiving user inputs and is capable of running an application program for processing the received data and displaying the processed data on a display screen of the portable receiver.
- the remote sensor apparatus has a housing for containing a processing unit, a plurality of sensors coupled to the processor and a wireless transceiver.
- the method includes receiving, at the portable receiver, identification (ID) codes of one or more of the plurality of remote sensor apparatus using the keyboard and registering the one or more of the plurality of remote sensor apparatus using the received ID codes.
- the method also includes receiving, at the portable receiver, first data from a first active remote sensor apparatus, wherein the received first data includes first sensor data collected from at least one of a first plurality of sensors installed in the first active remote sensor and ID code of the first active remote sensor apparatus.
- the method further includes processing, at the portable receiver, the first sensor data received from the first active remote sensor apparatus, if the ID code of the first active remote sensor apparatus matches the ID code of any of the one or more of the plurality of remote sensor apparatus.
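- The registration-and-matching logic described in the variants above can be sketched briefly. Below is a minimal illustration in C of a receiver keeping a table of registered ID codes and accepting data only from matching units; the table size, function names, and the process_sensor_payload() helper are assumptions for illustration, not elements of the patent.

```c
#include <stdbool.h>
#include <stddef.h>
#include <string.h>

#define MAX_REGISTERED 8
#define ID_LEN 16

/* Hypothetical downstream handler for accepted sensor payloads. */
void process_sensor_payload(const unsigned char *data, size_t len);

/* ID codes registered by the user (entered via keyboard or read via RFID). */
static char registered_ids[MAX_REGISTERED][ID_LEN];
static int num_registered = 0;

/* Register a sensor unit's ID code; returns false when the table is full. */
bool register_sensor(const char *id_code)
{
    if (num_registered >= MAX_REGISTERED)
        return false;
    strncpy(registered_ids[num_registered], id_code, ID_LEN - 1);
    registered_ids[num_registered][ID_LEN - 1] = '\0';
    num_registered++;
    return true;
}

/* Process incoming sensor data only if its ID matches a registered unit. */
bool accept_sensor_data(const char *id_code, const unsigned char *data, size_t len)
{
    for (int i = 0; i < num_registered; i++) {
        if (strcmp(registered_ids[i], id_code) == 0) {
            /* Matched: hand the payload to the processing/display pipeline. */
            process_sensor_payload(data, len);
            return true;
        }
    }
    return false;   /* unknown sensor unit: ignore the data */
}
```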
- a method for generating a panoramic view of a scene external to the housing of a remote sensor apparatus for a surveillance system including the remote sensor apparatus for collecting data and a portable receiver for receiving data collected by at least one sensor installed in the remote sensor apparatus.
- the portable receiver is capable of running an application program for processing the received data and displaying the processed data on a display screen of the portable receiver.
- the remote sensor apparatus has a spherical housing for containing a processing unit, a plurality of image sensors coupled to the processor and a wireless transceiver and wherein each of the plurality of image sensors is fitted with a wide-angle lens.
- the method includes receiving, at the portable receiver, first image data collected simultaneously from the plurality of image sensors installed in the remote sensor apparatus.
- FIG. 1 illustrates a diagram of a remote surveillance system including a remote sensor apparatus and a portable receiver in accordance with one embodiment of the disclosed subject matter.
- FIG. 2 illustrates a diagram of a remote sensor apparatus in accordance with one embodiment of the disclosed subject matter.
- FIG. 3 illustrates a diagram of deployment methods for deploying a remote sensor apparatus in accordance with one embodiment of the disclosed subject matter.
- FIG. 4 illustrates a diagram showing a high-level view of the internals of a remote sensor apparatus in accordance with one embodiment of the disclosed subject matter.
- FIG. 5 illustrates a diagram showing a board including an image sensor and a wide-angle lens in accordance with one embodiment of the disclosed subject matter.
- FIG. 6 illustrates a diagram showing a top view and a bottom view of a central printed circuit board included in a remote sensor apparatus in accordance with one embodiment of the disclosed subject matter.
- FIG. 7 illustrates a block diagram of a remote sensor apparatus in accordance with one embodiment of the disclosed subject matter.
- FIG. 8 illustrates a block diagram showing a processor simultaneously receiving image data from a plurality of image sensors using multiplexors in accordance with one embodiment of the disclosed subject matter.
- FIG. 9 illustrates a block diagram showing a high-level view of the firmware implementation of a remote sensor apparatus in accordance with one embodiment of the disclosed subject matter.
- FIG. 10 illustrates a block diagram showing communication modules included in a remote sensor apparatus and a receiver device in accordance with one embodiment of the disclosed subject matter.
- FIG. 11 illustrates a diagram showing a receiving device in accordance with one embodiment of the disclosed subject matter.
- FIG. 12 illustrates a diagram showing a distribution of fish-eye projections having a wide angle of 100° on a reference sphere in accordance with one embodiment of the disclosed subject matter.
- FIG. 13 illustrates a diagram showing an image coverage of a spherical field of view (FOV) with 140° of horizontal field of view (HFOV) and 89° of vertical field of view (VFOV) in accordance with one embodiment of the disclosed subject matter.
- FIG. 14 illustrates a block diagram for processing image data to generate a panoramic image in accordance with one embodiment of the disclosed subject matter.
- FIG. 15 illustrates a diagram of an internal view of a remote sensor apparatus showing a motor and a counterweight in accordance with one embodiment of the disclosed subject matter.
- FIG. 1 provides a high-level overview of a surveillance system 100.
- Sensor unit 101 is a multi-sensor platform incorporating a reinforced housing, multiple image sensors with wide-angle lenses, infrared/near-infrared light-emitting diodes (LEDs), batteries, a processor, and additional sensors and is described in more detail below.
- Sensor unit 101 transmits data gathered by its image sensors and other sensors over a wireless connection 102 to a receiver unit 103.
- the wireless connection is under the wireless fidelity (WiFi) 802.11b protocol.
- the wireless connection can be achieved via other WiFi protocols, Bluetooth, radio frequency (RF), or a range of other communications protocols - including military and non-standard spectra.
- Receiver unit 103 receives and processes data into a format usable to the user.
- the unit can stitch images to provide panoramic views, overlay these images with data from the other sensors on the device, and play streamed audio from sensor unit 101 's digital microphone over the receiver unit's speakers or headphones.
- the receiver unit 103 is an Android-based tablet or smartphone running a custom-developed application program.
- receiver unit 103 can be an iOS, Windows-based, or other smartphone or tablet. Such tablets may be hand-held or mounted, such as in some pouches that mount on the back of a partner's vest for the operator to view without having to hold the tablet.
- the receiver unit 103 can be a laptop computer.
- the receiver may be a heads-up or other display, such as those currently incorporated into military and first-responder units.
- the server-client architecture is flexible, meaning that the server can exist on the sensor unit 101 , on the receiver unit 103, or in a third station or device that serves as a router.
- the receiver unit 103 serves as the server and the sensor unit 101 serves as the client.
- the sensor unit 101 can function as the server and the receiver unit 103 as the client.
- Sensor unit 101 can be paired to one or more receiver unit(s) 103 via quick response (QR) code, near-field/radio-frequency identification (RFID) communication, manual code entry, or other pairing method.
- Receiver units 103 can be paired with one or more sensor units 101.
- the pairing provides the user with the significant advantage that if the user already owns an Android or other compatible smartphone or tablet device, the user does not need to purchase a receiver unit but can rather simply pair his/her phone/tablet (via the application described below) to the sensor unit 101. In addition, if sensor unit 101 is lost or damaged, receiver unit 103 can simply be paired to different sensor unit(s) 101.
- sensor unit 101 can easily be paired to different receiver units 103.
- This pairing ability also allows multiple users to share the information from one or more sensor units or one or more receiver units.
- the receiver unit 103 can act as a repeater for other receiver units 103, allowing users outside the transmission range of one or more sensor units 101 to see the same information as is being viewed on receiver 103.
- several sensor units 101 can use a wireless connection 104 to "daisy chain" to each other and thus extend their range by serving as repeaters. This connection also allows more extensive processing by using the relative position of units for mapping or three dimensional (3-D) imaging.
- the system 100 can be extended by gathering data from many sensor units 101. For example, in search & rescue after earthquakes a common problem is the lack of reliable maps given building collapses, often resulting in multiple searches of the same site. By aggregating location information from multiple sensor units 101, a map overlay can be generated to avoid such duplication. Similar applications incorporating multiple sensor units 101 can assist in security and fire applications, among others.
- FIG. 2 illustrates a remote sensor unit, such as sensor unit 101.
- the CMOS sensors behind wide-angle lenses 201 take short-exposure (e.g., 1/10,000th or 1/100,000th of a second) images of the scene observed through the lenses in order to compensate for the motion blur that might otherwise result from an image sensor unit being thrown or otherwise propelled into a space.
- infrared/near-infrared LEDs 202 could be triggered briefly prior to and during the exposure. This infrared/near-infrared light is visible to the CMOS sensors but is not within the range of human vision (allowing for both some degree of stealth and minimizing disturbance to bystanders).
- monochrome sensors are applied because the monochrome sensors are significantly more light-sensitive than color sensors. In some embodiments, however, color CMOS sensors or sensors in other areas of the spectrum might be applied.
- the lenses 201 are reinforced to resist heat and damage from exposure to chemicals or radiation.
- the aperture 204 for a digital microphone and speaker serves two functions.
- First, the aperture allows the digital microphone to be close to the surface of the sensor unit's housing and thus provide better audio to the operator listening to this audio stream via the receiver unit 103.
- Second, the aperture allows the sensor unit to project audio via a small speaker or buzzer - a function that allows for locating the sensor unit once deployed and, importantly, for creating a loud sound which can be a useful diversion when employed by police or in similar settings.
- the speaker can convey audio from the receiver unit to assist in communication between the person at the receiver unit and persons near the sensor unit (e.g. in hostage negotiations).
- high-intensity LEDs in the unit can be triggered along with the speaker to create a more substantial diversion.
- the aperture 205 allows additional sensors to be exposed to the outside environment to gather the additional readings that are overlaid on the information provided on the display of the receiver unit 103.
- This aperture is compatible with a wide array of sensors, many of which can communicate with the central processor via the simple inter-integrated circuit (I2C) format.
- the sensors detect carbon monoxide, temperature, and hydrogen cyanide gas. These gases in particular have been found to pose a hazard to firefighters in the aftermath of a blaze.
- the system 100 is compatible with a wide range of sensors and can be easily adapted to support the sensors in the Table 1 below and many others using I2C and similar standard formats, protocols, or analog outputs.
- different sensor combinations including oxygen (O2) and other gases or Chem-Bio-Radio-Nuclear sensors can be used, depending on configuration.
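- As an illustration of how such an I2C-attached sensor might be read, the sketch below performs a simple register-pointer write followed by a two-byte read. The device address, register number, and HAL functions are invented for illustration and do not correspond to any specific gas sensor part named in the text.

```c
#include <stdint.h>

/* Hypothetical I2C HAL calls; the address and register below are invented
 * for illustration and do not match a specific gas sensor part. */
int i2c_write(uint8_t addr, const uint8_t *buf, int len);
int i2c_read(uint8_t addr, uint8_t *buf, int len);

#define CO_SENSOR_ADDR   0x48   /* assumed 7-bit I2C address */
#define CO_REG_RESULT    0x00   /* assumed result register   */

/* Read one raw 16-bit carbon-monoxide measurement over I2C. */
int read_co_raw(uint16_t *value)
{
    uint8_t reg = CO_REG_RESULT;
    uint8_t buf[2];

    if (i2c_write(CO_SENSOR_ADDR, &reg, 1) != 0)   /* select register  */
        return -1;
    if (i2c_read(CO_SENSOR_ADDR, buf, 2) != 0)     /* read measurement */
        return -1;

    *value = (uint16_t)((buf[0] << 8) | buf[1]);
    return 0;
}
```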
- the rubber or elastomer shell over the hard/reinforced inner shell serves two purposes. First, it absorbs much of the force of an impact as the unit enters a space and hits a wall, floor, ceiling, or other object - protecting the image sensors and internal components of the sensor unit. Second, it provides a degree of "bounce" to the sensor unit which allows it greater travel within a space.
- this outer shell is achieved by means of an elastomer or rubber overmold simultaneously poured with an injection mold of a hard plastic inner shell.
- the outer rubber or elastomer shell can be molded separately and attached to the hard internal metal, composite, or plastic shell by means of an adhesive, screw, subsequent compression or injection molding, or snap-fit mechanism.
- the outer shell is reinforced via the choice of elastomer, rubber, or other material to sustain the harsh temperatures and chemical and radiological environments presented by firefighting and industrial inspection applications.
- rubber/elastomer "bumpers" on the surface of the outer shell allow for greater impact resistance without blocking the field of view of the image sensors.
- the sensor unit is deployed by an operator who throws or rolls the unit into the space
- FIG. 3 illustrates some examples of other methods of deployment.
- Pole 301 can be attached to a hole in the housing of sensor unit 101 to allow the unit to be inserted slowly into a space.
- Tether 302 can be used to retrieve the sensor unit 101 from a space when it is difficult to retrieve manually, such as when searching for a victim inside a well or when inspecting a pipe.
- this tether 302 can supply power and act as a communications link for the unit, especially when continuous surveillance is required or adverse conditions limit wireless communications range.
- Optional unit 303 is similar to a tennis-ball thrower and can be used to extend the range of the sensor unit 101 beyond where a human operator can throw. Other embodiments can otherwise be propelled via air-cannon or other propulsion system.
- the sensor unit is partially self-propelled and extends its travel and propels itself via two methods - one or many internal motors whose torque causes the sensor unit to move, or a series of counterweights which are shifted to roll the sensor unit. In some embodiments, these movements are random and achieve greater coverage of the room in an unguided way or in a probabilistic fashion. In other embodiments, the propulsion is guided via the receiver unit 103 and more precise control of the motors and/or counterweights. Different applications require different levels of guidance (e.g.
- PCB 403 holds many of the components of the system, most importantly the embedded processor and/or digital signal processor 402.
- In one embodiment, the processor 402 is a Blackfin BF548 by Analog Devices. In some embodiments, other processors are used.
- Printed circuit board 403 also holds connectors (in one embodiment, insulation displacement connector (IDC) ribbon cable connectors) for the image sensors and connection points for the other sensors, microphone, and other components.
- the printed circuit board also houses the wireless module, shown in figures that follow.
- the central PCB 403 is mechanically supported at six points once the sensor unit shell is closed. This provides significant support to the board while allowing it some freedom of movement and flexion to better survive impacts when the sensor unit is thrown. In addition, a rubber insert at the support points further cushions the central printed circuit board 403 and its components from shocks.
- the sensor has one or more batteries 404 to power the central processor, wireless module, image sensors, LEDs, and other sensors and components. In one embodiment, two batteries 404 are housed symmetrically in the two hemispheres. This arrangement both balances the sensor unit, allowing for more predictable travel through the air, and is mechanically optimal from an impact/resilience perspective.
- In some embodiments, the batteries run through the center of a "donut-shaped" central PCB, again for balance and mechanical reasons.
- the image sensor boards 405 house the imaging sensors (in one embodiment, a complementary metal-oxide-semiconductor (CMOS) sensor, in some embodiments, a charge-coupled device (CCD) or other imaging sensor) and attach to each hemisphere 401.
- the position and orientation of the image sensor boards 405 is optimized to maximize the overlap in field of view across all the sensors to ensure global coverage of the space being imaged. This is important because standard CMOS sensors are rectangular (e.g. WVGA is 752 x 480 pixels) and thus their vertical field of view is narrower than their horizontal field of view with a standard lens. This is further complicated by very wide-angle lenses.
- the orientation of the image sensor boards 405 is important to ensure full coverage and sufficient overlap for image stitching (described below).
- the six image sensor boards 405 are equally spaced across the surface of the sensor unit and are rotated approximately 90-degrees from an adjacent image sensor board 405. In some embodiments, different combinations of spacing and rotation are used, but always with the objective of ensuring sufficient overlap across fields of view to ensure global coverage and enough overlap for image stitching.
- FIG. 5 provides a view of the image sensor board.
- the image sensor board houses the imaging sensor 501.
- the imaging sensor 501 is an Aptina V022MT9-series monochrome CMOS sensor. This sensor has very good low-light performance and dynamic range with low noise and can detect the wavelength of the light that the infrared (IR) or near-IR LEDs emit - all important for the short-exposure, dark-environment images the sensor unit is capturing.
- CMOS or CCD sensors are used, including sensors that are not limited to monochrome (e.g. color sensors) and sensors in other ranges of the light spectrum, such as infrared and ultraviolet.
- One or more LEDs 502 provide illumination to both light dark environments and to compensate for the light loss associated with short exposures.
- these LEDs 502 are near-infrared, high-intensity LEDs with their light brightest at around 850 nanometers (nm). This light is visible to the imaging sensors 501 but not to the human eye.
- the LEDs emit in the visible light spectrum (particularly for color applications or when the LEDs serve a diversionary purpose). In some embodiments, LEDs emit at other wavelengths appropriate to the imaging sensor being employed.
- Lens holder 503 on the imaging board holds the lens in place and at the proper focus above the CMOS sensor.
- the lens holder is incorporated into the sphere casing 401. This both allows the parts to be injection molded in plastic and rubber and protects the lenses from impacts.
- the selection of lens 505 allows the sensor unit 101 to maximize the use of its imaging sensor 501.
- the fisheye lens 505 allows for an effective image footprint that covers nearly entirely or entirely the CMOS sensor as shown in 506. This is not true for many lenses, which "cut off" valuable pixels by covering only part of the image sensor.
- Ribbon cable connector 504 connects the imaging board in FIG. 5 with the central PCB 403.
- the imaging board in FIG. 5 is connected to PCB 403 via a flexible printed circuit board layer, effectively making the central PCB and imaging boards a single printed circuit board.
- other connectors are used depending on requirements for data transfer rate and mechanical performance.
- FIG. 6 shows the top and bottom of the central printed circuit board.
- This board houses the microprocessor (MCU) and/or digital signal processor (DSP) 601.
- the processor is an Analog Devices Blackfin 548BF DSP.
- This processor handles the multiple streams of image and sensor data being captured by the sensor unit's imaging and other sensors at a reasonable component cost and power drain.
- other microprocessors and/or digital signal processors are used, including units with multiple cores.
- the multi-core processing unit allows Linux or other operating system (OS) to run on the processors, easing the implementation of networking protocols discussed below.
- Ribbon cable connector 602 connects the cables running to the central PCB from the imaging boards described above in FIG. 5. In one embodiment shown, three of these connectors lie on each side of the central PCB. In some embodiments, other types of connectors are used. In some embodiments, the central PCB connects to the imaging boards via flexible layers of the printed circuit board, forming effectively one single board.
- USB connector 603 allows the central printed circuit board to connect to an external computer and external power sources.
- the primary purposes of this USB connection, which in one embodiment uses a micro-USB connector tip, are to load and update the firmware for the sensor unit and to allow for testing, debugging, and if necessary calibration of the unit.
- Wireless module 604 transmits image and other sensor data processed by the microprocessor 601 to the receiver unit 103.
- the wireless module is a WiFly GSX 802.11b/g module with file transfer protocol (FTP) and hypertext transfer protocol (HTTP) client services.
- different wireless modules are used, such as the Texas Instrument's CC3300 module.
- other types of wireless modules incorporating Bluetooth transmitters or transmitters in other ranges of the spectrum (such as those for dedicated military or security communications channels) are employed.
- the type of wireless transmitter in each embodiment is tied to end-user needs (for example, the US military operates in restricted frequency ranges and under proprietary protocols).
- Sensor block 605 illustrates a connection point for the non-imaging sensors in the unit.
- the sensor block 605 connects to a digital temperature sensor, a carbon monoxide sensor, and a hydrogen cyanide sensor.
- this sensor block can connect to any suitable sensors, including those listed in Table 1.
- a cable connects the sensors on the surface of sensor unit 101 , but some sensors (e.g. the Geiger counter) do not need to be surface-mounted.
- Digital microphone port 606 connects the digital microphone mounted on the surface of sensor unit 101 to the central PCB.
- this microphone is a mono micro-electro-mechanical system (MEMS) microphone with digital output.
- the microphone may be stereo or may connect to several microphones on the surface of the sensor unit 101. In some embodiments, the microphone is not surface-mounted.
- An inertial measurement unit (IMU) 607 on the central printed circuit board provides information about the orientation and direction in which the sensor unit 101 was thrown. This information is useful for providing an image with reference points for the user, such as which direction is up and in which direction the sensor unit 101 was thrown. In the absence of such information, the images displayed on the receiver unit 103 might be disorienting.
- the IMU is an Invensense MPU-6000, which is a 6-axis gyroscope-accelerometer module.
- 9-axis IMUs are used to compensate for IMU "drift" problems. In some embodiments for more extreme motion, multiple IMUs are used. In some embodiments, no IMU is used, relying primarily on software to compensate for orientation as needed.
- a plurality of photo (light) sensors 608 connect to the central PCB. These simple, surface-mounted sensors provide information about ambient lighting that allows the sensor unit 101 to modify shutter exposures and LED flash intensity. In some embodiments, these photo sensors are not included and the sensor unit uses the CMOS sensors milliseconds before capturing an image to calibrate lighting and exposure duration.
- Power supply connection 609 indicates where the central PCB connects to a power supply board or external power supply. In some embodiments, there is a separate power supply PCB. In some embodiments, the power supply components are mounted on the central PCB. These power supply components connect either to internal batteries (e.g., lithium-ion (LiON) batteries) or to an external power supply. In some embodiments, power can be supplied to this board via the tether 302 shown in FIG. 3.
- memory 610 includes additional memory (e.g., SDRAM and flash memory). This memory allows for some buffering by the microprocessor 601 as needed. In some embodiments, no external memory is needed as the processor can use onboard memory.
- FIG. 7 provides a high-level view of the hardware design and operation.
- Microprocessor and/or digital signal processor 701 (in some embodiments, a Blackfin 548BF, in some embodiments, different microprocessors/DSPs as discussed above) triggers imaging sensor 702 (in some embodiments, an Aptina V022MP9 series monochrome CMOS sensor, in some embodiments, different imaging sensors as discussed above), which is mounted on image sensor board 703, to capture an image.
- Imaging sensor 702 takes a quick calibration read to determine light conditions in the space being imaged, and based on these conditions determines the appropriate exposure and whether (and how strongly) to trigger LEDs 705. In some embodiments, the calibration is carried out using a photosensor 608. In some embodiments, high-intensity near-infrared LEDs 705 with max output at a wavelength of 850 nm are used; in other embodiments different LEDs are used (as discussed above) appropriate to the application. LEDs 705 rest on an LED board 706 controlled in some embodiments by the CMOS sensor 702 and in some embodiments by the microprocessor 701.
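- A minimal C sketch of that calibrate-then-capture decision is shown below. The ambient-light scale, thresholds, exposure values, and helper functions are invented for illustration and do not reflect the actual register interface of the imaging sensor or LED driver.

```c
#include <stdint.h>

/* Hypothetical helpers; real firmware would program the image sensor's and
 * LED driver's registers directly. Thresholds and values are illustrative. */
uint16_t read_ambient_light(void);        /* 0 (dark) .. 1023 (bright) */
void set_exposure_us(uint32_t microseconds);
void set_led_current(uint8_t percent);    /* 0 = off, 100 = full power */
void trigger_capture(void);

/* Quick calibration read, then pick exposure and LED intensity and capture. */
void capture_frame(void)
{
    uint16_t ambient = read_ambient_light();

    if (ambient > 700) {            /* bright scene: shortest exposure, no flash */
        set_exposure_us(10);        /* roughly 1/100,000 s */
        set_led_current(0);
    } else if (ambient > 200) {     /* dim scene: short exposure, partial flash */
        set_exposure_us(100);       /* roughly 1/10,000 s */
        set_led_current(50);
    } else {                        /* dark scene: short exposure, full IR flash */
        set_exposure_us(100);
        set_led_current(100);
    }
    trigger_capture();
}
```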
- IMU 707 provides the microcontroller 701 with information about the orientation and acceleration of the sensor unit 101 as it is moving through its path of travel in the air and on the ground.
- the microcontroller 701 associates this information with images and transmits it to the receiver unit.
- This data allows the receiver unit 103 to provide information to the end user that allows that user to understand in which direction the sensor unit was thrown and what orientation the unit had when it took an image. The data can also help determine how to display the images and position information on the receiver unit screen.
- no IMU is used, relying on software correction methods.
- Sensor interface 708 connects additional analog and digital sensors to the microprocessor 701. In the diagram shown (example of one embodiment), an I2C interface connects a carbon monoxide/temperature sensor 709a and a hydrogen-cyanide sensor 709b to the microprocessor. In some embodiments, a wide range of sensors can be employed, examples of which are listed in Table 1 above.
- Digital microphone 710 captures audio from the environment and transmits this information back to microprocessor 701, which in turn may make it available to receiver unit 103.
- In some embodiments, a speaker or buzzer can also be connected to the microprocessor 701, as discussed above.
- In some embodiments, stereo microphones or other sound-gathering devices (e.g., hydrophones), both analog and digital, are employed.
- the microprocessor may employ memory 711, flash memory 712, or other forms of storage to buffer or store data or files. In some embodiments, all buffering and storage may be conducted onboard the microprocessor 701.
- Microprocessor 701 accepts and processes information from the imaging sensors 702 and/or the additional sensors 709 and/or the microphone 710 and/or IMU 707. It then transmits data or files including the processed information to onboard flash memory 712 or other memory.
- the microprocessor 701 transmits the data or files directly to the receiver unit 103 via a wireless module 713. In some embodiments, as discussed above, this is a WiFly module transmitting under 802.11b. In some embodiments, a different wireless module is used.
- Wireless module 713 transfers data and communications back and forth between receiver unit 103 and sensor unit 101 over a wireless link with the aid of antenna 714. In some embodiments, the wireless module 713 may broadcast data without a link being established, as in cases when links are difficult to establish.
- Receiver unit 715 receives data from the sensor unit 101 and then processes and displays this information to a user or users.
- this receiver unit is an Android-based tablet running an Android application program.
- this may be another smart device such as an iPad, iPhone, Blackberry phone or tablet, Windows-based phone or tablet, etc., as discussed above.
- this may be a computer.
- this may be a second sensor unit 103 acting as a repeater for this unit, or forming a mesh network of units.
- Power supply 716 provides the electrical energy for the hardware design.
- the power supply may draw current from battery 717.
- battery 717 is a prismatic lithium-ion battery. In some embodiments, it may be one or many alkaline batteries. In some embodiments, battery 717 may take another form of high-performance battery.
- power supply 716 will connect directly to an external power supply 718. In some embodiments, tether 302 may provide the connection to such an external power supply.
- external power supply/adapter 718 is an A/C or USB adapter that helps power the unit 101 and/or charges the battery 717.
- FIG. 8 describes the process via which multiplexing is used to allow the microprocessor 701 to accept data from a plurality of image sensors.
- a Blackfin BF548 microprocessor 802 accepts data from six imaging sensors 803 over only two parallel peripheral interfaces (PPI) 806.
- Each of the six image sensors 803 is driven by the same clock source 801, which ensures that image data from them is synchronized.
- Each image sensor 803 uses a 10-bit data bus to transfer images.
- The six image sensors 803 are separated into two groups of three image sensors 803 each - groups 807 and 808.
- The eight most significant bits from the three image sensors 803 in each group are placed sequentially, forming 24-bit signals 809 and 810. The two least significant bits from the three image sensors 803 in each group are placed sequentially, forming 6-bit signals 811 and 812. The two 24-bit signals 809 and 810 are multiplexed by multiplexor 805A into a single 24-bit signal 813. The two 6-bit signals 811 and 812 are multiplexed by multiplexor 805B into a single 6-bit signal 814.
- The 24-bit signal 813 is sent to the PPI0 port of the BF548 802, and the 6-bit signal 814 is sent to the PPI1 port of the BF548 802.
- Multiplexor 805 passes data from group 807 during the high level of clock signal 815 and from group 808 during the low level of clock signal 815, doubling the data rate of the image data. In order to correctly receive this data, both PPI ports 806 must use clock 816, which is double the clock used by the image sensors.
- clock source 801 allows phase control between clocks 815 and 816.
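- The bit-packing described above can be illustrated from the firmware's point of view: each word arriving on the two PPI ports carries one 8-bit MSB slice and one 2-bit LSB slice per sensor of the active group, which are reassembled into 10-bit pixels. The sketch below assumes sensor 0 occupies the low-order bits; the actual ordering depends on the board wiring and is not specified in the text.

```c
#include <stdint.h>

/* Reassemble one pixel per sensor from one pair of multiplexed PPI samples.
 * ppi0 carries the 8 MSBs of three sensors packed side by side (24 bits);
 * ppi1 carries the 2 LSBs of the same three sensors (6 bits).
 * The bit ordering (sensor 0 in the low bits) is an assumption for
 * illustration only. */
void unpack_group(uint32_t ppi0, uint32_t ppi1, uint16_t pixel[3])
{
    for (int s = 0; s < 3; s++) {
        uint16_t msb8 = (ppi0 >> (8 * s)) & 0xFF;   /* 8 MSBs of sensor s */
        uint16_t lsb2 = (ppi1 >> (2 * s)) & 0x03;   /* 2 LSBs of sensor s */
        pixel[s] = (uint16_t)((msb8 << 2) | lsb2);  /* full 10-bit sample */
    }
}
```

Because multiplexor 805 alternates between groups 807 and 808 on the high and low halves of clock 815, consecutive samples on each PPI port belong to alternating groups, so the same unpacking is applied to each half of the doubled-rate stream.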
- this combination of multiple image data streams is achieved via the use of a Field-Programmable Gate Array (FPGA).
- small microprocessors associated with each of the image sensors can buffer data and thus address the multiple-data-input problem solved through multiplexing above.
- FIG. 9 offers a high-level view of the firmware implementation on the sensor unit 101.
- the on-board processor 901 runs a full operating system, such as Linux or Real Time OS.
- the firmware 902 for microprocessor 901 may be written in C, a widely used programming language. In some embodiments, different programming languages might be utilized - interpreted scripting and/or assembly.
- the firmware begins its execution upon reset and runs a one-time initialization of the hardware first, as illustrated in 903. From there, the main execution loop is entered and run repeatedly, as indicated in 904.
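- The one-time initialization 903 followed by the repeated main loop 904 can be pictured as the following C skeleton; the routine names are placeholders, since the actual firmware functions are not given in the text.

```c
/* Placeholder declarations; the real routines are not listed in the text. */
void init_clocks(void);
void init_ppi_and_sensors(void);
void init_i2c_sensors(void);
void init_wireless(void);
int  capture_requested(void);
void acquire_images(void);
void poll_system_sensors(void);
void service_wireless(void);
void manage_power(void);

/* Skeleton of the firmware flow: one-time hardware init (903), then a main
 * loop (904) that services the sensors and the wireless link. */
void firmware_main(void)
{
    init_clocks();             /* 903: one-time hardware initialization */
    init_ppi_and_sensors();    /* image sensors on the PPI bus          */
    init_i2c_sensors();        /* gas/temperature sensors, IMU          */
    init_wireless();           /* SPI-attached WiFi module              */

    for (;;) {                 /* 904: main execution loop */
        if (capture_requested())
            acquire_images();  /* pull frames into internal memory      */
        poll_system_sensors(); /* temperature, gas, IMU, audio          */
        service_wireless();    /* push data/files to the receiver unit  */
        manage_power();        /* timers, sleep states                  */
    }
}
```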
- Peripherals and services may be specific to on-board processor 901 and may vary in other embodiments.
- Peripherals for the 901 processor include PPI bus 907 for the imaging sensors, I2C bus 908 for additional non-imaging sensor control and data acquisition, serial peripheral interface (SPI) bus 909 for wireless connectivity, I2S bus 910 for audio, and universal asynchronous receiver/transmitter (UART) channel 911 for auxiliary communication functionality.
- Services include timers 912, power management facilities 913 and general purpose I/O 914 for various system needs.
- the firmware 902 controls and utilizes external devices attached to processor 901 by mechanical and electrical means.
- The set of image sensors 915 is controlled and utilized via PPI bus 907 and I2C bus 910. Audio functionality 920 is controlled and utilized via I2S bus 910.
- Wireless connectivity module 917 is controlled and utilized via SPI bus 909.
- The set of system sensors 916 (temperature, toxic gases, buzzer, IMU, etc.) is controlled and utilized via I2C bus 918.
- UART channel 911 and its multiple instances can serve many auxiliary control and utilization needs, e.g., a test bench command line terminal 919 or alternative access to wireless connectivity module 917.
- Most system devices external to the processor 901 are also controlled and utilized via GPIO 914 pins.
- Utilization and control of image sensor functionality in firmware allows proper acquisition of images into the processor 901's internal memory. Similarly, other data is collected from all system sensors.
- the firmware uses wireless connectivity functionality embedded in the module 917, which provides the 802.11 WiFi protocol along with higher-level communication stacks, namely TCP/IP, Berkeley software distribution (BSD) sockets, FTP and HTTP.
- FIG. 10 illustrates one of several possible architectures for communication between the sensor unit 1001 and the receiver unit 1002.
- the sensor unit acts as a web service client to the receiver unit, and the sensor's wireless module 1003 facilitates such behavior by providing embedded plain TCP/IP, BSD sockets, FTP and HTTP protocols and stacks.
- Microprocessor 701 (901) communicates with wireless module 1003 (917) over a UART or SPI connection.
- sensor unit 1001 may implement and act as a server to the receiver unit client with support from the wireless module. Data transmission might also occur in ad hoc fashion without a clear server-client arrangement established.
- wireless module 1003 links as a client to a server on receiver unit 1002 via an 802.11b wireless link 1004.
- the server on the receiver unit 1002 (e.g., an Android tablet) operates at the operating system level (e.g., Android Linux).
- the server or client on the receiver unit can be implemented at the application level (e.g., at the Java level in an application program).
- the application program 1005 both configures the server properties of the receiver unit and processes data from the sensor unit 1001.
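- As a rough illustration of the client role described above, the C sketch below sends one captured JPEG to the receiver with a plain BSD-socket HTTP POST. The IP address, port, and URL path are invented for illustration, and the real firmware would drive the WiFly module's embedded TCP/IP and HTTP stacks rather than a host sockets API.

```c
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <arpa/inet.h>
#include <sys/socket.h>

/* Illustrative only: send one image buffer to the receiver with an HTTP POST.
 * The address, port, and URL path are assumptions, not taken from the patent. */
int post_image(const unsigned char *jpeg, size_t len)
{
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0)
        return -1;

    struct sockaddr_in dst = {0};
    dst.sin_family = AF_INET;
    dst.sin_port = htons(8080);                        /* assumed server port */
    inet_pton(AF_INET, "192.168.1.10", &dst.sin_addr); /* assumed receiver IP */

    if (connect(fd, (struct sockaddr *)&dst, sizeof(dst)) < 0) {
        close(fd);
        return -1;
    }

    char header[256];
    int n = snprintf(header, sizeof(header),
                     "POST /upload HTTP/1.1\r\n"
                     "Host: 192.168.1.10\r\n"
                     "Content-Type: image/jpeg\r\n"
                     "Content-Length: %zu\r\n\r\n", len);
    send(fd, header, n, 0);      /* request line and headers */
    send(fd, jpeg, len, 0);      /* image payload            */

    close(fd);
    return 0;
}
```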
- FIG. 11 shows a simplified, high-level example diagram of the design of the display application on receiver unit 1101.
- This application displays for the user a series of images 1102 of the space into which the sensor unit 101 is thrown. These images 1102 can cycle automatically or be advanced manually, and display the perspective of the sensor unit 101 at different intervals over the course of its travel. The application on the receiver unit that produces these images is described below.
- Images 1102 are oriented based on IMU information from the sensor unit 101 in such a way as to make the images intelligible to the user (e.g. right-side up and pointing in the direction that the sensor unit 101 was thrown). This is important for the user, as it provides visual reference points important for making decisions about entering a space (e.g. "Is that object to the right or left relative to where the ball was thrown?").
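- One simple way to obtain that right-side-up orientation is to estimate a roll angle from the accelerometer's gravity vector at capture time and rotate the displayed image by the opposite angle. The sketch below is in C for consistency with the firmware examples (the display application itself is described as Java-based); the axis alignment and sign convention are assumptions for illustration rather than the patent's actual method.

```c
#include <math.h>

/* Estimate the camera's roll about its optical axis (in degrees) from the
 * accelerometer's gravity components measured in the image plane. ax is the
 * gravity component along the image's x axis, ay along its y axis; the axis
 * alignment and sign convention are assumptions for illustration. */
double image_roll_deg(double ax, double ay)
{
    return atan2(ax, ay) * (180.0 / 3.14159265358979323846);
}

/* The display application would then rotate the captured image by
 * -image_roll_deg(ax, ay) so that "up" on screen matches "up" in the scene. */
```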
- the application has a "deep inspection mode", which allows the user to get more information about the scene displayed.
- the user can get a "sensor unit's 101 perspective" as if he/she were standing in the place the image is taken and could look left/right/up/down.
- the user can use gestures on the screen or interface to, for example, swipe fingers right to look right or swipe fingers up to look up. Because first responders often use gloves and cannot use finger-activated swipes, the application also has an option to use a receiver unit's built-in gyroscopes and accelerometers to navigate the image.
- the application defaults to a single image with data overlay to quickly provide crucial information - only providing the additional functionality if the user has time and mental bandwidth to decide to access it.
- Sensor data overlay 1103 provides an example of how additional sensor data is displayed in some embodiments.
- data 1103 about temperature and gas levels is provided at the bottom of the screen.
- data is overlaid directly over the image where it is relevant.
- Headphone jack 1104 on the receiver unit 1101 allows the user or users to listen to audio data being transmitted from the sensor unit 101.
- the application which displays information on receiver unit 1101 can take several forms. In one embodiment it is a Java-based Android application program running on an Android tablet or smartphone (as shown in FIG. 11). In some embodiments, it may be an application program on another operating system, such as iOS, Windows, or Blackberry. In some embodiments, it may be a custom application for a different receiver unit. In each case, the application program's three main functions are: a) configuring the communications protocols with one or more sensor units 101, b) processing image and sensor information received from the sensor unit 101, and c) displaying that information in a way that is useful to the end user. In some embodiments, the application has further functions, including triggering/deciding when an image or data point is taken, activating beepers, sirens, or diversionary devices, and controlling the motion of sensor units 101 when these are self-propelled.
- FIG. 12, FIG. 13, and FIG. 14 illustrate how the application program running on receiver unit 1101 processes and displays the images received from sensor unit 101.
- the creation of a panoramic image with the image data from the sensor unit 101 may, in one embodiment, assume the configuration shown in FIG. 12 of spherically projected images. For clarity, a wide angle of 100° is shown in this example for the horizontal field of view (HFOV) and a 63° vertical field of view (VFOV), which are lower than the real FOV achieved with wide-angle or fish-eye lenses. It is shown that the image orientations always rotate 90° between neighbors to increase the coverage of the spherical field of view.
- the aspect ratio shown is the same one as in the image sensor chosen in one embodiment (e.g., 480/752).
- FIG. 13 shows another sphere coverage example with an HFOV of 140° and a VFOV of 89°.
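- As a rough plausibility check on this coverage (an illustration, not a calculation from the patent), each image can be treated as an ideal rectangular field of view with full angles $2a \times 2b$, whose solid angle is $\Omega = 4\arcsin(\sin a \,\sin b)$. With the HFOV of 140° and VFOV of 89° shown here:

```latex
\Omega_{\text{sensor}} = 4\arcsin\bigl(\sin 70^\circ \,\sin 44.5^\circ\bigr)
                       \approx 4\arcsin(0.9397 \times 0.7009) \approx 2.88\ \text{sr},
\qquad
6\,\Omega_{\text{sensor}} \approx 17.3\ \text{sr} > 4\pi \approx 12.57\ \text{sr}.
```

So the six sensors capture roughly a third more solid angle than the full sphere, which is the margin available for the overlap needed by the stitching step; this ignores the fisheye projection's actual footprint and is only an approximation.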
- the spherical projection of each image is computed from the sensor image, and due to the displacement of each image sensor in the physical sphere, the center of the spherical projection is also displaced with respect to the center of the reference sphere, on which the panoramic image is created.
- the panorama creation follows the processing pipeline depicted in FIG. 14. Once the input images 1411 are received, the panorama creation process is separated into two main steps: Registration 1401 and Compositing 1402.
- Registration 1401 begins with initial image distortion correction 1403. It then proceeds to feature detection 1404, which among other things allows for control point matching across neighboring images. Feature match 1405 follows based on feature detection 1404. Next in the process is estimation of image sensor parameters 1406.
- the exposure of the image is estimated 1408 and compensated for 1409.
- the images are then blended 1410 into a single image, which forms the final panorama 1411 displayed to the user on the receiver unit.
- the image processing must mathematically correct for that spatial displacement (otherwise the perspective from each of the image sensors would seem to have been shifted relative to a neighboring image sensor's images). This is accomplished by mathematically assigning the position of the image sensor relative to that virtual central reference point.
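- The displacement handling can be pictured concretely: a pixel is first converted into a ray in its own sensor's frame, the ray is offset by that sensor's position inside the ball, and the resulting point is re-normalized onto the common reference sphere before being written into the panorama. The C sketch below assumes an ideal equidistant fisheye model and a fixed assumed scene distance, and omits both lens distortion (estimated in steps 1403-1406) and the per-sensor rotation into the ball's frame; it is an illustration, not the pipeline's actual implementation.

```c
#include <math.h>

typedef struct { double x, y, z; } vec3;

/* Map a fisheye pixel (u, v), given in pixels from the image center, to a
 * point on the unit reference sphere. 'f' is the focal length in pixels for
 * an equidistant fisheye model (theta = r / f), 'cam_pos' is the sensor's
 * position relative to the ball's center, and 'scene_dist' the assumed
 * distance to the scene; all of these are simplifying assumptions. */
vec3 pixel_to_reference_sphere(double u, double v, double f,
                               vec3 cam_pos, double scene_dist)
{
    double r = sqrt(u * u + v * v);
    double theta = r / f;                 /* angle from the optical axis   */
    double phi = atan2(v, u);             /* angle around the optical axis */

    /* Ray direction in the sensor's own frame (optical axis = +z). */
    vec3 dir = { sin(theta) * cos(phi), sin(theta) * sin(phi), cos(theta) };

    /* Point in space at the assumed scene distance, offset by the sensor's
     * physical displacement from the ball's center. */
    vec3 p = { cam_pos.x + scene_dist * dir.x,
               cam_pos.y + scene_dist * dir.y,
               cam_pos.z + scene_dist * dir.z };

    /* Re-project onto the unit reference sphere centered on the ball. */
    double n = sqrt(p.x * p.x + p.y * p.y + p.z * p.z);
    vec3 s = { p.x / n, p.y / n, p.z / n };
    return s;
}
```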
- multiple images taken at different points in the travel of the sensor unit 101 can allow stereoscopic processing of images, allowing for the creation of three-dimensional representations of a space.
- images from multiple sensor units 101 thrown into a space can similarly provide stereoscopic perspective, again allowing for three-dimensional representations of the space.
- the use of several sensor units 101 can allow for effective "mapping" of a space using the
- the sensor unit 101 can be deployed as part of a broader system, such as when employed with other sensor units 101 in a mesh network, when deployed along with robots or other remote sensing equipment, or when integrated into a broader communications system employed by first responders or the military (a nation-wide first responder network for such coordination is currently being deployed for this purpose).
- FIG. 15 illustrates motor 1501 and counter-weight 1502. When activated, the motor 1501 turns, changing the position of counter-weight 1502. This shift in the position of counter-weight 1502 changes the center of gravity of the sensor unit 101, one hemisphere 1503 of which is shown in the diagram. This change in the center of gravity causes the unit to roll or "hop," either randomly in some embodiments or in a more directed fashion (when controlled using data inputs from the IMU 607) in other embodiments.
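- As a purely hypothetical control sketch (the patent describes the counter-weight mechanism but not a specific control law), an IMU-derived heading could be used to drive the motor so that the unit rolls toward a desired direction. The gain, travel limit, and function names below are assumptions.

```python
import math

def wrap_angle(a: float) -> float:
    """Wrap an angle to the range (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def counterweight_command(current_heading: float,
                          target_heading: float,
                          gain: float = 0.8,
                          max_angle: float = math.pi / 3) -> float:
    """Proportional command for the counter-weight motor angle (radians).

    current_heading: heading estimated from the IMU (e.g., IMU 607).
    target_heading:  heading the operator wants the unit to roll toward.
    The command is clamped to the mechanism's assumed travel limit.
    """
    error = wrap_angle(target_heading - current_heading)
    command = gain * error
    return max(-max_angle, min(max_angle, command))

# Example: unit facing 10 degrees, operator steering it toward 80 degrees.
print(counterweight_command(math.radians(10), math.radians(80)))
```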
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Closed-Circuit Television Systems (AREA)
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201261717130P | 2012-10-23 | 2012-10-23 | |
US13/801,649 US9479697B2 (en) | 2012-10-23 | 2013-03-13 | Systems, methods and media for generating a panoramic view |
US13/801,627 US8957783B2 (en) | 2012-10-23 | 2013-03-13 | Remote surveillance system |
US13/801,558 US9426430B2 (en) | 2012-03-22 | 2013-03-13 | Remote surveillance sensor apparatus |
PCT/US2013/066205 WO2014066405A1 (en) | 2012-10-23 | 2013-10-22 | Remote surveillance sensor apparatus |
Publications (2)
Publication Number | Publication Date |
---|---|
EP2912835A1 true EP2912835A1 (en) | 2015-09-02 |
EP2912835A4 EP2912835A4 (en) | 2016-10-19 |
Family
ID=53717813
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP13849824.1A Withdrawn EP2912835A4 (en) | 2012-10-23 | 2013-10-22 | Remote surveillance sensor apparatus |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP2912835A4 (en) |
WO (1) | WO2014066405A1 (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102011109990A1 (en) | 2011-05-05 | 2012-11-08 | Jonas Pfeil | Camera system for taking pictures and associated method |
TW201632949A (en) * | 2014-08-29 | 2016-09-16 | 伊奧克里公司 | Image diversion to capture images on a portable electronic device |
EP3207695A1 (en) | 2014-10-17 | 2017-08-23 | Panono GmbH | Camera system for capturing images and methods thereof |
EP3339951A1 (en) | 2016-12-20 | 2018-06-27 | Nokia Technologies Oy | Fill lighting apparatus |
US20180224657A1 (en) * | 2017-02-06 | 2018-08-09 | The Charles Stark Draper Laboratory, Inc. | Integrated Wide Field of View Optical System for Image Based Navigation Applications in G-hardened Package |
CN206619273U (en) * | 2017-03-20 | 2017-11-07 | 深圳创维汽车智能有限公司 | A kind of 360 ° of vehicle mounted infrareds transmitting remote control and vehicle-running recording system |
DE102017208598A1 (en) | 2017-05-22 | 2018-11-22 | Robert Bosch Gmbh | Safety arrangement for safety monitoring of a plant, production plant and transport vehicle with the safety arrangement |
CN109544456B (en) * | 2018-11-26 | 2022-04-15 | 湖南科技大学 | Panoramic environment sensing method based on two-dimensional image and three-dimensional point cloud data fusion |
WO2022106216A1 (en) * | 2020-11-18 | 2022-05-27 | Interdigital Ce Patent Holdings, Sas | Method and apparatus for obtaining observation data of an environment |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7023913B1 (en) * | 2000-06-14 | 2006-04-04 | Monroe David A | Digital security multimedia sensor |
IL146802A0 (en) * | 2001-11-28 | 2003-01-12 | Wave Group Ltd | A self-contained panoramic or spherical imaging device |
IL156478A0 (en) * | 2003-06-17 | 2004-07-25 | Odf Optronics Ltd | Compact rotating observation assembly with a separate receiving and display unit |
US7325495B1 (en) * | 2005-04-11 | 2008-02-05 | Thomas Giandomenico | Hot gas deployment devices |
US20090167861A1 (en) * | 2005-07-13 | 2009-07-02 | Ehud Gal | Observation System |
GB0701300D0 (en) | 2007-01-24 | 2007-03-07 | Dreampact Ltd | An inspection device which may contain a payload device |
US8035734B2 (en) * | 2007-04-02 | 2011-10-11 | Kenneth R Jones | Self-balancing remote sensing device and remote sensing system comprising same |
JP5154152B2 (en) | 2007-07-04 | 2013-02-27 | ルネサスエレクトロニクス株式会社 | Boost power supply circuit |
US8149285B2 (en) * | 2007-09-12 | 2012-04-03 | Sanyo Electric Co., Ltd. | Video camera which executes a first process and a second process on image data |
US8237787B2 (en) * | 2009-05-02 | 2012-08-07 | Steven J. Hollinger | Ball with camera and trajectory control for reconnaissance or recreation |
- 2013
  - 2013-10-22 WO PCT/US2013/066205 patent/WO2014066405A1/en active Application Filing
  - 2013-10-22 EP EP13849824.1A patent/EP2912835A4/en not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
WO2014066405A1 (en) | 2014-05-01 |
EP2912835A4 (en) | 2016-10-19 |
Similar Documents
Publication | Title |
---|---|
US10019632B2 (en) | Remote surveillance sensor apparatus |
US9479697B2 (en) | Systems, methods and media for generating a panoramic view |
US8957783B2 (en) | Remote surveillance system |
WO2014066405A1 (en) | Remote surveillance sensor apparatus |
US20200366841A1 (en) | Imaging systems and methods |
US11412627B2 (en) | Multipurpose accessory and storage system |
US10032387B2 (en) | Wireless immersive simulation system |
US9344612B2 (en) | Non-interference field-of-view support apparatus for a panoramic facial sensor |
WO2016137146A1 (en) | Mobile communication terminal having unmanned air vehicle |
KR101395354B1 (en) | Panorama camera |
CN103620527A (en) | Headset computer that uses motion and voice commands to control information display and remote devices |
KR20170095716A (en) | 360 degree image capture apparatus enclosed in a ball-shape housing |
US20110164137A1 (en) | Reconfigurable surveillance apparatus and associated method |
WO2017119653A1 (en) | Electronic device having camera module |
US20110085041A1 (en) | Stably aligned portable image capture and projection |
EP2685707A1 (en) | System for spherical video shooting |
KR102007390B1 (en) | Gimbal |
JP2015080186A (en) | Automatic positioning tracking photographing system and automatic positioning tracking photographing method |
WO2020129029A2 (en) | A system for generating an extended reality environment |
KR101855790B1 (en) | Control watch for ominidirectional camera |
CN104735352B (en) | Image recording device, panoramic picture camera device, detecting ball and detecting system |
CN205378126U (en) | Spring imager based on cloud computing environment |
WO2022141348A1 (en) | Systems, devices, and methods supporting multiple photography modes with a control device |
KR20140064295A (en) | Method and apparatus of acquiring and transmitting dynamic image |
CN105681757A (en) | Bouncing imager based on cloud computation environment and imaging method thereof |
Legal Events
Code | Title | Description |
---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
17P | Request for examination filed | Effective date: 20150513 |
AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
AX | Request for extension of the european patent | Extension state: BA ME |
DAX | Request for extension of the european patent (deleted) | |
RA4 | Supplementary search report drawn up and despatched (corrected) | Effective date: 20160919 |
RIC1 | Information provided on ipc code assigned before grant | Ipc: G03B 37/02 20060101ALI20160913BHEP; Ipc: F42B 12/46 20060101ALI20160913BHEP; Ipc: G03B 17/02 20060101ALI20160913BHEP; Ipc: H04N 5/225 20060101AFI20160913BHEP |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
17Q | First examination report despatched | Effective date: 20190319 |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
18D | Application deemed to be withdrawn | Effective date: 20210209 |