US20140327733A1 - Image monitoring and display from unmanned vehicle - Google Patents


Info

Publication number
US20140327733A1
Authority
US
United States
Prior art keywords
image capture
display
image
display system
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/036,669
Inventor
David Wagreich
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CRANE-COHASSET HOLDINGS LLC
Original Assignee
CRANE-COHASSET HOLDINGS LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US201261685539P
Priority to US13/847,161 (patent US9350954B2)
Application filed by CRANE-COHASSET HOLDINGS LLC
Priority to US14/036,669
Assigned to CRANE-COHASSET HOLDINGS, LLC (assignor: David Wagreich)
Publication of US20140327733A1
Application status: Abandoned

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C 39/00 Aircraft not otherwise provided for
    • B64C 39/02 Aircraft not otherwise provided for characterised by special use
    • B64C 39/024 Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLYING SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D 47/00 Equipment not otherwise provided for
    • B64D 47/08 Arrangements of cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/00624 Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
    • G06K 9/0063 Recognising patterns in remote scenes, e.g. aerial images, vegetation versus urban areas
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N 5/225 Television cameras; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N 5/232 Devices for controlling television cameras, e.g. remote control; Control of cameras comprising an electronic image sensor
    • H04N 5/23238 Control of image capture or reproduction to achieve a very large field of view, e.g. panorama
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N 5/225 Television cameras; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N 5/232 Devices for controlling television cameras, e.g. remote control; Control of cameras comprising an electronic image sensor
    • H04N 5/23293 Electronic viewfinders
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N 5/225 Television cameras; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N 5/247 Arrangements of television cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/30 Transforming light or analogous information into electric information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/30 Transforming light or analogous information into electric information
    • H04N 5/33 Transforming infra-red radiation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed circuit television systems, i.e. systems in which the signal is not broadcast
    • H04N 7/181 Closed circuit television systems, i.e. systems in which the signal is not broadcast for receiving images from a plurality of remote sources
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed circuit television systems, i.e. systems in which the signal is not broadcast
    • H04N 7/183 Closed circuit television systems, i.e. systems in which the signal is not broadcast for receiving images from a single remote source
    • H04N 7/185 Closed circuit television systems, i.e. systems in which the signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C 2201/00 Unmanned aerial vehicles; Equipment therefor
    • B64C 2201/12 Unmanned aerial vehicles; Equipment therefor adapted for particular use
    • B64C 2201/127 Unmanned aerial vehicles; Equipment therefor adapted for particular use for photography, or video recording, e.g. by using cameras
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C 2201/00 Unmanned aerial vehicles; Equipment therefor
    • B64C 2201/14 Unmanned aerial vehicles; Equipment therefor characterised by flight control
    • B64C 2201/146 Remote controls

Abstract

An image capture and display system comprises an image capture array positioned on a vehicle, the image capture array comprising a plurality of image sensing members. At least one transmitter is provided for transmitting images from the image sensing members. The system also includes a monitor display array positioned remotely from the image capture array. The monitor display array comprises a display monitor for and corresponding to each image sensing member, and a receiver for receiving the images corresponding to each transmitter.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation-in-part application of U.S. patent application Ser. No. 13/847,161 filed Mar. 19, 2013, which claims the benefit of U.S. Provisional Patent Application No. 61/685,539 filed Mar. 20, 2012, the contents of both of which are incorporated by reference herein in their entirety.
  • FIELD AND BACKGROUND OF THE INVENTION
  • This invention relates to an image monitoring and display from a vehicle, typically an unmanned vehicle, but also capable of use on a manned vehicle. More particularly, the invention relates to a system of sensors mounted on a vehicle, whether manned or unmanned, a display system, and transmission and reception mechanisms associated with the sensors and display system respectively whereby images or information captured by the sensors can be viewed on the display system. There may be a single or multiple display systems.
  • In one preferred form, the invention is for an expanded perspective, wrap-around view, image capture and monitoring system for remote piloting of UAVs (unmanned aerial vehicles).
  • One significant shortcoming when utilizing a single lens, conventionally configured camera and monitoring device combination for airborne first person view (FPV) orientation and guidance of an unmanned aerial vehicle (UAV) during remote control operations is the limited field of view (FOV) that it offers. When using a single lens, conventionally-configured camera and monitor combination, the relatively narrow perspective that is created may hinder a pilot's ability to ascertain an object's true location and proximity to the UAV due to a lack of ambient visual cues, making accurate attitude and control adjustments more difficult.
  • In certain instances, a very wide perspective (extreme wide angle) lens can be mounted on the conventionally configured single camera setup to compensate for the limited field of view. The problem with using an extreme wide angle lens is that, because of its unnaturally exaggerated wide perspective and accompanying increased spherical distortion, objects appear diminished in size and an observer's perception of movement and speed is inaccurate, making the precise maneuvering of the craft difficult. This effect is somewhat analogous to the configuration of the right side view mirror in most automobiles, namely, objects are closer than they appear, with accurate distance determinations therefore difficult to make.
  • SUMMARY OF THE INVENTION
  • According to one aspect of the invention, there is provided an image capture and display system comprising: an image capture array positioned on a vehicle, the image capture array comprising a plurality of image sensing members, and at least one transmitter for transmitting images from the image sensing members; and a monitor display array positioned remotely from the image capture array, the monitor display array comprising a display monitor for and corresponding to each image sensing member, and a receiver for receiving the images corresponding to each transmitter.
  • In one form, the vehicle is an unmanned aerial vehicle, and the monitor display is positioned at a ground based location. A transmitter may be provided for each of the image sensing members in the image capture array, and a corresponding receiver is provided for each of the display monitors in the monitor display array.
  • In one embodiment, image sensing members of different types are placed in the image capture array so that the system can simultaneously capture images of different conditions.
  • Preferably, the image capture array comprises a frame member attached to the vehicle configured for mounting of the image sensing members thereon. Further, images from two or more image sensing members may be combined for transmission by a single transmitter, the transmission being received by a receiver capable of separating and reconstructing the images to display them on display monitors to form a substantially continuous visual representation as captured by the image sensing members.
  • In another embodiment of the invention, the image sensing members comprise a first forward facing camera capturing a first forward field of view and a second rearward facing camera capturing a second rearward field of view; and the monitor display array comprises a first display monitor which is curved and displays the images corresponding to the first forward field of view, and a second display monitor which displays images corresponding to the second rearward field of view. The second display monitor may be curved, smaller than the first display monitor, and located above the first display monitor.
  • According to another aspect of the invention, there is provided an image capture and display system comprising: an image capture array for positioning on a vehicle, the image capture array comprising a plurality of image sensing members, and at least one transmitter for transmitting images from the image sensing members; and a monitor display array for positioning remotely and comprising a pair of wearable video type goggles having a visual display for and corresponding to the composite image provided by the image sensing member, and a receiver for receiving the images corresponding to each transmitter.
  • In another aspect of the invention, there is provided a method for the capture and display of images, the method comprising: attaching an image capture array to a vehicle, the image capture array comprising a plurality of image sensing members, and at least one transmitter for transmitting images from the image sensing members; and positioning a monitor display array remotely from the image capture array, the monitor display array comprising a display monitor for and corresponding to each image sensing member, and a receiver for receiving the images corresponding to each transmitter for display on the display monitors.
  • The expanded perspective, wrap-around view, image capture and monitoring system, which can be in both vertical and horizontal views, for remote piloting of UAVs is intended to counteract the limitations of the traditional single camera and monitor FPV combination for UAVs by providing the pilot or operator thereof with a wider, yet substantially undistorted optical perspective that is more in line with the way a human normally sees the environment that encircles him. By enhancing peripheral vision and affording the pilot or operator (or an individual monitoring the craft's actions if it is being flown autonomously) the ability to see what is surrounding the aircraft in all directions, more accurate control input decisions can be made with vastly improved situational awareness and obstacle avoidance capabilities. The cumulative effect further makes the craft safer and improves capability for aerial surveillance, aerial inspection, aerial mapping, transport and lifting of materiel, aerial photography or videography, and other remotely controlled UAV applications.
  • The panoramic, wrap-around view can be achieved in a number of ways. These ways would include, but are not limited to, the following:
  • (1) Utilizing multiple, relatively normal perspective (˜55°, for example) cameras or sensors fashioned in a generally circular (or semi-circular) array which capture images or information which are then transmitted to, and then displayed on a corresponding monitoring array that is fashioned in a general circle (or semi-circle).
  • (2) Utilizing one or more panoramic or surround-view cameras or sensors to create a circular (or semi-circular), curved view, which captures images or information that are then transmitted to, and displayed on, a corresponding curved monitoring system.
  • Version (1):
  • Multiple, Normal-Perspective, Image Capture (camera or sensor) Array.
  • This version of the device comprises four main stages or components. These are:
  • (1) an image capturing stage,
  • (2) a wireless image data transmission stage,
  • (3) a wireless image data reception stage, and
  • (4) an image display stage.
  • The first stage is that of the image capture. To achieve this, multiple motion-picture, image capturing devices, such as cameras or sensors, are arranged, each in succession, in a generally circular or multi-faceted pattern to create an array of cameras or sensors which afford a substantially continuous and contiguous 360° view around the vehicle, which in this embodiment will be considered to be an aircraft. It should be noted that this array can also be created with fewer image capture devices if the application calls for less than a 360° view, or if each image capture device covers a wider view. For instance, multiple cameras could be installed on the front quadrant of the craft creating something like a 180° forward view, with another camera facing the rear creating a 90° rearward view, the cumulative result being a non-contiguous view of 270°. Many different combinations to cover selected areas may be configured within the scope of the present invention.
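As an editorial illustration only (not part of the patent disclosure), the circular-array geometry described above can be sketched numerically: cameras are mounted at evenly spaced yaw offsets, and the view is contiguous only when each camera's field of view spans the angular gap to its neighbor.

```python
# Sketch of the circular image capture array geometry described in the text.
# Function names and the use of Python are editorial assumptions.

def camera_yaws(n_cameras: int) -> list[float]:
    """Evenly spaced yaw angles (degrees) for an outward-facing circular array."""
    step = 360.0 / n_cameras
    return [i * step for i in range(n_cameras)]

def is_contiguous_360(n_cameras: int, fov_deg: float) -> bool:
    """True when each camera's FOV at least spans the angular gap to its neighbor."""
    return fov_deg >= 360.0 / n_cameras

# Six 60-degree cameras at 60-degree offsets give full coverage;
# six ~55-degree cameras (the "relatively normal perspective" example) leave gaps.
print(camera_yaws(6))            # [0.0, 60.0, 120.0, 180.0, 240.0, 300.0]
print(is_contiguous_360(6, 60))  # True
print(is_contiguous_360(6, 55))  # False
```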
  • The structure supporting the array of cameras or sensors can take any number of forms and is, generally, of secondary importance so long as the structure permits cameras to be mounted with the appropriate spacing and offset angle, relative to one another, to facilitate an accurate circular view, or other selected view, of the environment surrounding the aircraft.
  • The second stage is the wireless signal transmission. The video output feeds from each of the cameras or capturing devices are relayed to multiple transmitters, or alternatively, can be amalgamated by a “combiner” into a matrix which then relays this amalgamated signal to a single transmitter. These transmitters, or the single transmitter when using the amalgamated and matrix feed method, then use one or more of radio frequencies, infrared frequencies, or other electromagnetic transmission frequencies to send a wireless signal to remote receivers, or a single receiver, when using the amalgamated and matrix feed method. These signal transmissions can be either in an analogue or digital format, subject to the requirements of the application.
  • The third stage is the wireless signal reception. The signal(s) received from the transmitter(s) as described above are captured by multiple receivers, or a single receiver if the amalgamated and matrix feed method is used. These signals are converted into video feeds, and in the case of an amalgamated and matrix feed method, the signal is “uncombined” or decoded, and sent to the display portion of the device.
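The “combiner” and “uncombiner” stages above can be illustrated with a toy sketch (an editorial assumption, not the patent's implementation): four equal-size frames are tiled into one matrix image for a single transmitter, then split back into the original feeds on the receiving side.

```python
# Hypothetical combiner/decoder sketch. Frames are plain row-lists of pixel
# values; a real system would operate on live video signals or hardware.

def combine_2x2(f0, f1, f2, f3):
    """Tile four HxW frames into one 2Hx2W matrix frame for a single transmitter."""
    top = [a + b for a, b in zip(f0, f1)]
    bottom = [a + b for a, b in zip(f2, f3)]
    return top + bottom

def split_2x2(matrix):
    """Receiver side: recover the four original frames from the matrix frame."""
    h, w = len(matrix) // 2, len(matrix[0]) // 2
    f0 = [row[:w] for row in matrix[:h]]
    f1 = [row[w:] for row in matrix[:h]]
    f2 = [row[:w] for row in matrix[h:]]
    f3 = [row[w:] for row in matrix[h:]]
    return f0, f1, f2, f3

# Round trip: combine four 2x4 frames, then uncombine them unchanged.
frames = [[[i] * 4] * 2 for i in range(4)]
assert split_2x2(combine_2x2(*frames)) == tuple(frames)
```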
  • The fourth stage is the image monitoring. The image monitoring or display stage of this multiple normal-perspective image capture version of the device can be implemented in several different ways. Representative examples are as follows:
  • (a) In a first variant, there is an image monitoring station comprising an array of monitors which are configured so that the array generally mimics the layout or configuration of the airborne portion (that is, cameras or sensors) of the expanded perspective, wrap-around view, image capture and monitoring system of the invention. This “station” preferably comprises a series of video monitors or displays arranged in succession in a generally circular or multi-faceted pattern around the pilot or operator(s), who may be positioned within or at the center of the circle. The video feed from each of the image capture devices onboard the airborne craft is fed to its corresponding video display in the array, providing the pilot or operator(s) with a contiguous, coherent, remote view of the environment surrounding the unmanned aerial vehicle, or other type of vehicle as the case may be.
  • As an example, if there are six outwardly facing cameras, arranged at about a 60° offset on the airborne craft, then there would be six monitor displays facing inward, arranged at about the same 60° offset at the monitoring station. The pilot or operator would be situated within the circle (or multi-faceted arrangement), encompassed or surrounded by the array of displays, in a viewing position that best augments UAV situational awareness and control.
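The six-camera/six-monitor correspondence above can be sketched as a simple lookup (an editorial illustration; the function name is hypothetical): the monitor that shows a given world bearing is the index of the nearest camera heading.

```python
# Sketch of the camera-to-monitor correspondence for a six-camera,
# 60-degree-offset array, as in the example above.

def monitor_for_bearing(bearing_deg: float, n: int = 6) -> int:
    """Index of the display-array monitor covering a given world bearing."""
    step = 360.0 / n
    return round((bearing_deg % 360) / step) % n

print(monitor_for_bearing(10))   # 0  (forward camera and monitor)
print(monitor_for_bearing(180))  # 3  (rearward camera and monitor)
```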
  • The monitor configuration does not have to be circular, and can be arranged in any configuration that is appropriate to the given application, such as, for example only, hexagonal, square, elliptical, or the like.
  • (b) In the second variant of the image display, a wrap-around monitor with a generally circular (or semi-circular) screen is used to display the feeds from the multiple image capturing devices. The associated images from each of the transmitted video feeds would be displayed one next to the other in the correct order relative to each other in order to create a substantially contiguous “single” image on this curved monitor display.
  • Additionally, motion picture “stitching” software can optionally be utilized to “join” or make the real time output of the multiple camera feeds appear seamless as if generated by one big panoramic camera, if desired.
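As a hedged, editorial stand-in for such stitching software (real stitching also aligns and warps the images), the seam-blending step alone can be sketched: two adjacent camera scanlines with a known pixel overlap are joined by averaging the shared region.

```python
# Toy seam blend between two adjacent camera scanlines. The overlap size
# is assumed known; production stitching software estimates it by matching
# image features.

def blend_join(left, right, overlap):
    """Join two scanlines, averaging `overlap` shared pixels at the seam."""
    seam = [(a + b) / 2 for a, b in zip(left[-overlap:], right[:overlap])]
    return left[:-overlap] + seam + right[overlap:]

print(blend_join([1, 1, 3], [5, 9, 9], overlap=1))  # [1, 1, 4.0, 9, 9]
```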
  • (c) In the third variant, the video feeds from the cameras or sensors are viewed using video “immersion” type goggles, or equivalent type goggles, which one or more operators may be wearing. These goggles may be used in conjunction with “head-tracking” or “eye movement tracking” technology where virtual tracking of the visual field is based upon either or both of the operator's head motion or eye movement. This tracking technology is needed because these types of goggles often provide a field of view (FOV) that may be narrower or more limited than that which the expanded perspective, wrap-around view system image sensor is capable of providing. Technological developments may allow for goggles that afford a FOV that is wide enough to encompass the complete, expanded perspective image that has been acquired by the image capture portion of the expanded perspective, wrap-around view system. In this event, this type of goggle can be utilized.
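The head-tracked viewing described above can be sketched as follows (an editorial assumption about one possible implementation): head yaw selects which slice of the wrap-around image the narrower-FOV goggles display, with column indices wrapping modulo the panorama width so the view is continuous across the 0°/360° seam.

```python
# Sketch of head-tracked viewport selection from a 360-degree panorama.
# The panorama is treated as pano_width columns spanning 360 degrees.

def viewport_columns(head_yaw_deg: float, pano_width: int, view_width: int):
    """Column indices of a view_width-pixel window centered on the head yaw."""
    center = int(head_yaw_deg % 360 / 360 * pano_width)
    start = center - view_width // 2
    return [(start + i) % pano_width for i in range(view_width)]

print(viewport_columns(0, 360, 4))   # [358, 359, 0, 1]  (wraps the seam)
print(viewport_columns(90, 360, 4))  # [88, 89, 90, 91]
```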
  • Version 2:
  • Panoramic Image Capture (cameras or sensors).
  • The expanded perspective, wrap-around view, image capture and monitoring system of the invention can also be implemented using one or more image capturing devices (such as cameras or sensors) with panoramic lenses to create the desired wrap-around, surround view.
  • As described above with reference to the multiple normal-perspective image capture array version (version 1), the panoramic image capture version of the expanded perspective, wrap-around view, image capture and monitoring system for remote piloting of UAVs also consists of four main stages or components, as follows:
  • (1) an image capturing stage,
  • (2) a wireless image data transmission stage,
  • (3) a wireless image data reception stage, and
  • (4) an image display stage.
  • The first stage is the image capture. The image capture or sensing stage of the panoramic image capture version (version 2) of the device can be implemented in a number of ways, of which the following are representative examples that are not intended to limit the scope of the invention:
  • (a) A singular, “wrap-around”, 360° panoramic image capture device (i.e. lens, camera or sensor) can be used to create a substantially “seamless” 360°, circular view. Alternatively, a wrap-around, semi-circular view of less than 360° can be implemented as well, depending on the application. For example, the wrap-around view could be 180°, or 270°, or 300°, or some other selected value.
  • (b) Multiple “wrap-around”, panoramic image capture devices (i.e. lenses, cameras or sensors), depending on the application, can be used to create a cumulative wrap-around view of up to 360°. One example of this image capture could be: one panoramic lens and video sensor combination creating a 180° view forward of the aircraft, plus one panoramic lens and video sensor combination creating a 90° view to the rear of the craft, in total creating a cumulative, non-contiguous view of 270°. Other configurations would be within the scope of the invention.
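The cumulative, possibly non-contiguous coverage described in examples (a) and (b) can be checked with a small interval-union sketch (editorial illustration; sensor arcs are modeled as (start, end) bearing intervals in degrees):

```python
# Sketch: total degrees covered by a set of sensor view arcs, merging any
# overlap so shared arcs are not double-counted.

def cumulative_coverage(arcs):
    """Sum of degrees covered by [start, end) bearing arcs."""
    merged = []
    for start, end in sorted(arcs):
        if merged and start <= merged[-1][1]:
            merged[-1][1] = max(merged[-1][1], end)  # extend overlapping arc
        else:
            merged.append([start, end])
    return sum(end - start for start, end in merged)

# A 180-degree forward view plus a 90-degree rearward view:
# 270 degrees cumulative, non-contiguous, as in the example above.
print(cumulative_coverage([(-90, 90), (135, 225)]))  # 270
```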
  • The second stage is the wireless signal transmission. The video output feeds from each of the cameras or capturing devices are relayed to multiple transmitters, or alternatively, can be amalgamated by a “combiner” into a matrix which then relays this amalgamated signal to a single transmitter. These transmitters, or single transmitter when using the matrix feed method, then use radio frequencies, infrared frequencies, or other electromagnetic transmission frequencies to send a wireless signal to remote receivers, or a single receiver, depending on the method in place. These signal transmissions can be either in an analogue or digital format, based on the requirements of the selected application.
  • The third stage is the wireless signal reception. The signal(s) from the transmitter(s) are captured by multiple receivers, or a single receiver if the amalgamated and matrix method is used. These signals are converted into video feeds, or, in the case of an amalgamated and matrix method, the signal is “uncombined” or decoded, and sent to the display portion of the device.
  • In this case, as is the situation with any of the other embodiments or versions described herein, there may be a mix of single transmitters and receivers for each image capturing device and its corresponding monitor, used in conjunction with the amalgamated or combined transmitters and receivers. Thus, one or more selected cameras or sensors and their paired monitors may each have their own dedicated transmitters and receivers, while two or more of the remaining camera-and-monitor pairs may share amalgamated or combined transmitters or receivers serving multiple such pairs.
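One such mixed link plan can be written down as a simple table (all names here are hypothetical, for illustration only): one camera keeps a dedicated transmitter while the rest share a combined link.

```python
# Hypothetical mixed link plan: cam0 has a dedicated transmitter; cams 1-3
# share one amalgamated/combined transmitter, as the text allows.
from collections import Counter

link_plan = {
    "cam0": "tx_dedicated_0",
    "cam1": "tx_combined_A",
    "cam2": "tx_combined_A",
    "cam3": "tx_combined_A",
}

# Transmitters serving more than one camera need the combiner/decoder stages.
shared = [tx for tx, n in Counter(link_plan.values()).items() if n > 1]
print(shared)  # ['tx_combined_A']
```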
  • The fourth stage is the image monitoring. The image monitoring stage of the panoramic image capture (version 2) of the device can be implemented in the following ways:
  • (a) In a first variant, one or more wrap-around monitors with generally circular or semi-circular or otherwise curved display surfaces are configured to exhibit the feed or feeds of the image capture portion of the system in the first stage. The layout of these monitors will be configured to emulate to the extent possible or desired the lens and camera (or sensor) arrangement on the airborne component of the craft, creating a viewing perspective that attempts to “mirror” or is representative of the airborne field of view that is being captured on the vehicle.
  • (b) In a second variant, the video feeds are viewed using video “immersion” type or similar type goggles, which the operator(s) may be wearing in conjunction with “head-tracking” or “eye movement tracking” technology where virtual tracking of the visual field is based upon head motion or eye movement of the operator. This tracking technology is needed because these types of goggles often provide a field of view (FOV) that may be narrower or more limited than that which the expanded perspective, wrap-around view system image sensor is capable of providing. Technological developments may allow for goggles that afford a FOV that is wide enough to encompass the complete, panoramic image that has been acquired by the image capture portion of the expanded perspective, wrap-around view system. In such an event, this type of goggle can be utilized.
  • System Configuration
  • The airborne, camera or image capture portion of the expanded perspective, wrap-around view, image capture and monitoring system of the invention may be attached to some part or structure of the vehicle being flown, and preferably in a location where the view is largely unobstructed by the craft for unhampered performance in collecting desirable images. The location may be above, below, in front of, behind, or to the side, of the host vehicle, or such other suitable location.
  • The expanded perspective, wrap-around view, image capture and monitoring system can be installed on any type or configuration of airborne UAV. This would include (but not be limited to) the following types of aircraft: fixed-wing, rotor craft (conventional single blade or multi-rotor), glider, lighter-than-air, rocket, space vehicle, etc. The UAV can be of any size or weight.
  • The monitoring or display station portion of the expanded perspective, wrap-around view, image capture and monitoring system can be implemented as a fixed, ground-based installation, or it can be carried by or contained within a variety of mobile vehicles. These host mobile vehicles would include (but are not limited to) vehicles that are ground-based, water-based (surface or underwater), airborne, or deployed in space.
  • The monitoring or display station can also be installed in the airborne craft that is being piloted and which contains the image capturing portion of the system. As an example, the image capture portion of the device can be mounted outside the aircraft, while the monitoring portion is installed inside the same flying craft. This type of configuration would be useful in scenarios such as night operations where the craft is being piloted using thermal (infrared) imaging as opposed to a visible light spectrum camera (or sensors), or when dangerous environmental factors such as high radiation levels necessitate that the pilot or occupants of the craft be shielded from these dangerous elements, which may impact the visibility of the outside conditions from the aircraft.
  • The expanded perspective, wrap-around view system of the invention can utilize many types of image capturing or sensing devices in its construction. These would include (but not be limited to) normal visual spectrum cameras or sensors, infrared cameras or sensors for night vision and/or heat sensing (useful in the carrying out of fire prevention or search and rescue operations), radiation sensors, pressure sensors, gaseous content or humidity sensors, sonic sensors, EMF spectrometers, to name a few possibilities.
  • Furthermore, the system can be configured to capture and display a 3D version of the wrap-around view for increased depth perception cues.
  • The complete system and its operation, from image capture, through transmission, to monitoring, would be designed to accomplish minimal delay or latency throughout the signal chain to give the pilot or operator information in as close to real time as technically feasible.
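A latency budget for the signal chain above can be sketched as a simple checklist (the stage names and millisecond values below are illustrative placeholders, not measured figures from the disclosure):

```python
# Illustrative end-to-end latency budget across the capture-transmit-
# receive-display chain. Values are placeholders for design discussion.
latency_ms = {
    "sensor_capture": 16,    # roughly one frame period at ~60 fps
    "encode_combine": 5,
    "radio_link": 2,
    "decode_split": 5,
    "display_refresh": 16,   # roughly one refresh at ~60 Hz
}

total = sum(latency_ms.values())
print(total)  # 44
```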
  • The apparatus and method of the invention can utilize analogue, digital, channel hopping digital (i.e. COFDM), infrared, or other yet to be developed image capture or transmission technologies, or even a combination of such technologies to achieve optimal performance.
  • The image displayed on the monitoring portion of the system can be augmented or even replaced by computer generated 3D map imagery linked to a navigational system such as GPS or a Real Time Kinematic System if desired. And although the displayed image may be generated synthetically, the benefits of the wrap-around view would still be present in that it provides the pilot or operator(s) with wide perspective, visual cues and increased situational awareness for accurate control input and craft maneuverability.
  • For example, if a UAV is being operated in an environment of reduced or even zero visibility, such as in fog or during fire fighting operations where very thick smoke is present, the wrap-around computer generated, augmented view of surrounding terrain or obstacles will allow the pilot or operator the ability to maneuver the craft effectively and safely despite his or her inability to see any clear image using a conventional visual spectrum video feed.
  • The aircraft borne component (Stages 1 and 2) of the expanded perspective, wrap-around view, image capture and monitoring system installation, in any of its variants, can be implemented with or without gyro stabilization or other stabilization technology, depending on the application and whether stabilization and/or a level horizon is deemed advantageous.
  • In one embodiment, the airborne component (Stages 1 and 2) of the expanded perspective, wrap-around view, image capture and monitoring system installation, in any of its variants, can be placed within a protective housing, chamber or other suitable structure that safeguards components such as the lenses from weather elements, from shock in the event of a crash, and from hostile weapons attacks, or that provides aerodynamic benefits. Various filters may also be provided on the cameras, and in another embodiment, some or all of the cameras or sensors may be contained in a dedicated housing or container to afford the necessary protection.
  • Transmission repeaters or signal amplifiers can be added to the system and placed in a location or locations considered to improve or boost signal quality or to extend the range of the UAV's operational area. As one example only, if a craft is flown on one side of a mountain range, with the pilot or operator's monitoring station situated on the other side of the range in a position that does not afford a suitable line-of-sight (LOS) signal transmission, a repeater and/or an amplifying device can be placed at a point of high elevation atop the mountain range to improve the quality and strength of the signal being transmitted and received.
  • A transmission repeater can also be carried aloft by another aircraft (manned or remotely-controlled), or satellite in addition to or instead of the above situation.
  • The signal that is being broadcast by the airborne component (Stages 1 and 2 as described above) of the expanded perspective, wrap-around view, image capture and monitoring system can be received and displayed by multiple monitoring or display stations (fixed-installation or mobile) simultaneously. As such, the video feed may be monitored in several locations at the same time, providing information to different operators or entities for use according to specific need. Such locations may be a command center as well as a field unit, so that both may benefit, and use the information, from the live video or other feed.
  • More than one expanded perspective, wrap-around view, image capture and monitoring system can be mounted on a UAV depending on the application. For instance, one system could work within the visual spectrum for piloting (or other) purposes, while another system could utilize a thermal imaging spectrum for aerial surveillance, fire fighting operations, or search and rescue purposes. Additionally, more than one system can be installed on a UAV for the purpose of operational redundancy in the event that one system fails. Further, different types of system feeds can be transmitted to different monitoring stations depending on the needs of each such monitoring station.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In the drawings:
  • FIG. 1 is a schematic representation of an unmanned aerial vehicle and a monitoring display station in accordance with one aspect of the invention;
  • FIG. 2 is a perspective view of a camera or sensor arrangement on an unmanned aerial vehicle in accordance with one embodiment of the invention;
  • FIG. 3 is a top perspective view of a monitoring and display station in accordance with one aspect of the invention;
  • FIG. 4 is a detail perspective view of the image capture apparatus mounted on an airborne vehicle in one aspect of the invention;
  • FIG. 5 is a detailed perspective view of the image capture apparatus and antenna mounted on a partially shown aerial vehicle;
  • FIG. 6 is a top schematic view of a monitoring station and the image capture apparatus showing the respective transmitters and receivers associated with each other;
  • FIG. 7 is a schematic view of an aerial vehicle with cameras or sensors and a monitoring station with displays in accordance with a further embodiment of the invention;
  • FIG. 8 is a detailed view of the aerial vehicle and its cameras or sensors of the type illustrated in FIG. 7 of the drawings;
  • FIG. 9 is a detailed view of the monitoring station and its monitors or displays of the type illustrated in FIG. 7 of the drawings;
  • FIG. 10 shows a schematic view of goggles and related hardware worn by an operator for viewing the image captured on the aerial vehicle;
  • FIG. 11 is a schematic representation of an unmanned aerial vehicle including a circular sensor in accordance with a further aspect of the invention;
  • FIG. 12 is a schematic representation of a monitoring station including a circular video display in accordance with a further aspect of the invention;
  • FIG. 13 is a schematic representation of an unmanned aerial vehicle including a spherical sensor in accordance with a further aspect of the invention;
  • FIG. 14 is a schematic representation of a monitoring station including a substantially spherical video display in accordance with a further aspect of the invention;
  • FIG. 15 is a schematic representation of a monitoring station including a spherical video display in accordance with yet a further aspect of the invention;
  • FIG. 16 is a schematic representation of a monitoring station in accordance with still a further aspect of the invention; and
  • FIG. 17 is a schematic representation of an unmanned vehicle and monitoring station utilizing circular or spherical image capture and display with a plurality of sensor and display mechanisms, transmitters and receivers.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Reference is now made to the accompanying drawings which show various embodiments of the invention. The drawings are intended to illustrate some of the embodiments, but certainly not all, which can be configured and constructed in accordance with the invention.
  • FIG. 1 of the drawings illustrates an image monitoring and display system 10 of the invention. It is noted that the monitoring and display is not limited to conventional video in the visual spectrum, but can also utilize infrared, thermal, radiation, or other sensors and display mechanisms. Therefore, any reference herein to visual spectrum image and display should also be understood to encompass all other types of sensing and their respective forms of display.
  • The monitoring and display system 10 comprises a camera array 12 mounted on a frame 14, which is attached to the underside of an unmanned aerial vehicle 16. One type of aerial vehicle 16 is shown in FIG. 1, but many types may be used. These include manned or unmanned vehicles, fixed wing or rotor craft, and gliders, to name a few. The aerial vehicle 16 may optionally include a gyro stabilization device 18 where considered necessary or appropriate.
  • The camera array 12 comprises six cameras 20 mounted on the frame 14, which may be in the form of a chassis, platform or other suitable structure. The cameras 20 may be exposed, as shown in FIG. 1, but they may also be individually enclosed or enclosed in a containment structure which would make up the frame 14. Each camera or sensor has a lens 22, and each is located on the frame 14 so as to capture a different segment of the surroundings. In the embodiment illustrated, there are six cameras 20, each capturing one of six approximately 60 degree segments, so that the combined image encompasses the full 360 degree view about the aerial vehicle 16. The cameras 20 are placed on the frame 14 so that there will preferably be a clear and unimpeded view of the surroundings.
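The six-way segmentation described above reduces to simple angle arithmetic. The following is an illustrative sketch only; the function names and the convention that the first camera points at heading 0 are assumptions, not taken from the patent:

```python
# Illustrative sketch (assumptions noted in the lead-in): headings for an
# evenly spaced ring of cameras, and the segment a given bearing falls in.
def camera_headings(num_cameras=6):
    """Center heading (degrees) of each camera's segment of the ring."""
    segment = 360.0 / num_cameras          # 60 degrees for six cameras
    return [i * segment for i in range(num_cameras)]

def camera_for_bearing(bearing, num_cameras=6):
    """Index of the camera whose segment contains the given bearing."""
    segment = 360.0 / num_cameras
    return int(bearing % 360.0 // segment)
```

With six cameras this yields headings 0, 60, 120, 180, 240 and 300 degrees, so any bearing around the vehicle maps to exactly one camera.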
  • Each camera 20 has associated therewith an antenna 26. In the embodiment of FIG. 1, the antennae 26 are mounted on the lower surface of the frame 14 and project downwardly. Each antenna 26 is directly below its associated camera 20, and is configured to transmit the image from its associated camera 20 to a receiver, as will be described. The antenna arrangement can also take other forms and is not specific to “downward” and “below its associated camera” as described with references to the present embodiment.
  • The embodiment in FIG. 1 illustrates an antenna 26 for each camera 20 or sensor, but the invention contemplates within its scope fewer antennae 26 than cameras 20. The images from one or more or all of the cameras 20 may be combined and transmitted by a single antenna to a base station where the combined signal will be received and separated into its individual component parts so as to recreate the 360 degree (or other spectrum) view captured by the plurality of the cameras 20.
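The combine-and-separate idea in the preceding paragraph (several camera feeds sent through a single antenna, then split back out at the base station) might be sketched as follows. The payload format, function names, and use of JSON are purely illustrative assumptions, not the patent's actual transmission scheme:

```python
# Illustrative sketch (an assumption, not the patent's actual scheme):
# frames from all cameras are packed into one payload for a single
# antenna, then separated back into per-camera frames at the base station.
import json

def multiplex(frames):
    """frames: dict of camera id -> raw frame bytes; returns one payload."""
    return json.dumps({cid: data.hex() for cid, data in frames.items()}).encode()

def demultiplex(payload):
    """Split a combined payload back into per-camera frame bytes."""
    return {cid: bytes.fromhex(h) for cid, h in json.loads(payload).items()}
```

The round trip preserves each camera's image, so the base station can reassemble the 360 degree view from the single received signal.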
  • The invention as illustrated in FIG. 1 of the drawings also has a base monitoring or display station 40. This station 40 comprises six monitors 42 arranged relative to each other to form a generally hexagonal shape, and defining an interior space 44 which can accommodate a pilot or operator 46 and seating 48. The monitors 42 are adjacent to or abut each other so as to provide as close to a seamless composite display as possible of the various images captured by the cameras 20.
  • Each monitor has associated therewith a receiver 50, and each receiver has its own antenna 52. Each antenna 52 of the receivers 50 communicates with one corresponding antenna 26 of the cameras 20, in a manner that will allow the monitors 42 to display the captured images of the cameras 20 in the same order as they are captured by the cameras 20. In this way, a generally 360 degree surround image is assembled and produced on the monitors to display the view much like an operator would have if he or she was actually in the aerial vehicle 16.
  • It is to be noted that the drawings only show the image capture and monitoring system, but there will also be ground or remote controls to navigate the aerial vehicle (when unmanned) so that it can be directed and flown to the desired location and orientation. Further, each of the lenses 22 on the cameras 20 may be adjustable, either manually or automatically, so that they are directed downwardly to the area to be photographed. Such adjustments may become necessary as the vehicle 16 ascends or descends so that the desired or selected area below continues to be the main focus for the cameras 20.
  • Reference is now made to FIG. 2 of the drawings which shows a detail of the camera array 12 mounted on the aerial vehicle 16. The frame 14 is shown as generally comprised of six equal panels 62 arranged in the shape of a hexagon and having an outside lower shelf 60 which supports each camera 20 and lens 22 facing outwardly, each lens 22 being positioned to capture its own approximately 60 degree segment of the surroundings. The frame 14 is fixed to the aerial vehicle 16 by means of an appropriate attachment fitting 64. The frame 14 defines an internal chamber or space 66 designed to house the electronics and other hardware for operating the system, including a transmitter 70 associated with each camera 20 which is connected to its respective antenna 26 located below the shelf 60 to facilitate optimal and unimpeded communication with the antennae 52 on the monitoring station 40.
  • FIG. 3 of the drawings is a detail view of the monitoring station 40 of the invention, showing an operator 46 with control box 76 on a seat 48 placed in the space 44. The operator 46 is surrounded by the hexagonally arranged bank of six monitors 42, each monitor 42 having its dedicated receiver 50 with antenna 52. The seat 48 is on wheels or casters, offering the operator 46 mobility in moving about the space 44, with the ability to turn or rotate his position to view any one or more of the six monitors 42. The six monitors collectively provide the 360 degree panoramic view that would be available to a person sitting in the aerial vehicle 16.
  • Each of the six antennae illustrated in FIG. 3 has been designated its own reference numeral, 52 a, 52 b, 52 c, 52 d, 52 e and 52 f. Each of these antennae 52 a to 52 f has its own frequency communication setting and is programmed to receive data from one of the six corresponding antennae 26 on the camera array 12. As seen in FIG. 5 of the drawings, showing a more detailed view of the antennae 26 associated with the cameras or sensors 20, there are six antennae, each having its own reference numeral, namely, 26 a, 26 b, 26 c, 26 d, 26 e and 26 f. Thus antenna 26 a communicates with antenna 52 a, antenna 26 b communicates with antenna 52 b, and so on. Each camera 20 therefore has the image it is capturing displayed on a dedicated monitor 42, all of the monitors 42 together providing the composite real time image of the surroundings as would be observed from the aerial vehicle 16. There is also the ability to move the aerial vehicle 16 in both direction and orientation to a selected location or position so that the image of the desired areas can be secured at the monitoring station 40.
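The fixed one-to-one pairing between transmit antennae 26a-26f and receive antennae 52a-52f described above can be sketched as a small routing table. The identifiers mirror the reference numerals in the text, but the code itself is an assumption for illustration:

```python
# Illustrative routing table (identifiers mirror reference numerals 26a-26f
# and 52a-52f above; the code itself is an assumption for illustration).
TX_TO_RX = {f"26{s}": f"52{s}" for s in "abcdef"}

def route_frames(frames_by_tx):
    """Re-key frames from transmit antenna to the paired receive antenna,
    so each camera's image reaches its dedicated monitor."""
    return {TX_TO_RX[tx]: frame for tx, frame in frames_by_tx.items()}
```

Because the mapping is fixed, every camera's image always lands on the same monitor, preserving the spatial order of the surround view.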
  • FIG. 4 of the drawings shows an upward perspective view of the aerial vehicle 16, the attachment fitting 64, frame 14 and cameras 20. This figure illustrates a situation where the system may be capturing images and transmitting them, as described above.
  • FIG. 6 of the drawings shows a schematic representation of the camera array 12, including cameras 20 a, 20 b, 20 c, 20 d, 20 e and 20 f, while attached to the aerial vehicle 16 (not shown) and its manner of communication with the monitoring station 40 as has already been described above. This figure illustrates the specific mechanism of communication between the two, with a transmitter 70 and antennae 26 a, 26 b, 26 c, 26 d, 26 e and 26 f on the aerial vehicle 16 communicating exclusively with its corresponding antenna 52 and receiver 50 on the monitoring station 40. As mentioned above, there are embodiments of the invention where two or more signals from separate transmitters 70 are combined or matrixed, transmitted through a common antenna, and then decoded at the monitoring station 40 and directed to the appropriate monitor 42 to reconstruct the composite image. In other words, it is not necessary to the invention that each camera 20 and monitor 42 have its own transmitter or receiver, so there may be fewer such transmitters or receivers than there are cameras 20.
  • While FIGS. 1 to 6 of the drawings show six cameras or monitors, the invention may have other desired numbers of these components. For example, there may be only four of each, or eight of each. Additionally, the cameras may not necessarily capture the entire 360 degree panoramic view. There may be two or more cameras for capturing images which represent only part of the full circumference, and the individual views need not be contiguous with each other. There are many variations possible, all of which are within the scope of this invention.
  • FIG. 7 of the drawings shows another embodiment of the invention. The aerial vehicle 16 in this embodiment has two cameras, namely, a forward facing camera 90 which captures a 180 degree field of view, and rearward facing camera 92 having a 90 degree field of view. The cameras 90 and 92 therefore combine to provide a cumulative field of view of approximately 270 degrees. Each camera (or sensor, as the case may be) 90 and 92 may have its own transmitter and associated antenna, or there may be a single transmitter 94, as illustrated in FIG. 7 of the drawings, with its antenna 96 which transmits the combined signal from both cameras 90 and 92 to a receiver for display as will be described.
  • Note that the two cameras 90 and 92 may provide a field of view with different cumulative combinations to that illustrated, or even the full 360 degree view, based on the specific requirements and parameters of a given operation.
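The cumulative coverage arithmetic above (a 180 degree forward camera plus a 90 degree rearward camera yielding roughly 270 degrees) can be sketched as follows. This is an illustrative calculation assuming non-overlapping fields of view, not part of the patent:

```python
# Illustrative sketch (not from the patent): cumulative field of view
# from several cameras, assuming their individual fields do not overlap.
def cumulative_fov(fovs_deg):
    """Total angular coverage in degrees, capped at a full circle."""
    return min(sum(fovs_deg), 360.0)
```

For the FIG. 7 arrangement, `cumulative_fov([180, 90])` gives the 270 degree combined coverage described in the text.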
  • The monitoring station 100 is modified to suit the configuration of the cameras 90 and 92. The monitoring station 100 comprises a larger curved display 102 showing a 180 degree view corresponding to the image captured by the forward facing camera 90. The operator or pilot (or someone serving both of these functions) 46 sits in front of this display 102 as shown. A smaller curved display 104 is placed above the larger curved display 102, so that both the front view image from the aerial vehicle as well as the back view image can be observed simultaneously without having to physically shift positions within the space 44 referenced above. Precedence in terms of image size will be given to the forward view transmitted, but at the same time, the operator 46 will have within her peripheral vision at least the view to the rear of the aerial vehicle, as captured by the rearward facing camera 92.
  • The monitoring station 100 includes a receiver 106 with antenna 108 for communicating with and receiving data streams or signals from the transmitter 94. The number of receivers 106 on the monitor station 100 will correspond with the number of transmitters on the aerial vehicle 16 to facilitate proper presentation of the video stream accurately at the monitor station 100. Instead of multiple streams, there may be a combined matrix or single stream, as described with reference to a previous embodiment.
  • Note that combination of flat panel viewing and curved panel viewing may be provided. Thus, for example, a larger curved display 102 may be used in conjunction with one or more smaller flat panel displays of the type illustrated above, to present a view suitable to the circumstances of the operation.
  • FIG. 8 shows a more detailed view of the aerial vehicle 16 as illustrated in FIG. 7 of the drawings, while FIG. 9 shows a more detailed perspective view of the monitor station 100, with a better view of the control panel used by the operator to manage and adjust the cameras as well as the direction of flight and orientation of the aerial vehicle.
  • FIG. 10 is a schematic representation of video goggles 120 which may be worn by a user or operator 122 for viewing the signals transmitted from the cameras or sensors. In a preferred form of the invention, the goggles 120 may have sensors 124 for determining the position of the head of the operator 122 and displaying the image (such as a forward facing, rearward facing or side facing image) according to the sensed position of the head. There may also be associated with the goggles 120 a receiver 126 with antenna 128 which will have a purpose and function as already described.
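The head-tracking behavior described for the goggles 120 (displaying a forward, rearward or side image according to the sensed head position) could be sketched as a simple yaw-to-feed selector. The 90 degree quadrant boundaries and feed names are assumptions for illustration; the patent does not specify them:

```python
# Illustrative sketch: choose a feed from the sensed head yaw reported by
# the goggles' position sensors. Quadrant boundaries and feed names are
# assumptions; the patent does not specify them.
def view_for_yaw(yaw_deg):
    """0 degrees = facing forward; yaw measured clockwise."""
    yaw = yaw_deg % 360.0
    if yaw < 45 or yaw >= 315:
        return "forward"
    if yaw < 135:
        return "right"
    if yaw < 225:
        return "rearward"
    return "left"
```

Turning the head thus switches seamlessly between the captured views, mimicking looking around from inside the aerial vehicle.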
  • Reference is now made to FIG. 11 of the drawings, which shows an unmanned aerial vehicle 160 in accordance with a further aspect of the invention. Note that this figure shows an unmanned aerial vehicle 160, but it is within the scope of the invention that the vehicle may also be manned or operated by an onboard pilot. The vehicle 160 may itself be similar to that which has been previously described, such as illustrated in FIG. 8 of the drawings. The vehicle 160 has a sensor attachment fitting 162 extending downwardly therefrom, and a circular sensor 164 fixed to the sensor attachment fitting 162. There is further provided a video transmitter 166, and a video antenna 168. It is to be noted that this figure illustrates a sensor which has 360 degree circular or near such circular viewing capabilities. The image captured is therefore of the 360 degree view surrounding the circular sensor 164. In other embodiments, the circular sensor may not necessarily capture images of the total 360 degree view, but may be less than this, such as, for example only, 270 degrees.
  • FIG. 12 of the drawings shows a monitoring station 170 which comprises a central area for accommodating the pilot 172 and the unmanned aerial vehicle piloting controller 174. A circular video display 176, mounted on legs or a frame 178, is positioned about the pilot 172. The video display 176 receives data through a video receiver antenna 180, and a video receiver 182, so that the image is displayed on the circular substantially continuous video display 176. It should be noted that the circular video display 176 would typically be one which receives data from an unmanned aerial vehicle having a circular sensor 164 as generally illustrated in FIG. 11 of the drawings.
  • FIG. 13 of the drawings shows an unmanned aerial vehicle 160 which has a sensor attachment fitting 162. A spherical sensor 190 is fixed to the sensor attachment fitting 162, and is able to capture images which are spherical in nature. Such spherical images would include substantially 360 degree views in both the vertical and horizontal orientations. The images so captured by the spherical sensor 190 are passed to a video transmitter 192 with accompanying video antenna 194, and are in turn transmitted to a monitoring station.
  • FIG. 14 of the drawings shows a monitoring station 200, partially cutaway to illustrate its interior features, having a structure and configuration which is adapted to display images which have been captured by the spherical sensor 190 illustrated in FIG. 13 of the drawings and transmitted to the monitoring station 200 through the video transmitter 192 and video antenna 194. The monitoring station 200 comprises a video display enclosure 202 of generally spherical shape with a video display screen 204 on the inside surface thereof. Inside the video display enclosure 202, a piloting controller 206 may be accommodated, and a pilot 207 sits on a seat 208 mounted on a seat base 210. The video display enclosure 202 is itself mounted on a supporting base 212, and has an enclosure access port 214 and ladder 216 for the convenience of the pilot 207. Associated with the video display enclosure 202 is a video receiver antenna 218 and video receiver 220. Data captured from the video antenna 194 by the receiver antenna 218 is processed through the video receiver 220 and displayed on the video display screen 204.
  • FIG. 15 of the drawings shows a further view of a monitoring station 230 in accordance with the present invention. The monitoring station 230, cutaway to show the interior, includes a video display enclosure 232 mounted on a base 234, and a video display screen 236 on the inside surface of the video display enclosure 232. The piloting controller 238 is positioned for convenient use by a pilot 240, who sits on a seat 242 supported on a platform 244.
  • FIG. 16 of the drawings shows another version of a monitoring station 250 which is generally substantially spherical, but is cutaway at its lower edge or portion. The monitoring station 250 is accessed through an enclosure access door 252, which forms an integral part of the substantially spherical video display enclosure 254. The video display screen 256 is preferably formed on the inside surface of the video display enclosure 254, including over the inside surface of the access door 252, so that when the door 252 is closed, a preferably continuous and substantially seamless image will be shown on the video display screen 256, including over the inside of the access door 252. Within the video display enclosure 254, a floor platform 260 is formed, upon which a seat 262 is mounted, the seat 262 being for the benefit of the pilot 264. The pilot 264 operates a controller 266, as previously described and illustrated. The video display enclosure 254 has a video downlink receiver 268 with an associated video downlink receiver antenna 270. The antenna 270 and receiver 268 receive captured images from an unmanned aerial vehicle, and through appropriate controllers convert these images to a live feed on the video display screen 256.
  • FIG. 17 of the drawings illustrates a system in accordance with the invention, which may have a spherical or circular image capturing mechanism, and a spherical or circular monitor. In this embodiment, there is a plurality of image sensors, transmitters, receivers, as well as monitor sections. In this figure, there is shown an unmanned aerial vehicle 360 having an attachment mechanism 362 depending therefrom, upon which a sensor housing 364 is mounted. The sensor housing 364 contains a plurality of sensors able to capture an image which is circular or spherical in nature. The sensor housing 364 has four transmitters 366 a to 366 d, each of the transmitters having an associated antenna 368 a to 368 d. These transmit the images captured within the sensor housing 364 to a monitoring station 370. The monitoring station 370 is configured to accommodate an operator 372 and control board 374. The monitoring station 370, in the present embodiment, has four sections, each of the four sections having a receiver 382 a to 382 d, each of these receivers having an associated antenna 380 a to 380 d. The transmitters 366 and antennae 368 transmit respective images to antennae 380 and receivers 382. Where there are four image capturing sensors in the sensor housing 364, four sections of the monitoring station 370 display an image corresponding to each of these four capturing sensors. Preferably, the four images are seamed together so as to provide a substantially continuous image around the operator 372, which according to the nature of the monitoring system may be circular, spherical, or a combination thereof. Although four image sensors are shown in FIG. 17, the invention is not limited to spherical or circular monitoring with four such sensors, and there may be any number of conveniently selected sensors and corresponding monitors.
Furthermore, the number of sensors and monitors need not be the same, with a particular transmitter transmitting its image to more than one monitor section, or vice versa.
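The seaming of the four received images into one substantially continuous surround image, as described for FIG. 17, might be sketched as joining equal-height image sections side by side. Representing images as lists of pixel rows is an illustrative simplification, not the patent's method:

```python
# Illustrative sketch (not from the patent): seam received image sections
# into one continuous strip, in monitor-section order. Each section is a
# list of equal-length pixel rows; rows at the same index are concatenated.
def seam(sections):
    """Join image sections side by side into one continuous strip."""
    height = len(sections[0])
    return [sum((s[row] for s in sections), []) for row in range(height)]
```

Seaming in the order of the monitor sections preserves the spatial continuity of the view around the operator.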
  • Throughout this description, the embodiments and examples shown should be considered as exemplars, rather than limitations on the apparatus and procedures disclosed or claimed. Although many of the examples presented herein involve specific combinations of method acts or system elements, it should be understood that those acts and those elements may be combined in other ways to accomplish the same objectives. Acts, elements and features discussed only in connection with one embodiment are not intended to be excluded from a similar role in other embodiments.
  • As used herein, “plurality” means two or more. As used herein, a “set” of items may include one or more of such items. As used herein, whether in the written description or the claims, the terms “comprising”, “including”, “carrying”, “having”, “containing”, “involving”, and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases “consisting of” and “consisting essentially of”, respectively, are closed or semi-closed transitional phrases with respect to claims. Use of ordinal terms such as “first”, “second”, “third”, etc., in the claims to modify a claim element does not by itself connote any priority, precedence, or order of one claim element over another or the temporal order in which acts of a method are performed, but are used merely as labels to distinguish one claim element having a certain name from another element having a same name (but for use of the ordinal term) to distinguish the claim elements. As used herein, “and/or” means that the listed items are alternatives, but the alternatives also include any combination of the listed items.

Claims (20)

1. An image capture and display system comprising:
an image capture sensor positioned on a vehicle, the image capture sensor comprising an arcuate continuous image sensing member, and a transmitter for transmitting images from the image capture sensor; and
a monitor display positioned remotely from the image capture sensor, the monitor display comprising a display monitor having an arcuate continuous image configuration substantially corresponding to that of the image capture sensor, and a receiver for receiving the images corresponding to the transmitter.
2. An image capture and display system as claimed in claim 1 wherein the vehicle is an unmanned aerial vehicle.
3. An image capture and display system as claimed in claim 2 wherein the monitor display is positioned at a ground based location.
4. An image capture and display system as claimed in claim 1 wherein the image capture sensor cumulatively captures less than a 360 degree field of view.
5. An image capture and display system as claimed in claim 1 wherein the image capture sensor comprises cameras for capturing video images in the visual spectrum.
6. An image capture and display system as claimed in claim 1 wherein the image capture sensor comprises infrared cameras for capturing images in the infrared spectrum.
7. An image capture and display system as claimed in claim 1 wherein the image capture sensor comprises night vision sensors for capturing video images in the visual spectrum.
8. An image capture and display system as claimed in claim 1 wherein the image capture sensor is selected from the group consisting of: heat sensors, radiation sensors, pressure sensors, gaseous content sensors, humidity sensors, sonic sensors and EMF spectrometers.
9. An image capture and display system as claimed in claim 1 comprising image capture sensors of different types so that the system can simultaneously capture images of different conditions.
10. An image capture and display system as claimed in claim 1 wherein the monitor display array defines an internal space capable of accommodating an operator of the system.
11. An image capture and display system as claimed in claim 1 wherein the image capture sensor comprises a frame member attached to the vehicle configured for mounting of the image sensing members thereon.
12. An image capture and display system as claimed in claim 1 wherein the vehicle is selected from the group consisting of: manned aerial vehicle, glider, ground-based vehicle, water-based vehicle, and space vehicle.
13. An image capture and display system as claimed in claim 1 wherein the image capture sensor is located outside of an aerial vehicle and the monitor display array is located inside the same aerial vehicle.
14. An image capture and display system comprising:
an image capture sensor for positioning on a vehicle, and at least one transmitter for transmitting images from the image capture sensor; and
a monitor display array for positioning remotely from the image capture sensor, the monitor display array comprising a display monitor for and corresponding to the image capture sensor, and a receiver for receiving the images corresponding to each transmitter.
15. An image capture and display system as claimed in claim 1 wherein the image capture sensor comprises a generally circular sensor for capturing substantially 360 degree images.
16. An image capture and display system as claimed in claim 15 wherein the display monitor is substantially circular in shape and displays the images in a substantially continuous 360 degree image.
17. An image capture and display system as claimed in claim 1 wherein the image capture sensor comprises a generally spherical or partially spherical sensor for capturing substantially spherical images.
18. An image capture and display system as claimed in claim 17 wherein the display monitor is substantially spherical or partially spherical in shape and displays the images in a substantially continuous 360 degree image.
19. An image capture and display system as claimed in claim 1 wherein a plurality of image sensing members are provided to form the arcuate continuous image.
20. An image capture and display system as claimed in claim 1 wherein a plurality of display monitors are provided which display the arcuate image.
US14/036,669 2012-03-20 2013-09-25 Image monitoring and display from unmanned vehicle Abandoned US20140327733A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US201261685539P true 2012-03-20 2012-03-20
US13/847,161 US9350954B2 (en) 2012-03-20 2013-03-19 Image monitoring and display from unmanned vehicle
US14/036,669 US20140327733A1 (en) 2012-03-20 2013-09-25 Image monitoring and display from unmanned vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14/036,669 US20140327733A1 (en) 2012-03-20 2013-09-25 Image monitoring and display from unmanned vehicle
US15/064,533 US9533760B1 (en) 2012-03-20 2016-03-08 Image monitoring and display from unmanned vehicle

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US13/847,161 Continuation-In-Part US9350954B2 (en) 2012-03-20 2013-03-19 Image monitoring and display from unmanned vehicle
US13/847,161 Continuation US9350954B2 (en) 2012-03-20 2013-03-19 Image monitoring and display from unmanned vehicle

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/064,533 Continuation US9533760B1 (en) 2012-03-20 2016-03-08 Image monitoring and display from unmanned vehicle

Publications (1)

Publication Number Publication Date
US20140327733A1 true US20140327733A1 (en) 2014-11-06

Family

ID=51841238

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/036,669 Abandoned US20140327733A1 (en) 2012-03-20 2013-09-25 Image monitoring and display from unmanned vehicle
US15/064,533 Active US9533760B1 (en) 2012-03-20 2016-03-08 Image monitoring and display from unmanned vehicle

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/064,533 Active US9533760B1 (en) 2012-03-20 2016-03-08 Image monitoring and display from unmanned vehicle

Country Status (1)

Country Link
US (2) US20140327733A1 (en)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160080702A1 (en) * 2014-09-11 2016-03-17 Gabriel Shachor Systems and methods for controlling multiple aerial units
EP3328731A4 (en) * 2015-07-28 2018-07-18 Margolin, Joshua Multi-rotor uav flight control method and system
JP6283425B2 * 2015-09-11 2018-02-21 SZ DJI Osmo Technology Co., Ltd. Unmanned aircraft
US20170255824A1 (en) * 2016-03-04 2017-09-07 General Electric Company Aerial camera system and method for identifying route-related hazards
FI20175127L (en) * 2017-02-15 2018-08-16 Rolls Royce Oy Ab Shore operation centre workstation

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4772942A (en) * 1986-01-11 1988-09-20 Pilkington P.E. Limited Display system having wide field of view
US5034759A (en) * 1989-11-28 1991-07-23 Ronald Watson Photo device
US5130794A (en) * 1990-03-29 1992-07-14 Ritchey Kurtis J Panoramic display system
US5240207A (en) * 1992-08-03 1993-08-31 The United States Of America As Represented By The Secretary Of The Navy Generic drone control system
US20020196339A1 (en) * 2001-03-13 2002-12-26 Andrew Heafitz Panoramic aerial imaging device
US20040066449A1 (en) * 2000-11-29 2004-04-08 Dor Givon System and method for spherical stereoscopic photographing
US20060083501A1 (en) * 2004-10-18 2006-04-20 Mark Segal Method and apparatus for creating aerial panoramic photography
US20060200382A1 (en) * 2005-03-03 2006-09-07 Arutunian Ethan B Notifications using enhanced map-based imagery
US20070088709A1 * 2005-10-03 2007-04-19 Bechtel Corporation System and Methods for Integrating Data Into a Network Planning Tool
US20070096446A1 (en) * 1995-06-07 2007-05-03 Automotive Technologies International, Inc. Airbag Deployment Control Based on Contact with Occupant
US20070268155A1 (en) * 2006-05-22 2007-11-22 Phelps Dodge Corporation Position tracking and proximity warning system
US20080043020A1 (en) * 2006-08-18 2008-02-21 Microsoft Corporation User interface for viewing street side imagery
US20090015674A1 (en) * 2006-04-28 2009-01-15 Kevin Alley Optical imaging system for unmanned aerial vehicle
US20090045290A1 (en) * 2007-08-13 2009-02-19 James Small Method and system for inflight refueling of unmanned aerial vehicles
US20090232415A1 (en) * 2008-03-13 2009-09-17 Microsoft Corporation Platform for the production of seamless orthographic imagery
US20100329542A1 (en) * 2009-06-30 2010-12-30 Srikumar Ramalingam Method for Determining a Location From Images Acquired of an Environment with an Omni-Directional Camera
US20110064312A1 (en) * 2009-09-14 2011-03-17 Janky James M Image-based georeferencing
US20110169945A1 (en) * 2008-12-19 2011-07-14 Saab Ab Method and arrangement for estimating at least one parameter of an intruder
US20110249100A1 (en) * 2010-04-09 2011-10-13 Sankar Jayaram Apparatus and Method for Capturing Images
US20120033851A1 (en) * 2010-04-22 2012-02-09 Shen-En Chen Spatially integrated aerial photography for bridge, structure, and environmental monitoring

Family Cites Families (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4355328A (en) 1981-02-23 1982-10-19 The United States Of America As Represented By The Secretary Of The Navy 360 Degree closed circuit television system
US5103306A (en) 1990-03-28 1992-04-07 Transitions Research Corporation Digital image compression employing a resolution gradient
FR2725102B1 1994-09-27 1996-12-13 M5 Soc Device for remote video monitoring of machines, in particular vehicles, and device for implementing it
US6781606B2 (en) 1999-05-20 2004-08-24 Hewlett-Packard Development Company, L.P. System and method for displaying images using foveal video
US6954182B2 (en) * 2003-01-17 2005-10-11 The Insitu Group, Inc. Conductive structures including aircraft antennae and associated methods of formation
US8102423B2 (en) 2004-08-10 2012-01-24 Sri International Method and system for performing adaptive image acquisition
US20060146132A1 (en) 2005-01-05 2006-07-06 Hy Mayerson Video system having multiple video cameras for capturing events
US9654200B2 (en) * 2005-07-18 2017-05-16 Mutualink, Inc. System and method for dynamic wireless aerial mesh network
US9270976B2 (en) 2005-11-02 2016-02-23 Exelis Inc. Multi-user stereoscopic 3-D panoramic vision system and method
US7826839B1 (en) * 2006-01-30 2010-11-02 Rockwell Collins, Inc. Communication system to facilitate airborne electronic attack
EP2036043A2 (en) 2006-06-26 2009-03-18 Lockheed Martin Corporation Method and system for providing a perspective view image by intelligent fusion of a plurality of sensor data
JP4791297B2 (en) 2006-08-31 2011-10-12 シャープ株式会社 Monitoring system
EP2100454A4 (en) * 2006-11-20 2017-07-26 Axis AB Wireless network camera systems
US20120229596A1 (en) 2007-03-16 2012-09-13 Michael Kenneth Rose Panoramic Imaging and Display System With Intelligent Driver's Viewer
US8098283B2 (en) * 2007-08-01 2012-01-17 Shaka Ramsay Methods, systems, and computer program products for implementing a personalized, image capture and display system
IL189251D0 (en) 2008-02-05 2008-11-03 Ehud Gal A manned mobile platforms interactive virtual window vision system
US8687111B2 (en) 2008-05-12 2014-04-01 Flir Systems, Inc. Optical payload with folded telescope and cryocooler
US9196168B2 (en) * 2009-05-20 2015-11-24 Textron Innovations Inc. Collision avoidance and warning system
US20100302359A1 (en) 2009-06-01 2010-12-02 Honeywell International Inc. Unmanned Aerial Vehicle Communication
WO2011034645A1 (en) 2009-06-18 2011-03-24 Drs Test & Energy Management, Llc System and method for 360 degree situational awareness in a mobile environment
IL201682D0 (en) 2009-10-22 2010-11-30 Bluebird Aero Systems Ltd Imaging system for uav
US8400511B2 (en) 2009-12-04 2013-03-19 Lockheed Martin Corporation Optical detection and ranging sensor system for sense and avoid, and related methods
US20110134209A1 (en) 2009-12-04 2011-06-09 Eric Schmidt Virtual Structure
US8494760B2 (en) 2009-12-14 2013-07-23 American Aerospace Advisors, Inc. Airborne widefield airspace imaging and monitoring
US20110234796A1 (en) * 2010-03-29 2011-09-29 Raytheon Company System and Method for Automatically Merging Imagery to Provide Enhanced Situational Awareness
US20120043411A1 (en) * 2010-06-01 2012-02-23 L2 Aerospace Unmanned aerial vehicle system
US20110291918A1 (en) 2010-06-01 2011-12-01 Raytheon Company Enhancing Vision Using An Array Of Sensor Modules
EP2423871B1 (en) * 2010-08-25 2014-06-18 Lakeside Labs GmbH Apparatus and method for generating an overview image of a plurality of images using an accuracy information
EP2423873B1 (en) * 2010-08-25 2013-12-11 Lakeside Labs GmbH Apparatus and Method for Generating an Overview Image of a Plurality of Images Using a Reference Plane
US20130222590A1 (en) 2012-02-27 2013-08-29 Honeywell International Inc. Methods and apparatus for dynamically simulating a remote audiovisual environment
US20140327733A1 (en) 2012-03-20 2014-11-06 David Wagreich Image monitoring and display from unmanned vehicle
US9119061B2 (en) * 2012-03-20 2015-08-25 Farrokh Mohamadi Integrated wafer scale, high data rate, wireless repeater placed on fixed or mobile elevated platforms
US9350954B2 (en) 2012-03-20 2016-05-24 Crane-Cohasset Holdings, Llc Image monitoring and display from unmanned vehicle
US8930044B1 * 2012-12-28 2015-01-06 Google Inc. Multi-part navigation process by an unmanned aerial vehicle for navigating to a medical situation
JP2016514381A * 2013-01-22 2016-05-19 Eden Rock Communications, LLC Method and system for an intelligent jamming signal generator
US9213333B2 (en) 2013-06-06 2015-12-15 Caterpillar Inc. Remote operator station
US8903568B1 (en) 2013-07-31 2014-12-02 SZ DJI Technology Co., Ltd Remote control method and terminal
US9440750B2 (en) 2014-06-20 2016-09-13 nearmap australia pty ltd. Wide-area aerial camera systems
US9185290B1 (en) 2014-06-20 2015-11-10 Nearmap Australia Pty Ltd Wide-area aerial camera systems
USD756842S1 (en) * 2014-08-21 2016-05-24 Javad Gnss, Inc. Unmanned aerial drone
US20160124435A1 (en) * 2014-10-29 2016-05-05 Lyle Thompson 3d scanning and imaging method utilizing a self-actuating compact unmanned aerial device

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9413956B2 (en) 2006-11-09 2016-08-09 Innovative Signal Analysis, Inc. System for extending a field-of-view of an image acquisition device
US9430923B2 (en) 2009-11-30 2016-08-30 Innovative Signal Analysis, Inc. Moving object detection, tracking, and displaying systems
US9350954B2 (en) * 2012-03-20 2016-05-24 Crane-Cohasset Holdings, Llc Image monitoring and display from unmanned vehicle
US20140327770A1 (en) * 2012-03-20 2014-11-06 David Wagreich Image monitoring and display from unmanned vehicle
US9533760B1 (en) 2012-03-20 2017-01-03 Crane-Cohasset Holdings, Llc Image monitoring and display from unmanned vehicle
US10024033B2 (en) 2013-11-25 2018-07-17 Esco Corporation Wear part monitoring
US20150210387A1 (en) * 2014-01-24 2015-07-30 Maxlinear, Inc. First-Person Viewer for Unmanned Vehicles
US10077109B2 (en) * 2014-01-24 2018-09-18 Maxlinear, Inc. First-person viewer for unmanned vehicles
US20160054733A1 (en) * 2014-08-22 2016-02-25 Innovative Signal Analysis, Inc. Video enabled inspection using unmanned aerial vehicles
US10139819B2 (en) * 2014-08-22 2018-11-27 Innovative Signal Analysis, Inc. Video enabled inspection using unmanned aerial vehicles
US20160252438A1 (en) * 2014-09-09 2016-09-01 Nanjing University Of Aeronautics And Astronautics Method for locating impact area of composite structure based on energy weighted factor
WO2016131007A1 (en) * 2015-02-13 2016-08-18 Esco Corporation Monitoring ground-engaging products for earth working equipment
US10011975B2 (en) 2015-02-13 2018-07-03 Esco Corporation Monitoring ground-engaging products for earth working equipment
EP3256650A4 (en) * 2015-02-13 2019-02-20 Esco Group Llc Monitoring ground-engaging products for earth working equipment
US20170012697A1 (en) * 2015-04-10 2017-01-12 SZ DJI Technology Co., Ltd Method, apparatus and system of providing communication coverage to an unmanned aerial vehicle
US10038492B2 (en) * 2015-04-10 2018-07-31 SZ DJI Technology Co., Ltd Method, apparatus and system of providing communication coverage to an unmanned aerial vehicle
US20170032175A1 (en) * 2015-07-31 2017-02-02 Hon Hai Precision Industry Co., Ltd. Unmanned aerial vehicle detection method and unmanned aerial vehicle using same
US9824275B2 (en) * 2015-07-31 2017-11-21 Hon Hai Precision Industry Co., Ltd. Unmanned aerial vehicle detection method and unmanned aerial vehicle using same
TWI556198B (en) * 2015-09-11 2016-11-01 Geosat Aerospace & Technology Inc Positioning and directing data analysis system and method thereof
USD822746S1 (en) * 2016-02-05 2018-07-10 Durst Sebring Revolution, Llc Photo booth
CN105828049A (en) * 2016-05-20 2016-08-03 国网山东省电力公司日照供电公司 Intelligent remote viewing system of transformer substation
DE102016013971A1 2016-11-23 2017-05-24 Daimler Ag Method for the remote-controlled operation of a motor vehicle, and system for the remote-controlled operation of a motor vehicle
US20180002036A1 (en) * 2016-12-26 2018-01-04 Haoxiang Electric Energy (Kunshan) Co., Ltd. Obstacle avoidance device
US10259593B2 (en) * 2016-12-26 2019-04-16 Haoxiang Electric Energy (Kunshan) Co., Ltd. Obstacle avoidance device
WO2018130888A1 (en) * 2017-01-12 2018-07-19 Tekcem Antenna comprising an unmanned aircraft
FR3061807A1 (en) * 2017-01-12 2018-07-13 Tekcem Antenna comprising an unmanned aircraft
WO2018150095A1 (en) * 2017-02-15 2018-08-23 Rolls-Royce Oy Ab Remote operation centre for monitoring a vessel

Also Published As

Publication number Publication date
US9533760B1 (en) 2017-01-03

Similar Documents

Publication Publication Date Title
EP2959352B1 (en) Remote control method and terminal
US9270976B2 (en) Multi-user stereoscopic 3-D panoramic vision system and method
US9280038B1 (en) Interchangeable mounting platform
EP2976687B1 (en) Systems and methods for uav docking
US9927812B2 (en) Remote control method and terminal
US8269893B2 (en) Optical payload electrical system
US8643719B2 (en) Traffic and security monitoring system and method
EP2089677B1 (en) Methods, apparatus and systems for enhanced synthetic vision and multi-sensor data fusion to improve operational capabilities of unmanned aerial vehicles
US9456185B2 (en) Helicopter
US7000883B2 (en) Method and apparatus for stabilizing payloads, including airborne cameras
US20100228406A1 (en) UAV Flight Control Method And System
US8543265B2 (en) Systems and methods for unmanned aerial vehicle navigation
US9616998B2 (en) Unmanned aerial vehicle/unmanned aircraft system
US20100179691A1 (en) Robotic Platform
US20180323862A1 (en) Method, apparatus and system of providing communication coverage to an unmanned aerial vehicle
US6909381B2 (en) Aircraft collision avoidance system
US4805015A (en) Airborne stereoscopic imaging system
EP2167920B1 (en) Aircraft landing assistance
US9765926B2 (en) Systems and methods for payload stabilization
US20160063987A1 (en) Unmanned aerial vehicle (uav) for collecting audio data
US6626398B1 (en) Unmanned biplane for airborne reconnaissance and surveillance having staggered and gapped wings
US20130073775A1 (en) Systems and methods for image stream processing
JP5349055B2 (en) Multi-lens array system and method
DK2997768T3 (en) Adaptive communication mode switching
US7280134B1 (en) Landscape camera system with electronic field of view switching

Legal Events

Date Code Title Description
AS Assignment

Owner name: CRANE-COHASSET HOLDINGS, LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WAGREICH, DAVID;REEL/FRAME:033671/0742

Effective date: 20140903

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION