GB2551682A - Drone based accident, traffic surveillance - Google Patents

Drone based accident, traffic surveillance

Info

Publication number
GB2551682A
Authority
GB
United Kingdom
Prior art keywords
drone
camera
images
information
operator
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1419371.8A
Other versions
GB201419371D0 (en)
Inventor
Hiebl Johann
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Continental Automotive Systems Inc
Original Assignee
Continental Automotive Systems Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Automotive Systems Inc filed Critical Continental Automotive Systems Inc
Publication of GB201419371D0
Publication of GB2551682A
Status: Withdrawn

Classifications

    • G08G 1/096775 - Systems involving transmission of highway information, e.g. weather, speed limits, where the origin of the information is a central station
    • G06V 20/17 - Terrestrial scenes taken from planes or by drones
    • B60W 30/08 - Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W 30/14 - Adaptive cruise control
    • G06V 20/176 - Urban or other man-made structures
    • G06V 20/54 - Surveillance or monitoring of activities of traffic, e.g. cars on the road, trains or boats
    • G08G 1/01 - Detecting movement of traffic to be counted or controlled
    • G08G 1/0116 - Measuring and analyzing of parameters relative to traffic conditions based on data from roadside infrastructure, e.g. beacons
    • G08G 1/012 - Measuring and analyzing of parameters relative to traffic conditions based on data from sources other than vehicle or roadside beacons, e.g. mobile networks
    • G08G 1/0133 - Traffic data processing for classifying traffic situation
    • G08G 1/0145 - Measuring and analyzing of parameters relative to traffic conditions for active traffic flow control
    • G08G 1/091 - Traffic information broadcasting
    • G08G 1/096725 - Systems involving transmission of highway information where the received information generates an automatic action on the vehicle control
    • G08G 1/096741 - Systems involving transmission of highway information where the source of the transmitted information selects which information to transmit to each vehicle
    • G08G 1/096791 - Systems involving transmission of highway information where the origin of the information is another vehicle
    • G08G 1/164 - Anti-collision systems; centralised systems, e.g. external to vehicles
    • G08G 1/205 - Monitoring the location of vehicles belonging to a group; indicating the location of the monitored vehicles as destination, e.g. accidents, stolen, rental
    • B60W 2420/403 - Image sensing, e.g. optical camera
    • B60W 2556/45 - External transmission of data to or from the vehicle
    • B64U 2101/31 - UAVs specially adapted for imaging, photography or videography for surveillance
    • B64U 2201/20 - UAVs characterised by their flight controls; remote controls
    • B64U 50/37 - Charging when not in flight

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Atmospheric Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

A drone 200 with a camera 202 captures images of a roadway 302 beneath it and wirelessly sends live images to an analysis centre 400; an operator 420 analyses the real-time images 308 and sends out information over a cellular network 327. The information may be a live image 308 taken by the drone and selected by the operator, or may be conveyed over a two-way audio connection. The drone may comprise a network access device 204 through which the analysis centre 400 accesses the camera's images. The drone may communicate with the analysis centre over radio frequency (RF) signals, preferably using a cellular network. The information provided by the analysis centre may be provided to a vehicle 326, which may alter its route based on that information. The information may be sent to subscribers who have paid to receive it. Also disclosed is a pole-mounted hangar for the drone (figure 5).

Description

(54) Title of the Invention: Drone based accident, traffic surveillance
Abstract Title: Capturing images of a roadway using a drone, transmitting them to an analysis centre and sending out traffic information based on the drone images.
(57) A drone 200 with a camera 202 captures images of a roadway 302 beneath it and wirelessly sends live images to an analysis centre 400; an operator 420 analyses the real-time images 308 and sends out information over a cellular network 327. The information may be a live image 308 taken by the drone and selected by the operator, or may be conveyed over a two-way audio connection. The drone may comprise a network access device 204 through which the analysis centre 400 accesses the camera's images. The drone may communicate with the analysis centre over radio frequency (RF) signals, preferably using a cellular network. The information provided by the analysis centre may be provided to a vehicle 326, which may alter its route based on that information. The information may be sent to subscribers who have paid to receive it. Also disclosed is a pole-mounted hangar for the drone (figure 5).
[Drawing sheets 1/6 to 6/6 of GB2551682A: FIGS. 1 to 6]
DRONE-BASED ACCIDENT AND TRAFFIC SURVEILLANCE
BACKGROUND
Traffic congestion has been and will continue to be a bane of drivers everywhere. Avoiding traffic congestion will thus continue to be important to drivers.
Some commercial broadcast radio stations in various metropolitan areas provide traffic bulletins but such traffic reports usually pertain to major highways or roadways located in or around major metropolitan areas. Such traffic reports are also usually provided by piloted aircraft, which are costly to maintain, operate and store. And, since they are usually broadcast as part of a radio station’s programming, traffic reports broadcast by commercial radio stations are sometimes untimely. An apparatus and method for providing real-time roadway and terrain surveillance inexpensively along any part of a roadway on an as-needed basis would be an improvement over the prior art.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 depicts a system for providing remote observation of terrain including roadways using a battery-powered pilotless drone aircraft;
FIG. 2 is a block diagram of a battery-powered pilotless drone aircraft;
FIG. 3 depicts the flow of image information from a pilotless drone aircraft to an image analysis center and to an end-user/subscriber of a real-time surveillance service;
FIG. 4 is a block diagram of one embodiment of an image analysis center, which receives image information from a drone aircraft, evaluates the imagery and distributes reports to service subscribers;
FIG. 5 depicts a pole-mounted hangar in which a pilotless drone aircraft can be stored and charged until needed; and
FIG. 6 is a flow chart that depicts steps of a method for providing remote observation of a roadway or other terrain using a battery-powered pilotless aircraft.
DETAILED DESCRIPTION
As used herein, the term "real time" refers to the actual time during which something takes place or the actual time required for an event to occur. FIG. 1 depicts a system 100 for providing remote observation of terrain, including roadways, in real time or near-real time, using a battery-powered pilotless drone aircraft. The system 100 comprises essentially three major subsystems, described hereinafter.
A first subsystem is of course a battery-powered, pilotless drone aircraft 200, preferably embodied as a battery-powered pilotless helicopter having a plurality of rotors 210 and which is provided with a high-resolution camera 202 directed downwardly. Using on-board global positioning system (GPS) navigation and a software-controlled processor, not shown in FIG. 1, the aircraft 200 is preferably configured to fly a predetermined route 300 over or along a roadway 302 or other terrain 304 and continuously capture high-resolution/high-definition images 308 of the route 300, roadway 302 or terrain 304 as it flies or hovers. The camera 202 is a digital camera, well known in the art; it generates data representing each picture element (pixel) of a captured image.
Data 309 representing images 308 captured by the on-board camera 202 is sent in real time to a network access device 204. That image data 309 is transmitted by the network access device 204 in real time to a communications network 350. The network access device 204 is essentially a cell phone configured to enable the transmission and reception of data by the aircraft 200, in the same way that similar network access devices used in vehicles provide telematics services to cars and trucks; OnStar® is an example of such a telematics system.
Image data 309, i.e., the data representing captured images, is transmitted from the network access device 204 to an image analysis center 400 via the network 350. The image analysis center 400 is a facility where images are reproduced from the image data 309 and displayed to a person or provided to a computer. A person or a computer evaluates the images and the information contained therein. The evaluations are transmitted from the image analysis center to subscribers, i.e., persons or entities who pay for real-time overhead surveillance. In a second embodiment, the image data 309 is also sent to a subscriber or entity so that the subscriber can also see the images 308 captured by the aircraft 200.
The wireless communications network 350 is another subsystem, preferably embodied as a prior art, terrestrial cellular telephone network 350. Such communications networks are well known and provide the aforementioned vehicle telematics services. The network 350 comprises multiple cell sites 352, 354, 356 located in different geographic areas. They are coupled to the image analysis center 400 so that image data 309 can be sent to the center 400 from any location that is within a cellular communications coverage area.
Put simply, the network 350 provides real-time or near real-time, two-way data communications between an image analysis center 400 and the drone aircraft 200; two-way data communications between the image analysis center 400 and a remote, pole-mounted hangar 500 described below; and two-way data communications between the image analysis center 400 and a subscriber's cell phone, typically in a motor vehicle 600, which is depicted in FIG. 1 as traveling over a roadway 302. The image analysis center 400 is thus a central control center for the system 100.
The image analysis center 400 is essentially a facility that receives image data 309, re-constructs the captured images 308 from the data 309 and provides an evaluation of information in the images or a computer-generated analysis of the images 308. The image evaluations/analyses are transmitted in real time or near real time from a network access device 418 at the image analysis center 400 (not shown in FIG. 1) to cell phones or other wireless communications devices of individuals and/or entities who subscribe to, i.e., pay for, real-time overhead surveillance provided by the drone aircraft 200.
In one alternate embodiment, the image analysis center 400 transmits image data 309 that it receives from the drone aircraft 200 to subscriber cell phones via the network 350 in real time or nearly real time. A program on a user's cell phone, commonly referred to as an "app," reconstructs the image 308 captured by the camera on the aircraft 200 from the image data 309 received from the image analysis center 400 and displays the image 308 on the user's cell phone display device. The image analysis center 400 thus stores and forwards image data 309 to users' cell phones.
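The patent does not say how frames are encoded for this store-and-forward path. As a minimal app-side sketch, assuming the relayed image data 309 arrives as JPEG bytes and that the Pillow imaging library is available on the device, reconstruction could look like this; `show_received_frame` is a hypothetical name.

```python
import io
from PIL import Image  # Pillow; assumed available on the receiving device

def show_received_frame(image_bytes: bytes) -> None:
    """Reconstruct an image 308 from relayed image data 309 and display it."""
    frame = Image.open(io.BytesIO(image_bytes))  # decode the received bytes
    frame.show()                                 # hand the frame to the display
```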
In yet another embodiment, the image analysis center 400 transmits image data 309 that it receives from the drone aircraft 200 to subscriber cell phones via the network 350 in real time or nearly real time and provides a two-way audio connection with a person at the image analysis center 400 who can advise a service subscriber or receive requests from a subscriber similar to how existing telematics systems can provide operator assistance.
The system 100 shown in FIG. 1 also preferably includes one or more hangars 500, preferably mounted atop a pole or tower 502 to deter vandalism and also to extend the aircraft's battery life. Parking the aircraft 200 atop a pole or tower 502 eliminates energy that would otherwise be required to lift the aircraft 200 from the ground to a cruising altitude. The hangar 500 is referred to herein as a pole-mounted hangar 500.
Since the hangars 500 are exposed to the elements, including wind, they are preferably shaped with a smooth, hemispherical cover 506 to reduce wind loading on the pole or tower 502 on which a hangar is supported. And, since the aircraft 200 requires a downwardly-directed air current, the cover 506 of each hangar 500 preferably opens completely, as shown in FIG. 1, in order to provide a wide, flat surface 508 against which air current from the aircraft rotors can be directed.
The aircraft 200 is preferably a helicopter, which is considered herein to be an aircraft whose lift is derived from aerodynamic forces acting on one or more powered rotors turning about substantially vertical axes. As is well known, a helicopter can move vertically but it can also move laterally by tilting one or more of the vertical axes about which the rotors turn.
FIG. 2 is a detailed block diagram of a battery-powered pilotless drone aircraft (aircraft) 200, preferably embodied as a helicopter. As stated above, the aircraft 200 comprises a digital camera 202 and a network access device 204, which is also provided with an antenna 206. It also comprises two or more rotors 210, shown in FIG. 1 but omitted from FIG. 2, which are powered by D.C. motors. Since D.C. motors and their control are well known in the art, they are omitted from FIG. 1 and FIG. 2 for clarity.
In addition to the camera 202 and network access device 204, the aircraft 200 comprises a processor 212, which is coupled to a non-transitory memory device 214, such as an electrically erasable programmable read-only memory (EEPROM) and/or random access memory (RAM). The memory device 214 stores program instructions for the processor 212.
The term “bus” refers to a set of parallel electrical conductors in a computer system that forms a main transmission path. A typical bus includes address, data and control lines.
The memory 214 and processor 212 are coupled to each other by a conventional address/data/control bus 216. When instructions stored in the memory 214 are executed by the processor 212, they cause the processor 212 to send and receive various electrical signals over bus 216 by which the processor controls the aircraft 200. Operations pertinent to this disclosure are described hereinafter.
In addition to the camera 202 and network access device (NAD) 204, other devices coupled to the bus 216 and controlled by the processor include a global positioning system (GPS) 218, rotor drive motor controllers 220, at least part of a battery charger 226 coupled to a battery 224 that provides power to the aircraft 200, a public address system 222 and an infrared detector 270.
As stated above, the processor 212 sends and receives signals over the bus 216 to control the aircraft and other peripheral devices it carries. By way of example, program instructions stored in memory 214 cause the processor 212 to obtain a current location from the GPS 218, obtain a destination or route from memory 214 and adjust the speed and inclination angles of the rotors 210 to direct the aircraft 200 to the destination or over the route. The rotor controllers 220 as well as the other peripheral devices (GPS, network access device, public address system, charger and IR detector) are thus responsive to commands issued to each of them by the processor 212 via the bus 216.
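The patent describes this GPS-to-rotor control flow only at block-diagram level. As a purely illustrative sketch, the Python below shows one way such a waypoint loop could be organised; `StubGps`, `Rotors` and `fly_route` are invented names, the toy simulation stands in for real flight dynamics, and none of these details come from the patent itself.

```python
import math

REACHED_M = 10.0  # distance at which a waypoint counts as reached (assumed)

def dist_m(a, b):
    # Rough planar distance in metres between two (lat, lon) fixes;
    # adequate over the short distances simulated here.
    dlat = (b[0] - a[0]) * 111_000.0
    dlon = (b[1] - a[1]) * 111_000.0 * math.cos(math.radians(a[0]))
    return math.hypot(dlat, dlon)

class StubGps:
    """Stand-in for the GPS 218; the simulated fix is advanced by Rotors."""
    def __init__(self, lat, lon):
        self.fix = (lat, lon)
    def position(self):
        return self.fix

class Rotors:
    """Stand-in for the rotor drive motor controllers 220."""
    def __init__(self, gps):
        self.gps = gps
    def step_toward(self, wp, step_m=5.0):
        # One control tick: move roughly step_m metres toward the waypoint,
        # standing in for real speed and inclination adjustments.
        lat, lon = self.gps.fix
        d = dist_m(self.gps.fix, wp)
        f = min(1.0, step_m / d) if d else 0.0
        self.gps.fix = (lat + (wp[0] - lat) * f, lon + (wp[1] - lon) * f)

def fly_route(gps, rotors, waypoints):
    # The route would be read from the memory device 214 before take-off.
    for wp in waypoints:
        while dist_m(gps.position(), wp) > REACHED_M:
            rotors.step_toward(wp)
        print(f"reached waypoint {wp}")

if __name__ == "__main__":
    gps = StubGps(48.137, 11.575)
    fly_route(gps, Rotors(gps), [(48.140, 11.575), (48.140, 11.580)])
```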
As used herein, “predetermined” means specified or identified in advance. In a preferred embodiment, the aircraft 200 preferably flies a predetermined route, along a predetermined road 302 or over some predetermined terrain. Geographic coordinates corresponding to the predetermined route, road or terrain are thus stored in the memory device 214 prior to the aircraft 200 taking off. In an alternate embodiment, however, geographic coordinates for a new route, a different road or a different area can be transmitted to the aircraft 200 via the network 350, loaded into the memory device 214 by the processor 212 and used by the processor 212 and GPS 218 to reroute/re-purpose the aircraft in real time or nearly real-time.
Program instructions in the memory device 214 cause the processor 212 to obtain locations from the global positioning system 218 and upon receipt of a location there from, provide instructions to the rotor drive motor controller 220 that cause the aircraft to fly to a particular location or traverse a particular route determined by route information stored in the memory device 214.
In a preferred embodiment, the aircraft 200 can also be flown manually by a human pilot at a remote location, preferably the image analysis center 400. In such an embodiment, human flight control of the aircraft 200 is provided by relaying flight information between the pilot at the remote location and the aircraft 200 in real time via the network 350. Flight control data is exchanged between the aircraft 200 and a remote "cockpit" located at the image analysis center 400 through the NAD 204, similar to how a vehicle telematics system issues commands to an engine control unit (ECU) of a motor vehicle. In order to enable remote piloting of the aircraft 200, some of the instructions stored in the memory device 214 require the processor 212 to receive flight commands from a pilot at a remote location via the network access device 204, to which the processor 212 is also coupled via the bus 216.
Still referring to FIG. 2, the camera 202 on the aircraft 200 sends image data 309 to the network access device 204 in real time via a dedicated connection 221 extending between the camera 202 and network access device 204. Image data 309 from the camera 202 thus does not have to pass through the processor 212 on its way to the network access device 204.
Image data 309 sent to the network access device 204 is preferably transmitted by the network access device 204 to the image analysis center 400 but can optionally be stored in a non-transitory memory device such as the memory device 214 from which the processor 212 obtains executable instructions.
The network access device 204 is preferably a network access device used in prior art vehicle telematics systems. Such devices are well known to those of ordinary skill in the art to be capable of sending and receiving data at rates that allow for continuous or near continuous transmission of image data 309 obtained from the camera 202.
In addition to providing a network connection for the camera 202, the network access device 204 also provides data exchanges that enable a voice connection to a public address system 222, embodied as a loudspeaker 232 and a directional, noise-cancelling microphone 234, both of which are also carried by the aircraft 200. The public address system 222 is coupled directly to the network access device 204, also making it unnecessary to transfer audio-frequency information through the processor 212.
The directional, noise-cancelling microphone 234 enables an operator at the image analysis center 400 to "listen" to audio signals below the aircraft 200. The loudspeaker 232 enables the aircraft 200 to broadcast audio to areas below the drone. The public address system 222 thus provides a two-way, speaker-phone-like audio communication capability to people who are within range of the microphone 234 and the loudspeaker 232. When combined with the functionality of the camera 202 and the aircraft's ability to hover, the directional noise-cancelling microphone 234 and loudspeaker 232 enable the aircraft 200 to selectively identify particular areas, persons or things to listen to or for, and to also provide audible responses.
Finally, an infrared (IR) detector 270 is also coupled to the processor 212 via the bus 216. The IR detector 270 is directed downwardly, i.e., toward the ground beneath the aircraft 200, to enable the aircraft 200 to look for objects and individuals on the ground by searching for heat energy emitted by them.
As stated above, the aircraft is battery-powered. In FIG. 2, a battery pack 224 for the aircraft 200 is kept charged, when the aircraft 200 is not aloft, by a battery charger located at a pole-mounted hangar 500. The battery pack 224 is connected to the charger inductively or through direct electrical contacts. In the case of an inductive charger, a coil 228 inductively couples the charger 226 portion in the aircraft 200 to a mating coil on a landing surface or hangar, not shown in FIG. 2. Alternatively, direct electrical connections 230 on the aircraft landing gear, not shown in FIG. 2, enable a direct connection of the battery pack 224 to a charger.
FIG. 3 is a depiction of how images obtained by a camera on a pilotless drone aircraft can be collected, sent to an analysis center and provided to an end user/consumer in real time or substantially real time. A wireless network is omitted from FIG. 3 for brevity.
In FIG. 3, a drone aircraft 200 hovers over a roadway 302. Images 308 of a two-vehicle collision 310 on the roadway 302 are captured by the camera 202 carried by the aircraft 200 hovering above the collision site 311. Digital data 309 representing captured images is provided by the camera 202 directly to a network access device 204 on the aircraft 200. The network access device transmits image data 309 in real time to a wireless network, not shown, through which the image data 309 is routed to a network access device 414 of an image analysis center 400.
The network access device 414 is preferably co-located at the image analysis center 400. Image data 309 received at the image analysis center 400 is provided to a computer 416 having a display device 417 on which images 308 collected by the aircraft's camera 202 are displayed to a person. The terrain below the camera 202, including the collision 310, can thus be seen on the display device 417 in real time.
The image of the collision is an aerial view, i.e., a view of the collision 310 taken from above the collision 310 and looking downwardly at it.
In one embodiment, a person seated at the computer 416 and viewing images on the display device 417 provides analysis of the accident 310 and surrounding traffic congestion, if any. Such a person is also provided with the ability to fly the aircraft 200 from the image analysis center 400 using an input device 420 wirelessly coupled to the drone aircraft 200 via a communications link existing between the two network access devices 204 and 414. In an alternate embodiment, evaluation of an accident, road conditions and the terrain is performed by image processing with the computer 416, not a person. The computer 416 is thus considered herein to be an operator that provides evaluation of terrain and driving conditions as well as vehicle collisions and other information relevant thereto.
Still referring to FIG. 3, image data from the camera 202, which is displayed on the computer 416, can optionally be transmitted from the image analysis center 400 in real time or nearly real time via a second network access device 418 at the image analysis center 400. Using a second network access device 418 enables the image data 309 to be transmitted in real time or near real time to other destinations, including for example a cell phone 324 in a motor vehicle 326 traveling on the roadway 302 and approaching the collision 310.
In a preferred embodiment of the system described herein, evaluations and/or images of areas under surveillance by the aircraft 200 are selectively provided only to the cell phones of individuals or entities that pay for surveillance service. The system shown in FIG. 1 and FIG. 3 thus selectively provides real-time surveillance, i.e., images and other information obtained by a pilotless drone. A real-time two-way dialogue can also be established between a person in a vehicle and an operator at an image analysis center, by which the operator can advise a driver as to what the operator sees from the drone. In one particular embodiment, a subscriber in a motor vehicle 326 can also pay for the right to request a drone to be launched and an area surveilled in order to provide a driver or other person real-time overhead information not currently available from existing traffic awareness service providers. Such information can include, but is not limited to, obstacles or roadway conditions that are hidden by a hill or curve 319 on a roadway that might be new to a driver or not frequently used.
In another embodiment, obstacles and/or roadway information not visible to a driver who is headed toward them are detected or seen by the aircraft 200 and sent to an image analysis center 400. In such an embodiment, the image analysis center 400 or an operator at the center 400 causes data regarding an unseen driving problem to be sent to a vehicle 326 via a radio frequency data link 327. Data transmitted to the vehicle 326 is received by a prior art telematics system 329 that is in the vehicle 326 and which is coupled to various computers that control operation of the vehicle 326. Such computers are well known in the art. They are omitted from FIG. 3 for clarity but include the engine controller, transmission controller and anti-lock brake controller, all of which are also coupled to the vehicle's controller area network (CAN) bus. Accident prevention data sent to the vehicle 326 causes the one or more processors on the vehicle 326 to change the vehicle's operation via the computers' control of the vehicle 326 in anticipation of an unseen obstacle or roadway condition 319. By way of example, if the speed of a vehicle 326 approaching a sharp curve 319 is too high for the vehicle to safely negotiate the curve 319, commands transmitted to the vehicle's onboard telematics system 329 are provided by the telematics system 329 to an engine controller, which reduces engine speed, shifts a transmission, and/or applies the brakes, independent of the driver.
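The paragraph's sharp-curve example can be made concrete with a small sketch. The safe-speed rule (a lateral-acceleration limit), the dictionary format for the hazard report carried over the RF data link 327, and the print statement standing in for CAN-bus commands are all assumptions made for illustration.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def safe_curve_speed(radius_m, lateral_g=0.3):
    """Rough comfortable speed for a curve of the given radius (assumed rule)."""
    return math.sqrt(lateral_g * G * radius_m)

def handle_hazard(msg, vehicle):
    """Act on a hazard report relayed over the data link (format assumed)."""
    if msg["type"] == "sharp_curve":
        limit = safe_curve_speed(msg["radius_m"])
        if vehicle["speed_mps"] > limit:
            # A real telematics system 329 would issue these as commands to
            # the engine, transmission and brake controllers on the CAN bus.
            print(f"slowing from {vehicle['speed_mps']:.1f} m/s "
                  f"to {limit:.1f} m/s ahead of the hidden curve")
            vehicle["speed_mps"] = limit

handle_hazard({"type": "sharp_curve", "radius_m": 60.0}, {"speed_mps": 25.0})
```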
FIG. 4 is a more detailed block diagram of an image analysis center 400. As stated above, the image analysis center 400 comprises a first network access device 414, configured to provide high-speed two-way data communications between the image analysis center 400 and a battery-powered pilotless drone aircraft 200.
The image analysis center also comprises a computer 416. The computer 416 comprises a processor 419 and an associated, non-transitory memory 417 storing program instructions for the processor 419 and data received from the network access device 414.
The stored program instructions include image data recovery instructions 421 by which the processor 419 reads incoming data from the network access device 414 and recovers images 308 from the image data 309. Other image display instructions 423 cause the processor to display images on a display device from which an operator can see what is below the aircraft 200.
In one embodiment, the image analysis computer 416 comprises image analyzer instructions 425. The image analyzer instructions 425 cause the processor 419 to recognize shapes of vehicles by which it detects a number of vehicles per unit area that is under the aircraft and within the camera’s field of view to provide a measure of vehicle flow rate on a road. Such image processing software also recognizes collisions using pattern recognition and thermal energy detected by the infrared detector 270. Another set of instructions 427 thus enables the processor 419 to analyze image data 309 for heat and temperature information provided by the infrared detector 270 carried on the aircraft 200.
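The patent names the computation (vehicles per unit area within the camera's field of view) without specifying the detector, so the sketch below stubs the detections and shows only that arithmetic; the field-of-view angles and the bounding-box format are assumptions.

```python
import math

def ground_footprint_m2(altitude_m, fov_h_deg=60.0, fov_v_deg=45.0):
    """Approximate ground area imaged by a downward-pointing camera."""
    w = 2 * altitude_m * math.tan(math.radians(fov_h_deg / 2))
    h = 2 * altitude_m * math.tan(math.radians(fov_v_deg / 2))
    return w * h

def vehicle_density(detections, altitude_m):
    """Detected vehicles per 1000 m^2 of imaged roadway (unit chosen here)."""
    return 1000.0 * len(detections) / ground_footprint_m2(altitude_m)

# Stubbed bounding boxes such as a shape-recognition pass might return.
boxes = [(120, 40, 60, 30), (300, 42, 58, 28), (510, 45, 61, 31)]
print(f"{vehicle_density(boxes, altitude_m=100.0):.2f} vehicles per 1000 m^2")
```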
In one embodiment, controls 404 for piloting the aircraft 200 are coupled to the network access device 414. Such controls enable direct control of the aircraft 200 from the image analysis center 400 using data sent to and received from the aircraft 200 via the network 350. A person operating the remote pilot controls 404 can thus direct the aircraft 200 manually instead of allowing the aircraft 200 to fly automatically over a predetermined course programmed into the aircraft’s processor.
Still referring to FIG. 4, a microphone 406 and loudspeaker 408 at the image analysis center 400, which are coupled to the network access device 414, allow a person at the image analysis center 400 to listen to audio signals picked up by the microphone 234 attached to the aircraft 200 and to broadcast audio from the loudspeaker 232 that is also attached to the aircraft 200.
Referring now to FIG. 5, there is shown a pole-mounted hangar 500 for the aircraft 200. The hangar 500 comprises of course a pole or tower 502 having a height "h" great enough to deter someone from attempting to climb it and remove or vandalize equipment mounted thereon. The height "h" also provides a starting elevation for the aircraft 200, effectively reducing the energy otherwise needed from the aircraft's battery to lift the aircraft to the elevation at which it operates. A typical value for "h" is about twenty-five feet, although those of ordinary skill in the art will recognize that higher elevations are more desirable.
A substantially planar, horizontal landing pad surface or deck 504 at the top 503 of the pole 502 is provided with a motor-operated, hemispherical, clam-shell-like cover 506, the outside surface 508 of which is provided with conventional photovoltaic cells 510 that generate electricity to re-charge the battery 224 of the aircraft 200 and power a local network access device 522, also attached to the pole or tower 502.
The motor 512, which is of course reversible, is rigidly attached to the deck 504 and to the hemispherical clam-shell cover 506. It rotates in opposite directions responsive to open and close commands that it receives from the image analysis center 400 via the network access device 522. Such commands cause the cover 506 to open and close by operating the motor in opposite directions. The cover 506 and deck 504 thus effectively provide a hangar in which the aircraft 200 can be kept until it is needed.
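A minimal sketch of that command path, under stated assumptions: the textual OPEN/CLOSE commands, the `ReversibleMotor` class and the state dictionary are all invented here to illustrate the open/close logic; the patent defines none of them.

```python
class ReversibleMotor:
    """Stand-in for the reversible cover motor 512."""
    def run(self, direction):
        print(f"motor running {direction}")  # real drive logic goes here

def handle_hangar_command(cmd, motor, state):
    """Drive the motor in the direction implied by an OPEN/CLOSE command."""
    if cmd == "OPEN" and state["cover"] == "closed":
        motor.run("forward")
        state["cover"] = "open"
    elif cmd == "CLOSE" and state["cover"] == "open":
        motor.run("reverse")
        state["cover"] = "closed"
    return state

state = {"cover": "closed"}
for cmd in ("OPEN", "CLOSE"):  # commands relayed by network access device 522
    state = handle_hangar_command(cmd, ReversibleMotor(), state)
```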
In the embodiment shown in FIG. 5, the landing pad deck 504 sits atop a pole or tower. In an alternate embodiment, not shown but readily understood by those of ordinary skill in the mechanical arts, the landing pad deck 504 and cover 506 are supported by a cantilever, which is well known to those of ordinary skill in the mechanical arts as a beam supported at only one end.
As stated above, the hangar 500 provides battery recharging to batteries in the aircraft 200. In one embodiment, the battery charging is by way of induction. In a second embodiment, direct electrical connections are used.
In FIG. 5, an induction coil 514 on the deck 504 or contact pads 516 on the deck 504 are alternative mechanisms by which energy to recharge batteries 224 in the aircraft 200 is provided to the aircraft 200 when it is on the deck 504. Solar energy to recharge the batteries can also be provided by a photovoltaic panel 518 attached to the pole 502. In another embodiment, a wind-driven turbine 520 provides electric energy by which batteries 224 in the aircraft 200 can be recharged.
As mentioned above, the pole-mounted hangar 500 is also provided with its own wireless network access device 522, which is likewise preferably mounted to the pole 502.
The network access device 522 for the hangar 500 receives commands from the image analysis center 400 that include commands to open the clam-shell cover 506. A command to fly from the hangar 500 is sent directly to the aircraft 200 via its own network access device 204. The wireless communications device 522 thus provides a remote control interface for the pole-mounted hangar 500.
In one embodiment, the cover 506 and pad 504 are electrically conductive and when the cover 506 is closed they are electrically coupled to each other such that the cover and pad form an electromagnetic shield, blocking radio frequency signals from the interior of the cover 506 and preventing the aircraft from being actuated when the cover 506 is closed.
In another embodiment, meteorological sensors, including temperature sensors 524, a rain gauge 526 and a wind speed and direction indicator 528, are mounted to the pole 502 and operatively coupled to the network access device 522. They provide real-time information on ambient weather conditions that can be relayed by the network access device 522 to the image analysis center 400. Knowing ambient weather conditions can thus inform a decision as to whether launching or operating the drone would be unwise.
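The patent stops at saying that weather data can inform the decision; the thresholds and field names below are invented to illustrate one possible go/no-go check against the pole-mounted sensor readings.

```python
# Assumed launch limits; the patent specifies no numeric thresholds.
LIMITS = {"wind_mps": 12.0, "rain_mm_per_h": 2.0, "temp_c_min": -10.0}

def ok_to_launch(wx):
    """Return True if the reported conditions are within the assumed limits."""
    return (wx["wind_mps"] <= LIMITS["wind_mps"]
            and wx["rain_mm_per_h"] <= LIMITS["rain_mm_per_h"]
            and wx["temp_c"] >= LIMITS["temp_c_min"])

print(ok_to_launch({"wind_mps": 8.0, "rain_mm_per_h": 0.0, "temp_c": 5.0}))
```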
Finally, FIG. 6 depicts steps of a method for providing real-time accident and traffic surveillance from a battery-powered pilotless drone aircraft. At a first step 602, a request for real-time traffic information is received at an image analysis center or other control center. Such a request typically comes from a subscriber, i.e., a person who pays for the right to receive real-time terrain surveillance, but optionally from a service provider at the image analysis center or elsewhere. If a drone aircraft is aloft, as determined at step 604, image data from the aircraft can be displayed on a display device. Alternatively, an operator can divert the drone to a requested area at step 606. If, on the other hand, no aircraft is aloft, as determined at step 608, a drone aircraft can be released from the nearest pole-mounted hangar responsive to a launch command provided by the image analysis center to the hangar via network access devices.
Once an aircraft is determined to be on station, at step 610, in one particular embodiment, a scan command can be sent to the drone via the wireless communications link to scan terrain below the aircraft. Regardless of whether a camera is continuously scanning or is instructed to scan, at step 612, image data obtained from the camera is received at the drone, stored in its local computer and transmitted in real time via the wireless communications network to an image analysis center.
At step 614, images obtained from the camera on the aircraft are displayed on a display device at the image analysis center. Image analysis can be provided by either a person or a computer. At step 616, a data message, audio message or stream of images captured by the drone-mounted camera is sent to a subscriber via the wireless communications network. At step 618 a decision is made whether the subscriber has paid for image data, in which case images are sent to the subscriber at step 620; otherwise, the method returns from step 618 to the waiting state 602.
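Pulling the FIG. 6 branches together, the sketch below expresses steps 602-620 as one function. The fleet and hangar data structures, the stubbed image stream and all names are assumptions made to show the control flow, not details from the patent.

```python
def serve_request(request, fleet, hangars):
    """One pass through the FIG. 6 flow for a single surveillance request."""
    drone = next((d for d in fleet if d["aloft"]), None)        # step 604
    if drone:
        drone["target"] = request["area"]                       # step 606: divert
    else:
        drone = hangars[0].pop()                                # step 608: launch
        drone.update(aloft=True, target=request["area"])
    images = ["frame-1", "frame-2"]                             # steps 610-614, stubbed
    if request["subscriber_paid"]:                              # step 618
        return {"to": request["subscriber"], "images": images}  # step 620
    return None  # otherwise return to the waiting state, step 602

print(serve_request(
    {"area": "hidden curve 319", "subscriber": "cell phone 324",
     "subscriber_paid": True},
    fleet=[], hangars=[[{"aloft": False}]]))
```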
The method, system and apparatus described above provide real-time surveillance of geographic areas using pilotless aircraft or drones. They are operationally similar to existing telematics systems in that real-time information, meaning information exchanged during the actual time an event takes place, is transferred over a cellular network.
The foregoing description is for purposes of illustration. The scope of the invention is set forth in the following claims.
Claims (10)

1. A method providing remote observation of a roadway, the method comprising: wirelessly receiving a request for real-time surveillance of an area of terrain via a wireless communications network;
operating a battery-powered pilotless drone to fly over the area of terrain; capturing images of the area of terrain in real time;
transmitting captured images of the terrain to a predetermined wireless communications device in a vehicle in real time, the transmission of the captured images being made by way of the wireless communications network;
wherein the wireless communications network is a cellular telecommunications network, and wherein the step of transmitting captured images of the terrain to a predetermined wireless communications device comprises transmitting captured images through the cellular telecommunications network to a cell phone in the vehicle;
wherein captured images are first sent to an image analysis center in real time and wherein captured images are sent in real time from the image analysis center to the pre-determined wireless communications device in a vehicle by way of the wireless communications network and

wherein the step of transmitting captured images of the terrain to a predetermined wireless communications device comprises selectively transmitting captured images through the cellular telecommunications network to a cell phone of a cellular telephone service subscriber who paid for the right to receive real-time terrain surveillance.
2. The method of claim 1, further comprising providing a two-way audio connection between a recipient of information obtained from captured images and a person at an image analysis center.
3. A system for providing remote observation of a roadway, the system comprising:
    a drone configured to be able to hover over at least a predetermined section of roadway;
    a camera carried by the drone and which is configured to capture images of terrain below the drone and provide a stream of data representing captured images, in real time;
    a first wireless network access device carried by the drone and which is coupled to the camera to receive the stream of data representing captured images, the first wireless network access device being configured to transmit radio frequency signals carrying the stream of data representing images to an image analysis center in real time;
    an image analysis center configured to receive data representing captured images from a second network access device, which is wirelessly coupled to the first wireless network access device, the image analysis center being configured to display images captured by the camera to an operator, the operator providing an evaluation of the terrain below the drone as captured by the camera; and a cellular telecommunications network coupled to the image analysis center, the cellular telecommunications network being configured to transmit information provided by said operator regarding the operator’s evaluation of the roadway conditions in real time.
4. The system of claim 3, wherein information provided by the operator is selectively provided in real time to a predetermined cellular service subscriber who pays for the ability to receive real-time information derived from the captured images.
5. The system of claim 3, wherein information is provided by the operator to a vehicle, said information effectuating a change in the operation of the vehicle responsive to an image of a roadway condition that was captured by the camera on the drone and which is observed by the operator, said roadway condition not being visible to an operator of the vehicle to which the information effectuating a change in the vehicle operation was sent.
6. The system of claim 3, wherein the image analysis center is additionally configured to receive images from the camera and configured to selectively transmit said images to the predetermined cellular service subscribers who pay for real-time images captured by the camera.
7. The system of claim 3, further comprising at least one of a microphone and a loudspeaker operatively coupled to the first wireless communications device, the microphone and loudspeaker being configured to provide real-time two-way audio communications between the operator and a person near the drone.
8. The system of claim 3, wherein the first network access device comprises a cellular network access device.
9. A battery-powered pilotless drone aircraft comprising:
    a camera configured to capture images of terrain below the drone when it is aloft;
    a cellular network access device coupled to and receiving image data from the camera, the cellular network access device configured to be operationally compatible with a cellular telecommunications network and configured to transmit image data into said cellular telecommunications network.
10. An apparatus for providing remote observation of a roadway, the apparatus comprising:
    a battery-powered pilotless drone configured to be able to fly and hover over a predetermined section of roadway responsive to at least one of: location information obtained from an on-board navigation system and a remote pilot;
    a camera carried by the drone and configured to capture images of terrain below the drone, the camera being additionally configured to provide a stream of information-bearing signals representing captured images in real time;
a first wireless communications device carried by the drone and which is coupled to the camera to receive the stream of information-bearing signals from the camera in real time, the first wireless communications device being configured to transmit radio frequency signals carrying the stream of information-bearing signals to an image analysis center, the first wireless communications device being additionally configured to receive flight control signals for the drone;
an image analysis center configured to receive said information-bearing signals and to display images captured by the camera to an operator, the operator providing an evaluation of the terrain below the drone as captured by the camera; and a second wireless communications device coupled to the image analysis center, the second wireless communications device being configured to transmit information into a cellular telecommunications network regarding the operator's evaluation of the roadway.
Intellectual Property Office

Application No: GB1419371.8
Examiner: Max Emery
Claims searched: 1-10
Date of search: 17 April 2015

Patents Act 1977: Search Report under Section 17

Documents considered to be relevant:

Category | Relevant to claims | Identity of document and passage or figure of particular relevance
X | 3,4,5,7,8,9,10 | US2008/0010003 A1 (Hegedus) - Figure 2 and Paragraphs 35 & 57
X | 3,4,5,7,8,10 | CN202150183 U (Huang) - Paragraphs 9, 13 & 14
X | 3,4,5,7,8,9,10 | US6608559 B1 (Lemelson) - Figure 1; Column 3, lines 32-40; Column 9, lines 21-35
X | 3,4,5,7,8,9,10 | CN102654940 B (Shanghai Jiaotong University) - See whole document
X | 3,4,5,7,8,9,10 | US2006/0058928 A1 (Brigham Young University) - Figure 1 and Paragraphs 37 & 65
X | 9 | US2012/0143482 A1 (Honeywell) - Figure 1 and Paragraphs 16 & 17
X | 9 | US2014/0236390 A1 (Mohamadi) - Figure 6B and Paragraphs 25 & 28
X | 9 | US2012/0229660 A1 (Mathews) - Paragraphs 26, 27 & 32
A | 3,4,5,7,10 | http://www.realwire.com/releases/never-get-stuck-in-traffic-again-getmethere-co-uk-launches-sms-traffic-alerts-for-uk-motorists - The article (dated 8/12/2005) discloses an example of what is commonplace in the art: users subscribing to traffic updates via text alerts or premium-rate telephone calls.
Categories:
X - Document indicating lack of novelty or inventive step
Y - Document indicating lack of inventive step if combined with one or more other documents of same category.
A - Document indicating technological background and/or state of the art.
P - Document published on or after the declared priority date but before the filing date of this invention.
E - Patent document published on or after, but with priority date earlier than, the filing date of this application.
& - Member of the same patent family

Intellectual Property Office is an operating name of the Patent Office www.gov.uk/ipo
Field of Search:
Search of GB, EP, WO & US patent documents classified in the following areas of the UKC:

International Classification:
Subclass | Subgroup | Valid From
G08G | 0001/09 | 01/01/2006
G08G | 0001/01 | 01/01/2006
GB1419371.8A 2014-10-16 2014-10-30 Drone based accident, traffic surveillance Withdrawn GB2551682A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US201462064862P 2014-10-16 2014-10-16

Publications (2)

Publication Number Publication Date
GB201419371D0 GB201419371D0 (en) 2014-12-17
GB2551682A true GB2551682A (en) 2018-01-03

Family

ID=52118472

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1419371.8A Withdrawn GB2551682A (en) 2014-10-16 2014-10-30 Drone based accident,traffic surveillance

Country Status (1)

Country Link
GB (1) GB2551682A (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110047269B (en) * 2019-04-08 2022-07-26 王飞跃 Accident support system, accident support method, electronic device, and storage medium


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6608559B1 (en) * 1997-04-18 2003-08-19 Jerome H. Lemelson Danger warning and emergency response system and method
US20080010003A1 (en) * 2003-05-20 2008-01-10 Ildiko Hegedus Method and system for collecting traffic information using thermal sensing
US20060058928A1 (en) * 2004-09-14 2006-03-16 Beard Randal W Programmable autopilot system for autonomous flight of unmanned aerial vehicles
US20120143482A1 (en) * 2010-12-02 2012-06-07 Honeywell International Inc. Electronically file and fly unmanned aerial vehicle
CN202150183U (en) * 2010-12-15 2012-02-22 黄钰峰 Traffic management air information acquisition platform
US20120229660A1 (en) * 2011-03-09 2012-09-13 Matthews Cynthia C Methods and apparatus for remote controlled devices
CN102654940B (en) * 2012-05-23 2014-05-14 上海交通大学 Processing method of traffic information acquisition system based on unmanned aerial vehicle and
US20140236390A1 (en) * 2013-02-20 2014-08-21 Farrokh Mohamadi Vertical takeoff and landing (vtol) small unmanned aerial system for monitoring oil and gas pipelines

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
http://www.realwire.com/releases/never-get-stuck-in-traffic-again-getmethere-co-uk-launches-sms-traffic-alerts-for-uk-motorists *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10783776B2 (en) 2018-08-20 2020-09-22 Ford Global Technologies, Llc Drone-based event reconstruction
EP3640915A1 (en) * 2018-10-18 2020-04-22 Wellen Sham Uav network assisted situational self-driving
EP3726494A1 (en) * 2019-04-18 2020-10-21 Transdev Group Method for assisting with the driving of vehicles, associated computer program and system
FR3095175A1 (en) * 2019-04-18 2020-10-23 Transdev Group Vehicle driving assistance method, associated computer program and system
US11556137B2 (en) 2019-04-18 2023-01-17 Transdev Group Innovation Method of assisting with the driving of vehicles carried out by associated system including sensors installed along sections of road

Also Published As

Publication number Publication date
GB201419371D0 (en) 2014-12-17

Similar Documents

Publication Publication Date Title
US11394457B2 (en) Method, apparatus and system of providing communication coverage to an unmanned aerial vehicle
US20200290752A1 (en) Autonomous hanging storage, docking and charging multipurpose station for an unmanned aerial vehicle
US10588027B2 (en) Method and system for implementing self organizing mobile network (SOMNET) of drones and platforms
US11693402B2 (en) Flight management system for UAVs
US10450091B2 (en) Package acceptance, guidance, and refuel system for drone technology
EP3664479B1 (en) Uav supported vehicle-to-vehicle communication
JP2021167191A (en) Vibration suppression control device
EP3796571B1 (en) Method and device for controlling unmanned aerial vehicle to access network
KR101580609B1 (en) Unmanned ground vehicle equipped with unmanned aerial vehicle
EP3507999A1 (en) Uav for cellular communication
GB2551682A (en) Drone based accident,traffic surveillance
US20210070471A1 (en) Collaborative relationship between a uav and an automobile
US20210179137A1 (en) Autonomous dining vehicle
JP7030571B2 (en) Flight management system
CN113093800B (en) Sharing system based on unmanned aerial vehicle takes photo by plane
Allsopp Emergency airborne 4G comms to aid disaster traffic management
CN112130546A (en) 5G-based unmanned aerial vehicle and control method thereof
KR102213036B1 (en) drone control system
EP4287162A1 (en) Control system, flying body identification method, computer-readable medium, and flying body
EP4287164A1 (en) Aircraft, control system, aircraft identification method, and computer-readable medium
CN115051742A (en) Floating communication relay platform of asymmetric link

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)