WO2017114503A1 - Facilitating communication with a vehicle via a uav - Google Patents
- Publication number
- WO2017114503A1 (PCT/CN2016/113724)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- uav
- identification
- communication
- information regarding
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/104—Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircrafts, e.g. formation flying
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B10/00—Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
- H04B10/11—Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B7/00—Radio transmission systems, i.e. using radiation field
- H04B7/14—Relay systems
- H04B7/15—Active relay systems
- H04B7/185—Space-based or airborne stations; Stations for satellite systems
- H04B7/18502—Airborne stations
- H04B7/18504—Aircraft used as relay or high altitude atmospheric platform
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04B—TRANSMISSION
- H04B7/00—Radio transmission systems, i.e. using radiation field
- H04B7/14—Relay systems
- H04B7/15—Active relay systems
- H04B7/185—Space-based or airborne stations; Stations for satellite systems
- H04B7/18502—Airborne stations
- H04B7/18506—Communications with or from aircraft, i.e. aeronautical mobile service
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/214—Specialised server platform, e.g. server located in an airplane, hotel, hospital
- H04N21/2146—Specialised server platform, e.g. server located in an airplane, hotel, hospital located in mass transportation means, e.g. aircraft, train or bus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/25—Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
- H04N21/258—Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
- H04N21/25808—Management of client data
- H04N21/25841—Management of client data involving the geographical location of the client
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/414—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
- H04N21/41422—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance located in transportation means, e.g. personal vehicle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44213—Monitoring of end-user related data
- H04N21/44218—Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
- H04N7/185—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W16/00—Network planning, e.g. coverage or traffic planning tools; Network deployment, e.g. resource partitioning or cells structures
- H04W16/24—Cell structures
- H04W16/26—Cell enhancers or enhancement, e.g. for tunnels, building shadow
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W40/00—Communication routing or communication path finding
- H04W40/24—Connectivity information management, e.g. connectivity discovery or connectivity update
- H04W40/248—Connectivity information update
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W52/00—Power management, e.g. TPC [Transmission Power Control], power saving or power classes
- H04W52/02—Power saving arrangements
- H04W52/0203—Power saving arrangements in the radio access network or backbone network of wireless communication networks
- H04W52/0206—Power saving arrangements in the radio access network or backbone network of wireless communication networks in access points, e.g. base stations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W64/00—Locating users or terminals or network equipment for network management purposes, e.g. mobility management
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W76/00—Connection management
- H04W76/10—Connection setup
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/10—Rotorcrafts
- B64U10/13—Flying platforms
- B64U10/14—Flying platforms with four distinct rotor axes, e.g. quadcopters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/20—UAVs specially adapted for particular uses or applications for use as communications relays, e.g. high-altitude platforms
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
Definitions
- the present disclosure relates to facilitating telecommunications through an unmanned aerial vehicle, and more specifically to facilitating telecommunications through a self-sustaining unmanned aerial vehicle.
- UAV unmanned aerial vehicle
- a UAV, commonly known as a drone and also referred to by several other names, is an aircraft without a human pilot aboard.
- the flight of UAVs may be controlled either autonomously by onboard computers or by the remote control of a pilot on the ground or in another vehicle.
- UAVs have mostly found military and special operation applications, but also are increasingly finding uses in civil applications, such as policing, surveillance and firefighting, and nonmilitary security work, such as inspection of power or pipelines.
- UAVs are adept at gathering an immense amount of visual information and displaying it to human operators. However, it can take a great deal of time and manpower to interpret the information gathered by UAVs. In many cases, the information gathered by UAVs is misinterpreted by human operators and analysts who have a limited time window in which to interpret the information.
- Cellular telecommunication protocols are generally known, such as the third or fourth generation (3G or 4G) of mobile phone networks.
- Radio spectrum is a scarce resource in cellular networks. Governments license the right to use parts of the spectrum to the cellular network operators, often using a spectrum auction in which network operators submit bids.
- the architecture of the cellular network mainly comprises subscriber devices and base stations. The base stations implement air interfaces with the subscriber devices. For many rural areas, the coverage of base stations provided by the telecommunication operators does not reach those areas.
- Embodiments are provided for facilitating communications with a vehicle through an unmanned aerial vehicle (UAV).
- the UAV can be configured to track vehicles traveling in an area covered by the UAV.
- An identification of the vehicle can be acquired after the vehicle is tracked by the UAV.
- the vehicle identification can be used to determine communication capability of the vehicle.
- a communication request can be initiated by the UAV.
- the vehicle can determine whether to accept the communication request from the UAV or turn it down. If the vehicle accepts the communication request from the UAV, information intended for the vehicle, for example from another vehicle, can be forwarded to the vehicle by the UAV.
- Embodiments can provide communication protocols for communications to the vehicle via a processing station.
- the processing station may be configured to have a communication link with the UAV 102 that can communicate with the vehicle as described above.
- the processing station can be connected to a core network that provides, for example, a cellular network, a public switched telephone network (PSTN) and/or the Internet.
- the processing station can serve as a relay point or a router for forwarding information to the vehicle via the UAV.
- the processing station can forward information from a server to the vehicle via the UAV.
- the processing station can also serve as a relay point or router to communicate information from the vehicle 106 to other entities connected to the core network.
- Embodiments can provide communication protocols for communications between two vehicles.
- Each vehicle can have a communication link with a corresponding UAV.
- the UAVs can be configured to communicate with each other directly or via a processing station.
- the vehicles can communicate with each other via the communication link established by the UAVs.
- FIG. 1 illustrates an exemplary UAV network in accordance with the disclosure.
- FIG. 2A illustrates one example of communications between a vehicle, a UAV and a ground processing station shown in FIG. 1.
- FIG. 2B illustrates a view of the network shown in FIG. 1 with respect to areas covered by UAVs.
- FIG. 2C illustrates another view of network shown in FIG. 1.
- FIG. 3 illustrates one example of a communication protocol between a vehicle, a UAV, and a processing station shown in FIG. 1.
- FIG. 4 illustrates another example of a communication protocol that can be used to communicate information to a given vehicle via a UAV and a processing station shown in FIG. 1.
- FIG. 5 illustrates an example of a communication protocol between two vehicles via respective UAVs.
- FIG. 6 illustrates an example of a device for facilitating communication with a given vehicle in accordance with the disclosure.
- FIG. 7 illustrates one exemplary method for facilitating communication with a vehicle via a UAV in accordance with the disclosure.
- FIG. 8 illustrates a simplified computer system, according to an exemplary embodiment of the present disclosure.
- UAVs are well suited for applications where the payload consists of optical image sensors such as cameras with powerful lightweight sensors suited for a variety of commercial applications such as surveillance, video conferencing, vehicle positioning, and/or any other applications.
- a UAV in accordance with the disclosure can collect multi-spectral imagery of any object in an area covered by the UAV.
- the UAV in accordance with the disclosure can fly up to 65,000 feet and can cover as much as 500 km in range.
- One motivation of the present disclosure is to employ UAVs to facilitate communications between vehicles. In certain areas, especially rural areas, deployment of cellular networks is limited due to cost and benefit considerations. In those areas, cellular coverage is often less than desirable. Communications between vehicles traveling in those areas through cellular networks are thus not reliable.
- Another motivation of the present disclosure is to provide an alternative wireless communication protocol to the existing ones. The mobility of a vehicle is obviously much greater than that of a mobile device carried by a person. When a vehicle moves out of the range of one cellular base station and into the range of another, the flow of data must be re-routed from the old to the new cell base station. This technique is known as handover or handoff. A vehicle can quickly move in and out of a cell, which can require frequent handoff between cells. As handoff incurs overhead and does not always take place instantly when a vehicle moves from one cell to another at high speed, communications between two vehicles through a cellular network are not reliable. Thus, the existing cellular networks are not suitable for communications between two vehicles. While satellite communications are also possible, the cost of using satellites makes them not commercially viable for vehicle communications. Embodiments provide communication technologies to facilitate telecommunications for a moving vehicle via a UAV and/or a ground processing station.
- FIG. 1 illustrates an exemplary UAV network 100 for facilitating communications for a vehicle in accordance with the disclosure.
- the UAV network 100 can comprise multiple UAVs 102, such as UAVs 102a-f.
- the UAV network 100 in certain embodiments, can comprise hundreds, thousands, or even tens of thousands of UAVs 102.
- the individual UAVs 102 in network 100, such as UAV 102a, can fly above the ground at an altitude between 50,000 and 65,000 feet. However, this is not intended to be limiting. In some examples, some or all of the UAVs 102 in the network 100 can fly at hundreds or thousands of feet above the ground.
- the individual UAVs 102 in the network 100 can communicate with each other through communication hardware carried by or installed on UAVs 102.
- the communication hardware onboard a UAV 102 can include an antenna, a high frequency radio transceiver, an optical transceiver, and/or any other communication components for long range communications.
- a communication channel between any two given UAVs 102 in network 100, for example, UAV 102c and UAV 102d, can be established.
- UAVs 102a, 102b and 102c are neighboring UAVs such that they cover neighboring areas 104a, 104b, and 104c respectively. They can be configured to communicate with each other once they are within a threshold distance.
- the threshold distance can be the maximum communication range of the transceivers onboard the UAVs 102a, 102b, and 102c. In this way, UAVs 102a, 102b, and 102c can send data to each other without an access point.
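The threshold-distance rule above can be sketched in Python; this is a minimal illustration in which the coordinates, range value, and function names are assumptions rather than part of the disclosure:

```python
import math

# Hypothetical maximum transceiver range, in kilometers; the real
# threshold would depend on the onboard communication hardware.
MAX_RANGE_KM = 100.0

def distance_km(pos_a, pos_b):
    """Straight-line distance between two (x, y) positions, in km."""
    return math.hypot(pos_a[0] - pos_b[0], pos_a[1] - pos_b[1])

def can_link(uav_a, uav_b, threshold_km=MAX_RANGE_KM):
    """Two UAVs can communicate directly, without an access point,
    once they are within the threshold distance (here taken to be
    the transceivers' maximum range)."""
    return distance_km(uav_a, uav_b) <= threshold_km
```

Under this sketch, two UAVs 60 km apart could link directly, while a pair 250 km apart could not.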
- a controller may be referred to as a piece of hardware and/or software configured to control communications within network 100.
- the controller can be provided by a ground processing station, such as ground processing station 110a, 110b, or 110c.
- the controller can be implemented by a computer server housed in a ground processing station 110.
- the controller can be provided by a UAV 102 in the network 100.
- a given UAV 102, such as an unmanned helicopter or a balloon, in the network 100 can carry payloads including one or more processors configured to implement the controller.
- the controller can be configured to determine network requirements based on an application supported by network 100, and/or to perform any other operations.
- control signals can be transmitted via a control link from the controller to the UAVs 102 shown in FIG. 1.
- an important criterion for a UAV 102 in the network is altitude.
- as the altitude of a UAV 102 increases, the signals emitted by the UAV 102 become weaker.
- a UAV 102 flying at an altitude of 65,000 feet can cover an area up to 100 kilometers on the ground, but the signal loss can be significantly higher than would occur for a terrestrial network.
- Radio signals typically require a large amount of power for long-distance transmission.
- the payload that can be carried by a UAV 102 that stays in the air for an extended period of time is limited.
- solar energy can be used to power the UAV 102. However, this limits the weight of payloads that can be carried by a UAV 102 due to the limited rate at which solar irradiation can be absorbed and converted to electricity.
- Free-space optical (FSO) communication is an optical communication technology that transmits light in free space to wirelessly transmit data for telecommunications.
- Commercially available FSO systems use wavelengths close to the visible spectrum, around 850 to 1550 nm.
- two FSO transceivers can be placed at both ends of a transmission path that has an unobstructed line-of-sight between the two FSO transceivers.
- a variety of light sources can be used for the transmission of data using FSO transceivers. For example, LEDs and lasers can be used to transmit data in a FSO system.
- a FSO unit can be included in the payloads of a UAV 102 for communication.
- the FSO unit can include an optical transceiver with a laser transmitter and a receiver to provide full duplex bi-directional capability.
- the FSO unit can use a high-power optical source, i.e., laser, and a lens to transmit the laser beam through the atmosphere to another lens receiving the information embodied in the laser beam.
- the receiving lens can connect to a high-sensitivity receiver via optical fiber.
- the FSO unit included in a UAV 102 in accordance with the disclosure can enable optical transmission at speeds up to 10 Gbps.
- Also shown in FIG. 1 are vehicles 106a-f.
- the vehicles 106 can be equipped with communication hardware.
- the communication hardware in a given vehicle 106 can include a FSO unit described above, a radio transceiver, and/or any other type of communication hardware.
- the communication hardware included in the vehicles 106 can be used to establish a communication channel between the vehicles 106 via the UAVs 102.
- FIG. 2A illustrates one example of communications between a vehicle 106, a UAV 102 and a ground processing station 110.
- the vehicle 106 can include a FSO unit 202a, which can include an optical transceiver.
- the optical transceiver included in the FSO unit 202a can be configured to receive laser beam 204 from UAV 102, and/or transmit the laser beam 204 to UAV 102.
- the UAV 102 can include one or more of a FSO unit as well.
- UAV 102 includes a FSO unit 202c configured to communicate with a FSO unit in a vehicle 106, and another FSO unit 202d configured to communicate with a FSO unit 202b in a ground processing station 110.
- UAV 102 may comprise a single FSO unit configured to communicate with a FSO unit in vehicle 106 as well as a FSO unit in ground processing station 110.
- in order for FSO units 202a and 202c to communicate, there should be a line of sight (LoS) between them so that laser beam 204a can be transmitted and received.
- the wavelength of the laser beam 204a can be between 600 nm and 2000 nm.
- a ground processing station 110 can include a FSO unit 202b configured to establish a communication channel with FSO unit 202d through laser beam 204b.
- UAV 102 can be configured to communicate its geo-location to processing station 110. Since ground processing station 110 is stationary, the geo-location of ground processing station 110 can be preconfigured into an onboard computer in UAVs 102.
- information intended for vehicle 106 can be forwarded to vehicle 106.
- the ground processing station 110 can be connected to a wired or wireless network. Information intended for vehicle 106 can be communicated through the wired or wireless network from or to another entity connected to the wired or wireless network. The information intended for vehicle 106 can be first communicated to the UAV 102 through laser beam 204b, and the UAV 102 can forward the information to vehicle 106 through laser beam 204a.
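The forwarding path just described (entity to ground processing station 110, then to UAV 102 via laser beam 204b, then to vehicle 106 via laser beam 204a) can be sketched as a simple hop-by-hop relay; the node names and message format below are illustrative assumptions, not part of the disclosure:

```python
def forward(message, path):
    """Relay `message` unchanged along `path`, one hop at a time,
    returning the list of (sender, receiver, message) hops taken."""
    return [(sender, receiver, message)
            for sender, receiver in zip(path, path[1:])]

# Illustrative path from the wired network to the vehicle.
ROUTE = ["ground_station_110", "uav_102", "vehicle_106"]
```

The same helper models the reverse direction by passing the path in reversed order.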
- FIG. 2B illustrates a view of network 100 with respect to areas 104 covered by UAVs 102.
- a given area 104, such as area 104a, 104c, 104d, 104e, or 104f, can be covered by a corresponding UAV 102, such as UAV 102a-e respectively.
- a processing station 110 such as processing station 110a-c can be provided in certain areas 104, such as areas 104b, 104g and 104h.
- the processing stations 110a-c can serve as access points for a given vehicle 106 to communicate with other vehicles as well as to access a core network. This is illustrated in FIG. 2C.
- FIG. 2C illustrates another view of network 100.
- vehicles 106a-c can communicate with UAVs 102a-c via a FSO interface 210.
- the UAVs 102a-c are operatively connected to processing stations 110a and 110b as shown.
- the processing stations 110 and UAVs 102 together make up UAV access network 206.
- the processing stations 110 can be connected to a core network 208, which can comprise a public switched telephone network (PSTN) or the Internet, through a mobile switching center (MSC), a gateway mobile switching center (GMSC), and/or a media gateway (MGW).
- PSTN public switched telephone network
- MSC mobile switching center
- MGW Media gateway
- the core network 208 can include a PS (Profile Server) in the core network that registers the locations of vehicles 106 as well as the locations of UAVs 102, and other profile information that is used for authentication and authorization. In this way, vehicles 106 can be found by the profile server.
- PS Profile Server
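The role of the profile server can be sketched as a minimal in-memory registry; the record format and method names below are assumptions for illustration only:

```python
class ProfileServer:
    """Minimal registry of vehicle records keyed by identification,
    sketching the PS described above (all names are hypothetical)."""

    def __init__(self):
        self._records = {}

    def lookup(self, vehicle_id):
        """Return the stored record, or None when unregistered."""
        return self._records.get(vehicle_id)

    def register(self, vehicle_id, info):
        """Establish a record so the vehicle can later be found."""
        self._records[vehicle_id] = info
        return info
```

A UAV that finds no record via `lookup` would then call `register`, mirroring the registration step described later for untracked vehicles.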
- FIG. 3 illustrates one example of a communication protocol between a vehicle 106, a UAV 102, and a processing station 110.
- a tracking signal can be transmitted from UAV 102 for tracking vehicle 106.
- the tracking signal can be in various forms.
- the UAV 102 may scan the covered area 104 with a camera aboard UAV 102 in a pre-determined pattern.
- the UAV 102 may scan the covered area 104 in a scan line fashion from one corner of the covered area 104 to the opposite corner of the covered area 104.
- the UAV 102 may scan the covered area 104 in a concentric sphere fashion, starting from an outer sphere within the covered area 104 and gradually moving into inner spheres until reaching the center of the covered area 104. Still as another example, the UAV 102 may scan the covered area along predefined lines of area 104, for example a portion of a road that enters area 104 and another portion of the road that exits area 104. In certain embodiments, the UAV 102 may carry a radio transmitter configured to broadcast radio signals within the covered area 104. In those examples, the broadcast radio signals can serve as tracking signals such that, once they are intercepted by a vehicle 106 passing through the covered area 104, the UAV 102 can be configured to locate a position of the vehicle 106 within the covered area 104.
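The scan-line pattern among the options above can be sketched as follows, assuming the covered area is modeled as a simple rectangular grid of cells; the grid model and function name are illustrative, not from the disclosure:

```python
def scan_line_pattern(width, height):
    """Visit every cell of a rectangular covered area in scan-line
    order: left to right along each row, from one corner toward the
    opposite corner."""
    return [(x, y) for y in range(height) for x in range(width)]
```

For a 4-by-3 grid the pattern starts at cell (0, 0) and ends at the opposite corner (3, 2), covering all twelve cells exactly once.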
- an identification of the vehicle 106 can be captured after the vehicle 106 has been tracked by UAV 102.
- the identification of the vehicle 106 can be captured by a camera carried by the UAV 102.
- the UAV 102 may be configured to capture a picture of a license plate of vehicle 106 once it has been tracked.
- the UAV 102 may be configured to transmit a request to vehicle 106 to inquire about its identification, and the vehicle 106 can send its identification to the UAV 102 in response to the request.
- information regarding the vehicle 106 can be obtained and/or registered.
- the UAV 102 can obtain information regarding the vehicle 106, for example, from the profile server via the processing station 110 as shown in FIG. 2C.
- the UAV 102 can transmit the identification of the vehicle 106 to the profile server, and the profile server can look the vehicle 106 up based on the identification received.
- the profile server, in that example, can provide the information regarding the vehicle 106 to UAV 102 once it finds a match.
- the information regarding vehicle 106 can include information regarding communication capability of vehicle 106, such as one or more communication hardware carried by the vehicle.
- the information regarding vehicle 106 can include information indicating a model, type, make, and/or any other information regarding a type of the vehicle 106. Such information may be used by UAV 102 to assist its communication with vehicle 106.
- UAV 102 can be configured to register the vehicle 106 with the profile server when no information regarding vehicle 106 is obtained from the profile server. For example, UAV 102 may provide the identification information regarding vehicle 106 to the profile server to have the profile server establish a record for vehicle 106 and obtain information regarding vehicle 106.
- UAV 102 can initiate a request to communicate with vehicle 106 based on the information obtained at S306. For example, the UAV 102 can send a handshake message to vehicle 106 via the FSO units 202a and 202c shown in FIG. 2A to request the vehicle 106 to establish a communication channel between vehicle 106 and UAV 102. In certain implementations, UAV 102 can include checksums in the handshake message.
- vehicle 106 can send an acknowledge message back to UAV 102 in response to receiving the handshake message transmitted at S308.
- vehicle 106 can determine if the handshake message transmitted at S308 is damaged by verifying the checksums embedded in the handshake message. In those implementations, if vehicle 106 determines that the handshake message has been received without any data loss or damage, it can send an acknowledgment to UAV 102 acknowledging that the handshake message has been received without error. If vehicle 106 determines that the received handshake message is damaged, it can send an acknowledgement to the UAV 102 requesting the UAV 102 to resend the handshake message. This process can be repeated until the handshake message is received at vehicle 106 without error.
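The checksum-verified handshake can be sketched as follows. CRC-32 is an assumed checksum algorithm, since the disclosure does not specify one, and all names are illustrative:

```python
import zlib

def make_handshake(payload: bytes) -> dict:
    """UAV side: build a handshake message that embeds a CRC-32
    checksum of its payload."""
    return {"payload": payload, "checksum": zlib.crc32(payload)}

def acknowledge(message: dict) -> str:
    """Vehicle side: recompute the checksum over the received payload
    and either acknowledge error-free receipt or request a resend."""
    if zlib.crc32(message["payload"]) == message["checksum"]:
        return "ACK"      # received without data loss or damage
    return "RESEND"       # damaged in transit; UAV should resend
```

In the protocol above, a "RESEND" response would cause the UAV to retransmit, and the exchange repeats until the vehicle returns "ACK".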
- vehicle 106 can be registered to indicate a communication link is established with vehicle 106.
- the registration can be made at the profile server shown in FIG. 2C.
- a communication channel can be established between UAV 102 and vehicle 106.
- FIG. 4 illustrates one example of a communication protocol that can be used to communicate information to a given vehicle 106 via UAV 102 and processing station 110.
- an inquiry request regarding vehicle 106 can be communicated from processing station 110 to UAV 102.
- the inquiry request can be generated when the processing station 110 receives a communication request from another vehicle 106 to communicate with the given vehicle 106.
- the inquiry request transmitted at S402 can serve as an instruction that instructs UAV 102 to verify whether the given vehicle 106 is available for communication.
- the processing station 110 may communicate the inquiry request to UAV 102 based on a routing table indicating that the UAV 102 can establish a communication link with the given vehicle 106.
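A routing-table lookup of the kind just described might look like the following minimal sketch; the table contents, node names, and function name are hypothetical:

```python
# Hypothetical routing table mapping each UAV to the set of vehicles
# it can currently establish a communication link with.
ROUTING_TABLE = {
    "uav_102a": {"vehicle_106a", "vehicle_106b"},
    "uav_102b": {"vehicle_106c"},
}

def route_inquiry(vehicle_id, table=ROUTING_TABLE):
    """Select the UAV the processing station should send the inquiry
    request to for the given vehicle, or None when no UAV in the
    table can reach that vehicle."""
    for uav, reachable in table.items():
        if vehicle_id in reachable:
            return uav
    return None
```

A None result would mean the processing station has no UAV through which to verify the vehicle's availability.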
- communications can be established between UAV 102 and the given vehicle 106.
- the communications between UAV 102 and the given vehicle 106 can be established using the protocol illustrated in FIG. 3.
- the UAV 102 can send a message to processing station 110 to confirm that the given vehicle 106 is available for communication.
- the confirmation communicated by the UAV 102 at S406 can be performed after the communication between vehicle 106 and UAV 102 is established.
- information communicated from another UAV 102 can be transmitted to the UAV 102.
- the processing station may serve as a relay point or router for another UAV 102 to communicate the information to the given vehicle 106.
- the communication at S408 can be performed in response to the confirmation received from UAV 102 at S406.
- a request for the given vehicle 106 to receive the information communicated from processing station 110 at S408 can be transmitted to UAV 102.
- the request transmitted at S410 can serve as a probing message to probe whether the given vehicle 106 has capacity left to receive the information communicated from another UAV 102.
- the given vehicle 106 may not always have capacity to process the information communicated from another UAV 102, or the given vehicle 106 may be set to a mode in which no incoming information is to be received.
- the information from another vehicle 106 can be forwarded to the given vehicle 106 from UAV 102.
- acknowledgement can be transmitted from the given vehicle 106 to UAV 102 indicating that the information transmitted to the given vehicle 106 at S412 is received by the given vehicle 106.
- the acknowledgement received by UAV 102 can be forwarded to processing station 110, which can forward the acknowledgement to another vehicle 106.
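The S402-S414 flow of FIG. 4 can be summarized in a hedged sketch like the following; the S404 and S414 labels are inferred from the surrounding steps, and the boolean flags stand in for the availability and capacity checks described above:

```python
def relay_to_vehicle(vehicle_available: bool, vehicle_has_capacity: bool) -> list:
    """Illustrative trace of the FIG. 4 relay protocol, not the claimed
    implementation: each string records one message in the exchange."""
    steps = ["S402: inquiry request -> UAV"]
    if not vehicle_available:
        # No link to the vehicle could be established; the flow stops here.
        return steps
    steps += [
        "S404: UAV <-> vehicle communications established",
        "S406: availability confirmed -> processing station",
        "S408: information relayed -> UAV",
        "S410: capacity probe -> vehicle",
    ]
    if vehicle_has_capacity:
        steps += [
            "S412: information forwarded -> vehicle",
            "S414: acknowledgement -> UAV -> processing station",
        ]
    return steps
```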
- FIG. 5 illustrates an example of a communication protocol between two vehicles 106 via respective UAVs 102.
- a first UAV 102 can be configured to communicate with a first vehicle 106 and a second UAV 102 can be configured to communicate with a second vehicle 106. Those communications can be facilitated by the communication protocols shown in FIGS. 3-4.
- the first and second vehicles 106 can communicate with each other via the first and second UAVs 102.
- the first and second UAVs 102 may forward the information communicated between the first and second vehicles either directly to each other or via a processing station 110 as shown.
- the communications between UAVs 102 and/or the processing stations 110 are illustrated in FIG. 4.
- FIG. 6 illustrates an example of a device 600 for facilitating communication with a given vehicle 106 in accordance with the disclosure.
- device 600 can be provided in a UAV 102.
- the device 600 can be part of the payload carried by UAV 102.
- device 600 can include one or more of a processor 602 configured to implement programmed components.
- the programmed components can include a tracking component 604, a vehicle identification component 606, a vehicle information component 608, a communication component 610, a vehicle status component 612 and/or any other components.
- the tracking component 604 can be configured to track vehicles 106 traveling in the area 104 covered by UAV 102. In implementations, the tracking component 604 can instruct generation of tracking signals and direct the tracking signals to be transmitted.
- the tracking signals can be in various forms.
- the tracking component 604 may be configured to scan a covered area 104 with a camera aboard UAV 102 in a pre-determined pattern configured into the tracking component 604.
- the tracking component 604 may be configured to scan the covered area 104 in a scan line fashion from one corner of the covered area 104 to the opposite corner of the covered area 104.
- the tracking component 604 may be configured to scan the covered area 104 in a concentric sphere fashion starting from an outer sphere within the covered area 104, gradually into inner spheres within the covered area 104 until reaching the center of the covered area 104. Still as another example, the tracking component 604 may be configured to scan the covered area 104 along predefined lines, for example a portion of a road that enters area 104 and another portion of the road that exits area 104.
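As one hedged illustration of the scan-line pattern described above, waypoints for a rectangular covered area could be generated as follows (the coordinates, units, and row count are illustrative, not part of the disclosure):

```python
def scan_line_waypoints(corner, opposite, rows):
    """Generate a boustrophedon (back-and-forth) sweep between two
    opposite corners of a rectangular area. rows must be >= 2."""
    (x0, y0), (x1, y1) = corner, opposite
    waypoints = []
    for r in range(rows):
        # Evenly spaced sweep rows between the two corners.
        y = y0 + (y1 - y0) * r / (rows - 1)
        row = [(x0, y), (x1, y)]
        if r % 2:           # reverse alternate rows to avoid dead transits
            row.reverse()
        waypoints.extend(row)
    return waypoints
```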
- the vehicle identification component 606 can be configured to instruct capturing of an identification of the vehicle 106 after the vehicle 106 has been tracked by the tracking component 604.
- vehicle identification component 606 can instruct a camera aboard the UAV 102 to capture the identification of the vehicle 106.
- vehicle identification component 606 may be configured to instruct the camera to capture a picture of a license plate of vehicle 106 once it has been tracked and transmit the image to a profile server to identify the vehicle 106.
- the vehicle identification component 606 may be configured to identify the vehicle 106 by transmitting a request to vehicle 106 to inquire about its identification, and the vehicle 106 can send its identification to the UAV 102 in response to the request.
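The two identification paths described above (querying the vehicle directly, or imaging its license plate) can be combined in a sketch like the following; the method names on the vehicle and camera objects are hypothetical stand-ins, not an API from the disclosure:

```python
def identify_vehicle(vehicle, camera=None):
    """Try the identification request first; fall back to the camera
    (license-plate capture) if the vehicle does not respond."""
    ident = vehicle.respond_to_identification_request()
    if ident is not None:
        return ident
    if camera is not None:
        return camera.capture_license_plate(vehicle)
    return None
```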
- the information component 608 can be configured to obtain information regarding vehicle 106.
- the information component 608 can obtain information regarding the vehicle 106, for example, from the profile server via the processing station 110 as shown in FIG. 2C.
- the information component 608 can transmit the identification of the vehicle 106 to the profile server, and the profile server can look up the vehicle 106 based on the received identification.
- the profile server, in that example, can provide the vehicle information regarding the vehicle 106 to the information component 608 once it finds a match.
- the information regarding vehicle 106 can include information regarding the communication capability of vehicle 106, such as the communication hardware carried by the vehicle.
- the information regarding vehicle 106 can include information indicating a model, type, make, and/or any other information regarding a type of the vehicle 106. Such information may be used by UAV 102 to assist its communication with vehicle 106.
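A minimal sketch of the profile-server lookup described above, with hypothetical record fields for communication hardware and vehicle type (the identifiers, field names, and values are assumptions for illustration):

```python
# Hypothetical profile-server records keyed by vehicle identification.
PROFILES = {
    "PLATE-123": {
        "make": "ExampleCo",
        "model": "X1",
        "comm_hardware": ["FSO", "radio"],
    },
}

def lookup_vehicle(identification):
    # Return the stored profile if the identification matches, else None.
    return PROFILES.get(identification)
```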
- the communication component 610 can be configured to establish a communication link between the UAV 102 and vehicle 106, error control the communication between the UAV 102 and vehicle 106, manage communication status, and/or any other operations. In certain embodiments, the communication component 610 can be configured to implement the communication protocols illustrated in FIGs. 3-5.
- the vehicle status component 612 can be configured to obtain a status from an individual vehicle 106. Examples of the statuses that can be obtained by the vehicle status component 612 can include a status indicating a load of vehicle 106 (e.g., 50%busy, 80%processing power is used and so on) , a status indicating vehicle 106 is available to receive any information, and/or any other statuses.
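The status-based decision implied above (whether to forward information given the vehicle's load and availability) can be sketched as follows; the field names and the 80% load threshold are assumptions, not values from the disclosure:

```python
def can_receive(status: dict) -> bool:
    """Decide whether to forward information to a vehicle based on its
    reported status. Defaults are conservative: refuse if unknown."""
    return status.get("accepts_incoming", False) and status.get("load", 1.0) < 0.8
```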
- FIG. 7 illustrates one exemplary method for facilitating communication with a vehicle via a UAV in accordance with the disclosure.
- the operations of method 700 presented below are intended to be illustrative. In some embodiments, method 700 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order in which the operations of method 700 are illustrated in FIG. 7 and described below is not intended to be limiting.
- method 700 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information) .
- the one or more processing devices may include one or more devices executing some or all of the operations of method 700 in response to instructions stored electronically on an electronic storage medium.
- the one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 700.
- a tracking signal can be transmitted from a UAV.
- a vehicle traveling in the area covered by the UAV can be tracked based on the tracking signal transmitted at 702.
- the tracking signal can be in various forms.
- the UAV 102 may scan the covered area 104 with a camera aboard UAV 102 in a pre-determined pattern.
- the UAV 102 may scan the covered area 104 in a scan line fashion from one corner of the covered area 104 to the opposite corner of the covered area 104.
- the UAV 102 may scan the covered area 104 in a concentric sphere fashion starting from an outer sphere within the covered area 104, gradually into inner spheres within the covered area 104 until reaching the center of the covered area 104.
- the UAV 102 may scan the covered area 104 along predefined lines, for example a portion of a road that enters area 104 and another portion of the road that exits area 104.
- the UAV 102 may carry a radio transmitter configured to broadcast radio signals within the covered area 104.
- operation 702 can be performed by a tracking component the same as or substantially similar to tracking component 604 described and illustrated herein.
- an identification of the vehicle tracked at 702 can be obtained.
- the identification of the vehicle can be obtained by a camera carried by the UAV 102.
- the UAV may be configured to capture a picture of a license plate of the vehicle once it has been tracked.
- the UAV may be configured to transmit a request to the vehicle to inquire about its identification, and the vehicle can send its identification to the UAV in response to the request.
- operation 704 can be performed by a vehicle identification component the same as or substantially similar to vehicle identification component 606 described and illustrated herein.
- information regarding the vehicle identified at 704 can be obtained.
- the UAV can obtain information regarding the vehicle, for example, from the profile server via the processing station as shown in FIG. 2C.
- the UAV can transmit the identification of the vehicle to the profile server, and the profile server can look up the vehicle based on the received identification.
- the profile server, in that example, can provide the vehicle information regarding the vehicle to the UAV once it finds a match.
- the information regarding the vehicle can include information regarding the communication capability of the vehicle, such as the communication hardware carried by the vehicle.
- the information regarding the vehicle can include information indicating a model, type, make, and/or any other information regarding a type of the vehicle. Such information may be used by the UAV to assist its communication with the vehicle.
- operation 706 can be performed by an information component the same as or substantially similar to information component 608 described and illustrated herein.
- a request to establish a communication with the vehicle can be initiated from the UAV based on the information obtained at 706.
- the UAV can send a handshake message to the vehicle via the FSO units 202a and 202b shown in FIG. 2A to request the vehicle to establish a communication channel between the vehicle and the UAV.
- UAV 102 can include checksums in the handshake message.
- operation 708 can be performed by a communication component the same as or substantially similar to communication component 610 described and illustrated herein.
- a communication channel can be established between the vehicle and UAV.
- operation 710 can be performed by a communication component the same as or substantially similar to communication component 610 described and illustrated herein.
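Operations 702-710 can be strung together in a hedged end-to-end sketch; the methods on the uav object are illustrative stand-ins for the tracking, identification, information, and communication components described above, not an API defined by the disclosure:

```python
def method_700(uav):
    """Illustrative composition of operations 702-710."""
    vehicle = uav.track()                          # 702: transmit tracking signal
    ident = uav.identify(vehicle)                  # 704: obtain identification
    info = uav.lookup(ident)                       # 706: obtain vehicle information
    if uav.request_communication(vehicle, info):   # 708: initiate handshake request
        return uav.establish_channel(vehicle)      # 710: establish channel
    return None
```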
- FIG. 8 illustrates a simplified computer system, according to an exemplary embodiment of the present disclosure.
- a computer system 800 as illustrated in FIG. 8 may be incorporated into devices such as a portable electronic device, mobile phone, or other device as described herein.
- FIG. 8 provides a schematic illustration of one embodiment of a computer system 800 that can perform some or all of the steps of the methods provided by various embodiments. It should be noted that FIG. 8 is meant only to provide a generalized illustration of various components, any or all of which may be utilized as appropriate. FIG. 8, therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
- the computer system 800 is shown comprising hardware elements that can be electrically coupled via a bus 805, or may otherwise be in communication, as appropriate.
- the hardware elements may include one or more processors 810, including without limitation one or more general-purpose processors and/or one or more special-purpose processors such as digital signal processing chips, graphics acceleration processors, and/or the like; one or more input devices 815, which can include without limitation a mouse, a keyboard, a camera, and/or the like; and one or more output devices 820, which can include without limitation a display device, a printer, and/or the like.
- the computer system 800 may further include and/or be in communication with one or more non-transitory storage devices 825, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device such as a random access memory (“RAM”) and/or a read-only memory (“ROM”), which can be programmable, flash-updateable, and/or the like.
- Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like.
- the computer system 800 might also include a communications subsystem 830, which can include without limitation a modem, a network card (wireless or wired), an infrared communication device, a wireless communication device, and/or a chipset such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, cellular communication facilities, etc., and/or the like.
- the communications subsystem 830 may include one or more input and/or output communication interfaces to permit data to be exchanged with a network (such as the network described below, to name one example), other computer systems, television, and/or any other devices described herein.
- a portable electronic device or similar device may communicate image and/or other information via the communications subsystem 830.
- the computer system 800 may further comprise a working memory 835, which can include a RAM or ROM device, as described above.
- the computer system 800 also can include software elements, shown as being currently located within the working memory 835, including an operating system 840, device drivers, executable libraries, and/or other code, such as one or more application programs 845, which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein.
- code and/or instructions can be used to configure and/or adapt a general purpose computer or other device to perform one or more operations in accordance with the described methods.
- a set of these instructions and/or code may be stored on a non-transitory computer-readable storage medium, such as the storage device (s) 825 described above.
- the storage medium might be incorporated within a computer system, such as computer system 800.
- the storage medium might be separate from a computer system (e.g., a removable medium, such as a compact disc) and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon.
- These instructions might take the form of executable code, which is executable by the computer system 800, and/or might take the form of source and/or installable code which, upon compilation and/or installation on the computer system 800 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.), then takes the form of executable code.
- some embodiments may employ a computer system such as the computer system 800 to perform methods in accordance with various embodiments of the technology. According to a set of embodiments, some or all of the procedures of such methods are performed by the computer system 800 in response to processor 810 executing one or more sequences of one or more instructions, which might be incorporated into the operating system 840 and/or other code, such as an application program 845, contained in the working memory 835. Such instructions may be read into the working memory 835 from another computer-readable medium, such as one or more of the storage device (s) 825. Merely by way of example, execution of the sequences of instructions contained in the working memory 835 might cause the processor (s) 810 to perform one or more procedures of the methods described herein. Additionally or alternatively, portions of the methods described herein may be executed through specialized hardware.
- The terms “machine-readable medium” and “computer-readable medium,” as used herein, refer to any medium that participates in providing data that causes a machine to operate in a specific fashion.
- various computer-readable media might be involved in providing instructions/code to processor (s) 810 for execution and/or might be used to store and/or carry such instructions/code.
- a computer-readable medium is a physical and/or tangible storage medium.
- Such a medium may take the form of a non-volatile media or volatile media.
- Non-volatile media include, for example, optical and/or magnetic disks, such as the storage device (s) 825.
- Volatile media include, without limitation, dynamic memory, such as the working memory 835.
- Common forms of physical and/or tangible computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read instructions and/or code.
- Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to the processor (s) 810 for execution.
- the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer.
- a remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer system 800.
- the communications subsystem 830 and/or components thereof generally will receive signals, and the bus 805 then might carry the signals and/or the data, instructions, etc. carried by the signals to the working memory 835, from which the processor (s) 810 retrieves and executes the instructions.
- the instructions received by the working memory 835 may optionally be stored on a non-transitory storage device 825 either before or after execution by the processor (s) 810.
- configurations may be described as a process which is depicted as a schematic flowchart or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure.
- examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.
Abstract
Embodiments are provided for facilitating communications with a vehicle through an unmanned aerial vehicle (UAV). The UAV can be configured to track vehicles traveling in an area covered by the UAV. An identification of the vehicle can be acquired after the vehicle is tracked by the UAV. The vehicle identification can be used to determine communication capability of the vehicle. Based on the determined communication capability of the vehicle, a communication request can be initiated by the UAV. The vehicle can determine whether to accept the communication request from the UAV or turn it down. If the vehicle accepts the communication request from the UAV, information intended for the vehicle, for example from another vehicle, can be forwarded to the vehicle by the UAV.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
The present application claims priority to U.S. Provisional Patent Application No. 62/274,112, filed on December 31, 2015, the disclosure of which is hereby incorporated by reference in its entirety for all purposes.
The present application is related to the following co-pending U.S. Nonprovisional Patent Applications: U.S. Nonprovisional Application No. 15/341,813 (Attorney Docket No. 101534-0969605 (004930US) filed concurrently herewith; U.S. Nonprovisional Application No. 15/341,818 (Attorney Docket No. 101534-0969607 (004940US) filed concurrently herewith; U.S. Nonprovisional Application No. 15/341,824 (Attorney Docket No. 101534-0969608 (004950US) filed concurrently herewith; and U.S. Nonprovisional Application No. 15/341,831 (Attorney Docket No. 101534-0969609 (004960US) filed concurrently herewith. The entire disclosures of each of these applications are hereby incorporated by reference in their entireties for all purposes.
The present disclosure relates to facilitating telecommunications through an unmanned aerial vehicle, and more specifically to facilitating telecommunications through a self-sustaining unmanned aerial vehicle.
An unmanned aerial vehicle (UAV), commonly known as a drone and also referred to by several other names, is an aircraft without a human pilot aboard. The flight of UAVs may be controlled either autonomously by onboard computers or by the remote control of a pilot on the ground or in another vehicle. UAVs have mostly found military and special operation applications, but are also increasingly finding uses in civil applications, such as policing, surveillance and firefighting, and nonmilitary security work, such as inspection of power lines or pipelines. UAVs are adept at gathering an immense amount of visual information and displaying it to human operators. However, it can take a great deal of time and manpower to interpret the information gathered by UAVs. In many cases, the information gathered by UAVs is misinterpreted by human operators and analysts who have a limited time window in which to interpret the information.
Cellular telecommunication protocols are generally known, such as the third or fourth generation of mobile phone networks. Radio spectrum is a scarce resource in cellular networks. Governments license the right to use parts of the spectrum to the cellular network operators, often using a spectrum auction in which network operators submit bids. The architecture of the cellular network mainly comprises subscriber devices and base stations. The base stations implement air interfaces with the subscriber devices. For many rural areas, the coverage of base stations provided by the telecommunication operators does not reach those areas.
SUMMARY
Embodiments are provided for facilitating communications with a vehicle through an unmanned aerial vehicle (UAV). The UAV can be configured to track vehicles traveling in an area covered by the UAV. An identification of the vehicle can be acquired after the vehicle is tracked by the UAV. The vehicle identification can be used to determine communication capability of the vehicle. Based on the determined communication capability of the vehicle, a communication request can be initiated by the UAV. The vehicle can determine whether to accept the communication request from the UAV or turn it down. If the vehicle accepts the communication request from the UAV, information intended for the vehicle, for example from another vehicle, can be forwarded to the vehicle by the UAV.
Embodiments can provide communication protocols for communications to the vehicle via a processing station. The processing station may be configured to have a communication link with the UAV 102 that can communicate with the vehicle as described above. The processing station can be connected to a core network that provides, for example, a cellular network, a public switched telephone network (PSTN) and/or the Internet. The processing station can serve as a relay point or a router for forwarding information to the vehicle via the UAV. For example, the processing station can forward information from a server to the vehicle via the UAV. The processing station can also serve as a relay point or router to communicate information from the vehicle 106 to other entities connected to the core network.
Embodiments can provide communication protocols for communications between two vehicles. Each vehicle can have a communication link with a corresponding UAV. The UAVs
can be configured to communicate with each other directly or via a processing station. The vehicles can communicate with each other via the communication link established by the UAVs.
Other objects and advantages of the invention will be apparent to those skilled in the art based on the following drawings and detailed description.
The accompanying drawings, which are included to provide a further understanding of the invention, are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the detailed description serve to explain the principles of the invention. No attempt is made to show structural details of the invention in more detail than may be necessary for a fundamental understanding of the invention and various ways in which it may be practiced.
FIG. 1 illustrates an exemplary UAV network in accordance with the disclosure.
FIG. 2A illustrates one example of communications between a vehicle, a UAV and a ground processing station shown in FIG. 1.
FIG. 2B illustrates a view of the network shown in FIG. 1 with respect to areas covered by UAVs.
FIG. 2C illustrates another view of network shown in FIG. 1.
FIG. 3 illustrates one example of a communication protocol between a vehicle, a UAV, and a processing station shown in FIG. 1.
FIG. 4 illustrates another example of a communication protocol that can be used to communicate information to a given vehicle via a UAV and a processing station shown in FIG. 1.
FIG. 5 illustrates an example of a communication protocol between two vehicles via respective UAVs.
FIG. 6 illustrates an example of a device for facilitating communication with a given vehicle in accordance with the disclosure.
FIG. 7 illustrates one exemplary method for facilitating communication with a vehicle via a UAV in accordance with the disclosure.
FIG. 8 illustrates a simplified computer system, according to an exemplary embodiment of the present disclosure.
In the appended figures, similar components and/or features may have the same numerical reference label. Further, various components of the same type may be distinguished by following the reference label by a letter that distinguishes among the similar components and/or features. If only the first numerical reference label is used in the specification, the description is applicable to any one of the similar components and/or features having the same first numerical reference label irrespective of the letter suffix.
Various specific embodiments of the present disclosure will be described below with reference to the accompanying drawings constituting a part of this specification. It should be understood that, although structural parts and components of various examples of the present disclosure are described by using terms expressing directions, e.g., “front” , “back” , “upper” , “lower” , “left” , “right” and the like in the present disclosure, these terms are merely used for the purpose of convenient description and are determined on the basis of exemplary directions displayed in the accompanying drawings. Since the embodiments disclosed by the present disclosure may be set according to different directions, these terms expressing directions are merely used for describing rather than limiting. Under possible conditions, identical or similar reference numbers used in the present disclosure indicate identical components.
UAVs are well suited for applications where the payload consists of optical image sensors such as cameras with powerful lightweight sensors suited for a variety of commercial applications such as surveillance, video conferencing, vehicle positioning, and/or any other applications. A UAV in accordance with the disclosure can collect multi-spectral imagery of any object in an area covered by the UAV. In certain embodiments, the UAV in accordance with the disclosure can fly up to 65,000 feet and can cover as much as 500 km in range. One motivation of the present disclosure is to employ UAVs to facilitate communications between vehicles. In certain areas, especially rural areas, deployment of cellular networks is limited due to cost and
benefit considerations. In those areas, cellular coverage is often less than desirable. Communications between vehicles traveling in those areas through cellular networks are thus not reliable. Another motivation of the present disclosure is to provide an alternative wireless communication protocol to the existing ones. Mobility of a vehicle is obviously much greater than that of a mobile device carried by a person. When a vehicle moves out of the range of one cellular base station and into the range of another one, the flow of data must be re-routed from the old to the new cell base station. This technique is known as handover or handoff. A vehicle can quickly move in and out of a cell, which can require frequent handoff between cells. As handoff incurs overhead and does not always take place instantly when a vehicle moves from one cell to another at high speed, communications between two vehicles through a cellular network are not reliable. Thus, the existing cellular networks are not suitable for communications between two vehicles. While satellite communications are also possible, the cost of using satellites makes them commercially unviable for vehicle communications. Embodiments provide communication technologies to facilitate telecommunications for a moving vehicle via a UAV and/or a ground processing station.
FIG. 1 illustrates an exemplary UAV network 100 for facilitating communications for a vehicle in accordance with the disclosure. As shown, the UAV network 100 can comprise multiple UAVs 102, such as UAVs 102a-f. It should be understood that the UAV network 100, in certain embodiments, can comprise hundreds, thousands, or even tens of thousands of UAVs 102. The individual UAVs 102 in network 100, such as UAV 102a, can fly above the ground at between 50,000 and 65,000 feet altitude. However, this is not intended to be limiting. In some examples, some or all of the UAVs 102 in the network 100 can fly at hundreds or thousands of feet above the ground. As shown, the individual UAVs 102 in the network 100 can communicate with each other through communication hardware carried by or installed on UAVs 102. For example, the communication hardware onboard a UAV 102 can include an antenna, a high frequency radio transceiver, an optical transceiver, and/or any other communication components for long range communications. A communication channel between any two given UAVs 102 in network 100, for example, UAV 102c and UAV 102d, can be established.
One way of establishing a communication channel between any two given UAVs is to have them autonomously establish the communication channel through the communication
hardware onboard the two given UAVs 102. In this example, UAVs 102a, 102b and 102c are neighboring UAVs such that they cover neighboring areas 104a, 104b, and 104c respectively. They can be configured to communicate with each other once they are within a threshold distance. The threshold distance can be the maximum communication range of the transceivers onboard the UAVs 102a, 102b, and 102c. In this way, UAVs 102a, 102b, and 102c can send data to each other without an access point.
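The threshold-distance rule described above can be sketched in a few lines of code. The following Python fragment is an illustrative sketch only; the flat-grid coordinates, units, and function names are hypothetical assumptions, not part of the disclosure.

```python
import math

def within_range(pos_a, pos_b, max_range_km):
    """True when two UAVs are close enough to link directly.

    pos_a, pos_b: (x, y) positions on a hypothetical flat grid, in km.
    max_range_km: the threshold distance, i.e. the maximum transceiver range.
    """
    return math.hypot(pos_a[0] - pos_b[0], pos_a[1] - pos_b[1]) <= max_range_km

def discover_neighbors(uavs, max_range_km):
    """Pair up every two UAVs that can reach each other without an access point."""
    links = []
    ids = sorted(uavs)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            if within_range(uavs[a], uavs[b], max_range_km):
                links.append((a, b))
    return links

# Example: three neighboring UAVs covering adjacent areas.
uavs = {"102a": (0.0, 0.0), "102b": (60.0, 0.0), "102c": (150.0, 0.0)}
print(discover_neighbors(uavs, max_range_km=100.0))
# → [('102a', '102b'), ('102b', '102c')]
```

UAV 102a and UAV 102c are 150 km apart, beyond the assumed 100 km range, so no direct link forms between them; traffic between those two would have to hop through UAV 102b or a controller.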
Another way of establishing a communication channel between any two given UAVs 102 in network 100 is to have them establish the communication channel through a controller. As used herein, a controller refers to a piece of hardware and/or software configured to control communications within network 100. The controller can be provided by a ground processing station, such as ground processing station 110a, 110b, or 110c. For instance, the controller can be implemented by a computer server housed in a ground processing station 110. In certain embodiments, the controller can be provided by a UAV 102 in the network 100. For instance, a given UAV 102, such as an unmanned helicopter or a balloon, in the network 100 can carry payloads including one or more processors configured to implement the controller. In any case, the controller can be configured to determine network requirements based on an application supported by network 100, and/or to perform any other operations. In implementations, control signals can be transmitted via a control link from the controller to the UAVs 102 shown in FIG. 1.
As mentioned above, an important criterion for a UAV 102 in the network is altitude. However, as the altitude of the UAV 102 increases, the signals emitted by the UAV 102 become weaker. A UAV 102 flying at an altitude of 65,000 feet can cover an area up to 100 kilometers on the ground, but the signal loss can be significantly higher than would occur for a terrestrial network. Radio signals typically require a large amount of power for transmission over long distances. On the other hand, the payload that can be carried by a UAV 102 that stays in the air for an extended period of time is limited. As mentioned above, solar energy can be used to power the UAV 102. However, this limits the weight of the payload that can be carried by a UAV 102 due to the limited rate at which solar irradiation can be absorbed and converted into electricity.
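The altitude/power trade-off can be made concrete with the standard free-space path-loss formula, FSPL(dB) = 92.45 + 20 log10(d_km) + 20 log10(f_GHz). The sketch below assumes a hypothetical 2.4 GHz radio carrier for illustration; the disclosure itself does not specify a frequency.

```python
import math

def free_space_path_loss_db(distance_km: float, frequency_ghz: float) -> float:
    """Standard free-space path loss: 92.45 + 20*log10(d_km) + 20*log10(f_GHz)."""
    return 92.45 + 20 * math.log10(distance_km) + 20 * math.log10(frequency_ghz)

# A UAV at 65,000 feet (~19.8 km) serving a vehicle at the edge of a 100 km area:
slant_range_km = math.hypot(19.8, 100.0)   # straight-line distance to the vehicle
loss_db = free_space_path_loss_db(slant_range_km, 2.4)   # assumed 2.4 GHz carrier
print(round(loss_db, 1))  # → 140.2
```

A loss on the order of 140 dB illustrates why long-range radio from a high-altitude UAV demands substantial transmit power, which motivates the free-space optical links discussed next.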
Free-space optical communication (FSO) is an optical communication technology that transmits light in free space to wirelessly transmit data for telecommunications. Commercially available FSO systems use wavelengths close to the visible spectrum, around 850 to 1550 nm. In a basic point-to-point FSO system, two FSO transceivers can be placed on both ends of a transmission path that has an unobstructed line of sight between the two FSO transceivers. A variety of light sources can be used for the transmission of data using FSO transceivers. For example, an LED or a laser can be used to transmit data in a FSO system.
Lasers used in FSO systems provide extremely high bandwidths and capacity, on par with terrestrial fiber optic networks, but they also consume much less power than microwave systems. A FSO unit can be included in the payloads of a UAV 102 for communication. The FSO unit can include an optical transceiver with a laser transmitter and a receiver to provide full-duplex bi-directional capability. The FSO unit can use a high-power optical source, i.e., a laser, and a lens to transmit the laser beam through the atmosphere to another lens receiving the information embodied in the laser beam. The receiving lens can connect to a high-sensitivity receiver via optical fiber. The FSO unit included in a UAV 102 in accordance with the disclosure can enable optical transmission at speeds up to 10 Gbps.
Also shown in FIG. 1 are vehicles 106a-f. The vehicles 106 can be equipped with communication hardware. The communication hardware in a given vehicle 106 can include a FSO unit described above, a radio transceiver, and/or any other type of communication hardware. The communication hardware included in the vehicles 106 can be used to establish a communication channel between the vehicles 106 via the UAVs 102. FIG. 2A illustrates one example of communications between a vehicle 106, a UAV 102, and a ground processing station 110. As shown, the vehicle 106 can include a FSO unit 202a, which can include an optical transceiver. The optical transceiver included in the FSO unit 202a can be configured to receive laser beam 204 from UAV 102, and/or transmit the laser beam 204 to UAV 102. The UAV 102 can include one or more FSO units as well. In this example, UAV 102 includes a FSO unit 202c configured to communicate with a FSO unit in a vehicle 106, and another FSO unit 202d configured to communicate with a FSO unit 202b in a ground processing station 110. However, this is not necessarily the only case. In some other examples, UAV 102 may comprise a single FSO unit configured to communicate with both a FSO unit in vehicle 106 and a FSO unit in ground processing station 110. In any case, in order for FSO units 202a and 202c to communicate, there should be a line of sight (LoS) between them so that laser beam 204a can be transmitted and received. The wavelength of the laser beam 204a can be between 600 nm and 2000 nm. Once the FSO units 202a and 202c are aligned with each other, optical data can be transmitted between UAV 102 and vehicle 106.
As also shown in FIG. 2A, a ground processing station 110 can include a FSO unit 202b configured to establish a communication channel with FSO unit 202d through laser beam 204b. Through the communication channel, UAV 102 can be configured to communicate its geo-location to processing station 110. Since ground processing station 110 is stationary, the geo-location of ground processing station 110 can be preconfigured into an onboard computer in UAVs 102. Through the ground processing station 110, information intended for vehicle 106 can be forwarded to vehicle 106. As shown, the ground processing station 110 can be connected to a wired or wireless network. Information intended for vehicle 106 can be communicated through the wired or wireless network from or to another entity connected to the wired or wireless network. The information intended for vehicle 106 can first be communicated to the UAV 102 through laser beam 204b, and the UAV 102 can forward the information to vehicle 106 through laser beam 204a.
FIG. 2B illustrates a view of network 100 with respect to areas 104 covered by UAVs 102. As shown, a given area 104, such as area 104a, 104c, 104d, 104e, or 104f, can be covered by a UAV 102, such as UAVs 102a-e respectively. As also shown, in certain areas 104, such as areas 104b, 104g, and 104h, a processing station 110, such as processing station 110a-c, can be provided. The processing stations 110a-c can serve as access points for a given vehicle 106 to communicate with other vehicles as well as to access a core network. This is illustrated in FIG. 2C.
FIG. 2C illustrates another view of network 100. As shown, vehicles 106a-c can communicate with UAVs 102a-c via a FSO interface 210. In this example, the UAVs 102a-c are operatively connected to processing stations 110a and 110b as shown. The processing stations 110 and UAVs 102 together make up UAV access network 206. The processing stations 110 can be connected to a core network 208, which can comprise a public switched telephone network (PSTN) or the Internet, through a mobile switching center (MSC), a gateway mobile switching center, or a media gateway (MGW). As also shown, the core network 208 can include a PS (Profile Server) that registers the locations of vehicles 106 as well as locations of UAVs 102, and other profile information that is used for authentication and authorization. In this way, vehicles 106 can be found by the profile server.
With the architecture and infrastructure of network 100 having been generally described, attention is now directed to FIG. 3. FIG. 3 illustrates one example of a communication protocol between a vehicle 106, a UAV 102, and a processing station 110. As shown, at S302, a tracking signal can be transmitted from UAV 102 for tracking vehicle 106. The tracking signal can be in various forms. For example, the UAV 102 may scan the covered area 104 with a camera aboard UAV 102 in a pre-determined pattern. For example, the UAV 102 may scan the covered area 104 in a scan-line fashion from one corner of the covered area 104 to the opposite corner of the covered area 104. As another example, the UAV 102 may scan the covered area 104 in a concentric-sphere fashion, starting from an outer sphere within the covered area 104 and moving gradually into inner spheres within the covered area 104 until reaching the center of the covered area 104. Still as another example, the UAV 102 may scan the covered area along predefined lines of area 104, for example a portion of a road that enters area 104 and another portion of the road that exits area 104. In certain embodiments, the UAV 102 may carry a radio transmitter configured to broadcast radio signals within the covered area 104. In those examples, the broadcast radio signals can serve as tracking signals such that once they are intercepted by a vehicle 106 passing through the covered area 104, the UAV 102 can be configured to locate a position of the vehicle 106 within the covered area 104.
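The corner-to-corner scan-line pattern can be sketched as a simple waypoint generator for the camera aim point. This is an illustrative sketch only; the grid units and step size are hypothetical.

```python
def scan_line_pattern(width, height, step):
    """Camera aim points sweeping a rectangular covered area from one
    corner toward the opposite corner, reversing alternate rows so the
    camera never retraces a full row (boustrophedon sweep).

    width, height: extent of area 104 in illustrative grid units.
    step: spacing between successive aim points and scan lines.
    """
    points = []
    y = 0
    row = 0
    while y <= height:
        xs = list(range(0, width + 1, step))
        if row % 2:                 # reverse every other row
            xs.reverse()
        for x in xs:
            points.append((x, y))
        y += step
        row += 1
    return points

path = scan_line_pattern(width=200, height=100, step=100)
print(path)  # → [(0, 0), (100, 0), (200, 0), (200, 100), (100, 100), (0, 100)]
```

The same generator could be swapped for a concentric or road-following pattern; only the waypoint list changes, not the tracking logic that consumes it.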
At S304, an identification of the vehicle 106 can be captured after the vehicle 106 has been tracked by UAV 102. In certain implementations, the identification of the vehicle 106 can be captured by a camera carried by the UAV 102. For example, the UAV 102 may be configured to capture a picture of a license plate of vehicle 106 once it has been tracked. As another example, the UAV 102 may be configured to transmit a request to vehicle 106 to inquire about its identification, and the vehicle 106 can send its identification to the UAV 102 in response to the request.
At S306, information regarding the vehicle 106 can be obtained and/or registered. In certain implementations, once the identification of vehicle 106 is captured by UAV 102 at S304, the UAV 102 can obtain information regarding the vehicle 106, for example, from the profile server via the processing station 110 as shown in FIG. 2C. For instance, the UAV 102 can transmit the identification of the vehicle 106 to the profile server, and the profile server can look the vehicle 106 up based on the received identification. The profile server, in that example, can provide the information regarding the vehicle 106 to UAV 102 once it finds a match. The information regarding vehicle 106 can include information regarding the communication capability of vehicle 106, such as one or more pieces of communication hardware carried by the vehicle. The information regarding vehicle 106 can include information indicating a model, type, make, and/or any other information regarding a type of the vehicle 106. Such information may be used by UAV 102 to assist its communication with vehicle 106.
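The lookup-or-register exchange at S306 can be sketched as follows. The record fields and the in-memory dictionary standing in for the profile server are hypothetical placeholders, not part of the disclosure.

```python
# Hypothetical in-memory stand-in for the profile server of FIG. 2C.
PROFILE_SERVER = {
    "ABC-1234": {"make": "Acme", "model": "Roadster", "comm": ["FSO", "radio"]},
}

def lookup_vehicle(identification):
    """Return the profile record for a captured identification (e.g. a
    license-plate string), or None when the vehicle is unknown."""
    return PROFILE_SERVER.get(identification)

def lookup_or_register(identification):
    """S306: obtain the record, or establish a new record when absent."""
    record = lookup_vehicle(identification)
    if record is None:
        record = {"make": None, "model": None, "comm": []}
        PROFILE_SERVER[identification] = record   # register the vehicle
    return record

print(lookup_or_register("ABC-1234")["comm"])   # → ['FSO', 'radio']
```

A real deployment would replace the dictionary with a request to the profile server routed through the processing station 110, but the obtain-or-register control flow would be the same.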
In certain embodiments, UAV 102 can be configured to register the vehicle 106 with the profile server when no information regarding vehicle 106 is obtained from the profile server. For example, UAV 102 may provide the identification information regarding vehicle 106 to the profile server to have the profile server establish a record for vehicle 106 and obtain information regarding vehicle 106.
At S308, UAV 102 can initiate a request to communicate with vehicle 106 based on the information obtained at S306. For example, the UAV 102 can send a handshake message to vehicle 106 via the FSO units 202a and 202c shown in FIG. 2A to request the vehicle 106 to establish a communication channel between vehicle 106 and UAV 102. In certain implementations, UAV 102 can include checksums in the handshake message.
At S310, vehicle 106 can send an acknowledgement message back to UAV 102 in response to receiving the handshake message transmitted at S308. In some implementations, vehicle 106 can determine whether the handshake message transmitted at S308 is damaged by computing the checksums embedded in the handshake message. In those implementations, if vehicle 106 determines that the handshake message has been received without any data loss or damage, it can send an acknowledgement to UAV 102 acknowledging that the handshake message has been received without error. If vehicle 106 determines that the received handshake message is damaged, it can send an acknowledgement to the UAV 102 requesting the UAV 102 to resend the handshake message. This process can be repeated until the handshake message is received at vehicle 106 without error.
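A checksum-protected handshake of the kind described at S308-S310 can be sketched with a CRC32. CRC32 is one plausible checksum choice for illustration; the disclosure does not specify the algorithm.

```python
import zlib

def make_handshake(payload: bytes) -> bytes:
    """S308: append a CRC32 checksum so the receiver can detect damage."""
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def verify_handshake(message: bytes):
    """S310: return (ok, payload); ok is False when the checksum mismatches."""
    payload, received = message[:-4], message[-4:]
    ok = zlib.crc32(payload).to_bytes(4, "big") == received
    return ok, payload

msg = make_handshake(b"HELLO vehicle 106")
ok, payload = verify_handshake(msg)
print(ok)   # → True: received without error, so acknowledge

corrupted = bytearray(msg)
corrupted[0] ^= 0xFF            # simulate damage in transit
print(verify_handshake(bytes(corrupted))[0])   # → False: request a resend
```

On a False result the vehicle's acknowledgement would request a resend, and the exchange repeats until the handshake arrives intact, as the protocol describes.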
At S312, vehicle 106 can be registered to indicate that a communication link has been established with vehicle 106. For example, the registration can be made at the profile server shown in FIG. 2C. At S314, a communication channel can be established between UAV 102 and vehicle 106.
FIG. 4 illustrates one example of a communication protocol that can be used to communicate information to a given vehicle 106 via UAV 102 and processing station 110. As shown, at S402, an inquiry request regarding vehicle 106 can be communicated from processing station 110 to UAV 102. The inquiry request can be generated when the processing station 110 receives a communication request from another vehicle 106 to communicate with the given vehicle 106. The inquiry request transmitted at S402 can serve as an instruction that instructs UAV 102 to verify whether the given vehicle 106 is available for communication. In implementations, the processing station 110 may communicate the inquiry request to UAV 102 based on a routing table indicating that the UAV 102 can establish a communication link with the given vehicle 106.
At S404, communications can be established between UAV 102 and the given vehicle 106. For example, the communications between UAV 102 and the given vehicle 106 can be established using the protocol illustrated in FIG. 3.
At S406, the UAV 102 can send a message to processing station 110 to confirm that the given vehicle 106 is available for communication. The confirmation can be communicated by the UAV 102 at S406 after the communication between vehicle 106 and UAV 102 is established.
At S408, information communicated from another UAV 102 can be transmitted to the UAV 102. As mentioned above, the processing station may serve as a relay point or router for another UAV 102 to communicate the information to the given vehicle 106. The communication at S408 can be performed in response to the confirmation received from UAV 102 at S406.
At S410, a request for the given vehicle 106 to receive the information communicated from processing station 110 at S408 can be transmitted from UAV 102 to the given vehicle 106. The request transmitted at S410 can serve as a probing message to probe whether the given vehicle 106 has capacity left to receive the information communicated from another UAV 102. For example, the given vehicle
106 may not always have capacity to process the information communicated from another UAV 102, or the given vehicle 106 may be set to a mode in which no incoming information is to be received.
At S412, the information from another vehicle 106 can be forwarded to the given vehicle 106 from UAV 102. At S414, acknowledgement can be transmitted from the given vehicle 106 to UAV 102 indicating that the information transmitted to the given vehicle 106 at S412 is received by the given vehicle 106. At S416, the acknowledgement received by UAV 102 can be forwarded to processing station 110, which can forward the acknowledgement to another vehicle 106.
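The probe-forward-acknowledge exchange at S410-S414 can be sketched with a toy vehicle model. The load threshold and the field names below are hypothetical assumptions, not taken from the disclosure.

```python
class Vehicle:
    """Toy model of a given vehicle's receive capacity (hypothetical fields)."""

    def __init__(self, load, accepting=True):
        self.load = load            # fraction of processing capacity in use
        self.accepting = accepting  # False = mode where no incoming info is received
        self.inbox = []

    def probe(self):
        """S410: does the vehicle have capacity left to receive?"""
        return self.accepting and self.load < 0.9   # assumed 90% cutoff

    def receive(self, info):
        """S412/S414: accept the forwarded information and acknowledge."""
        self.inbox.append(info)
        return "ACK"

vehicle = Vehicle(load=0.5)
if vehicle.probe():                     # UAV probes before forwarding
    ack = vehicle.receive("message from another vehicle 106")
    print(ack)                          # → ACK
```

Under this sketch, the acknowledgement returned by `receive` is what the UAV would forward to the processing station at S416 for relay back to the originating vehicle.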
FIG. 5 illustrates an example of a communication protocol between two vehicles 106 via respective UAVs 102. As shown, in certain implementations, a first UAV 102 can be configured to communicate with a first vehicle 106, and a second UAV 102 can be configured to communicate with a second vehicle 106. Those communications can be facilitated by the communication protocols shown in FIGs. 3-4. As also shown, the first and second vehicles 106 can communicate with each other via the first and second UAVs 102. The first and second UAVs 102 may forward the information communicated between the first and second vehicles either directly to each other or via a processing station 110 as shown. The communications between UAVs 102 and/or the processing stations 110 are illustrated in FIG. 4.
FIG. 6 illustrates an example of a device 600 for facilitating communication with a given vehicle 106 in accordance with the disclosure. In certain embodiments, device 600 can be provided in a UAV 102. For example, the device 600 can be part of the payload carried by UAV 102. In any case, as shown, device 600 can include one or more processors 602 configured to implement programmed components. The programmed components can include a tracking component 604, a vehicle identification component 606, a vehicle information component 608, a communication component 610, a vehicle status component 612, and/or any other components.
The tracking component 604 can be configured to track vehicles 106 within the area 104 covered by the UAV 102. In implementations, the tracking component 604 can instruct generation of tracking signals and direct the tracking signals to be transmitted. The tracking signals can be in various forms. For example, the tracking component 604 may be configured to scan a covered area 104 with a camera aboard UAV 102 in a pre-determined pattern configured into the tracking component 604. For example, the tracking component 604 may be configured to scan the covered area 104 in a scan-line fashion from one corner of the covered area 104 to the opposite corner of the covered area 104. As another example, the tracking component 604 may be configured to scan the covered area 104 in a concentric-sphere fashion, starting from an outer sphere within the covered area 104 and moving gradually into inner spheres within the covered area 104 until reaching the center of the covered area 104. Still as another example, the tracking component 604 may be configured to scan the covered area along predefined lines of area 104, for example a portion of a road that enters area 104 and another portion of the road that exits area 104.
The vehicle identification component 606 can be configured to instruct capturing of an identification of the vehicle 106 after the vehicle 106 has been tracked by the tracking component 604. In certain implementations, vehicle identification component 606 can instruct a camera aboard the UAV 102 to capture the identification of the vehicle 106. For example, vehicle identification component 606 may be configured to instruct the camera to capture a picture of a license plate of vehicle 106 once it has been tracked and transmit the image to a profile server to identify the vehicle 106. As another example, the vehicle identification component 606 may be configured to identify the vehicle 106 by transmitting a request to vehicle 106 to inquire about its identification, and the vehicle 106 can send its identification to the UAV 102 in response to the request.
The information component 608 can be configured to obtain information regarding vehicle 106. In implementations, once the identification of vehicle 106 has been captured by vehicle identification component 606, the information component 608 can obtain information regarding the vehicle 106, for example, from the profile server via the processing station 110 as shown in FIG. 2C. For instance, the information component 608 can transmit the identification of the vehicle 106 to the profile server, and the profile server can look the vehicle 106 up based on the received identification. The profile server, in that example, can provide the information regarding the vehicle 106 to information component 608 once it finds a match. The information regarding vehicle 106 can include information regarding the communication capability of vehicle 106, such as one or more pieces of communication hardware carried by the vehicle. The information regarding vehicle 106 can include information indicating a model, type, make, and/or any other information regarding a type of the vehicle 106. Such information may be used by UAV 102 to assist its communication with vehicle 106.
The communication component 610 can be configured to establish a communication link between the UAV 102 and vehicle 106, error control the communication between the UAV 102 and vehicle 106, manage communication status, and/or any other operations. In certain embodiments, the communication component 610 can be configured to implement the communication protocols illustrated in FIGs. 3-5.
The vehicle status component 612 can be configured to obtain a status from an individual vehicle 106. Examples of the statuses that can be obtained by the vehicle status component 612 can include a status indicating a load of vehicle 106 (e.g., 50% busy, 80% of processing power used, and so on), a status indicating vehicle 106 is available to receive information, and/or any other statuses.
FIG. 7 illustrates one exemplary method for facilitating a UAV network in accordance with the disclosure. The operations of method 700 presented below are intended to be illustrative. In some embodiments, method 700 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order in which the operations of method 700 are illustrated in FIG. 7 and described below is not intended to be limiting.
In some embodiments, method 700 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information) . The one or more processing devices may include one or more devices executing some or all of the operations of method 700 in response to instructions stored electronically on an electronic storage medium. The one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 700.
At 702, a tracking signal can be transmitted from a UAV for tracking a vehicle. The tracking signal can be in various forms.
For example, the UAV 102 may scan the covered area 104 with a camera aboard UAV 102 in a pre-determined pattern. For example, the UAV 102 may scan the covered area 104 in a scan-line fashion from one corner of the covered area 104 to the opposite corner of the covered area 104. As another example, the UAV 102 may scan the covered area 104 in a concentric-sphere fashion, starting from an outer sphere within the covered area 104 and moving gradually into inner spheres within the covered area 104 until reaching the center of the covered area 104. Still as another example, the UAV 102 may scan the covered area along predefined lines of area 104, for example a portion of a road that enters area 104 and another portion of the road that exits area 104. In certain embodiments, the UAV 102 may carry a radio transmitter configured to broadcast radio signals within the covered area 104. In some implementations, operation 702 can be performed by a tracking component the same as or substantially similar to tracking component 604 described and illustrated herein.
At 704, an identification of the vehicle tracked at 702 can be obtained. In certain implementations, the identification of the vehicle can be obtained by a camera carried by the UAV 102. For example, the UAV may be configured to capture a picture of a license plate of the vehicle once it has been tracked. As another example, the UAV may be configured to transmit a request to the vehicle to inquire about its identification, and the vehicle can send its identification to the UAV in response to the request. In some implementations, operation 704 can be performed by a vehicle identification component the same as or substantially similar to vehicle identification component 606 described and illustrated herein.
At 706, information regarding the vehicle identified at 704 can be obtained. In certain implementations, once the identification of the vehicle is obtained by UAV 102 at 704, the UAV can obtain information regarding the vehicle, for example, from the profile server via the processing station as shown in FIG. 2C. For instance, the UAV can transmit the identification of the vehicle to the profile server, and the profile server can look the vehicle up based on the received identification. The profile server, in that example, can provide the information regarding the vehicle to the UAV once it finds a match. The information regarding the vehicle can include information regarding the communication capability of the vehicle, such as one or more pieces of communication hardware carried by the vehicle. The information regarding the vehicle can include information indicating a model, type, make, and/or any other information regarding a type of the vehicle. Such information may be used by the UAV to assist its communication with the vehicle. In some implementations, operation 706 can be performed by an information component the same as or substantially similar to information component 608 described and illustrated herein.
At 708, a request to establish a communication with the vehicle can be initiated from the UAV based on the information obtained at 706. For example, the UAV can send a handshake message to the vehicle via the FSO units 202a and 202c shown in FIG. 2A to request the vehicle to establish a communication channel between the vehicle and the UAV. In certain implementations, the UAV can include checksums in the handshake message. In some implementations, operation 708 can be performed by a communication component the same as or substantially similar to communication component 610 described and illustrated herein.
At 710, a communication channel can be established between the vehicle and the UAV. In some implementations, operation 710 can be performed by a communication component the same as or substantially similar to communication component 610 described and illustrated herein.
FIG. 8 illustrates a simplified computer system, according to an exemplary embodiment of the present disclosure. A computer system 800 as illustrated in FIG. 8 may be incorporated into devices such as a portable electronic device, mobile phone, or other device as described herein. FIG. 8 provides a schematic illustration of one embodiment of a computer system 800 that can perform some or all of the steps of the methods provided by various embodiments. It should be noted that FIG. 8 is meant only to provide a generalized illustration of various components, any or all of which may be utilized as appropriate. FIG. 8, therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
The computer system 800 is shown comprising hardware elements that can be electrically coupled via a bus 805, or may otherwise be in communication, as appropriate. The hardware elements may include one or more processors 810, including without limitation one or more general-purpose processors and/or one or more special-purpose processors such as digital signal processing chips, graphics acceleration processors, and/or the like; one or more input devices 815, which can include without limitation a mouse, a keyboard, a camera, and/or the
like; and one or more output devices 820, which can include without limitation a display device, a printer, and/or the like.
The computer system 800 may further include and/or be in communication with one or more non-transitory storage devices 825, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory (“RAM”) and/or a read-only memory (“ROM”), which can be programmable, flash-updateable, and/or the like. Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like.
The computer system 800 might also include a communications subsystem 830, which can include without limitation a modem, a network card (wireless or wired), an infrared communication device, a wireless communication device, and/or a chipset such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, cellular communication facilities, etc., and/or the like. The communications subsystem 830 may include one or more input and/or output communication interfaces to permit data to be exchanged with a network, such as the network described below to name one example, other computer systems, television, and/or any other devices described herein. Depending on the desired functionality and/or other implementation concerns, a portable electronic device or similar device may communicate image and/or other information via the communications subsystem 830. In other embodiments, a portable electronic device, e.g. the first electronic device, may be incorporated into the computer system 800, e.g., an electronic device as an input device 815. In some embodiments, the computer system 800 will further comprise a working memory 835, which can include a RAM or ROM device, as described above.
The computer system 800 also can include software elements, shown as being currently located within the working memory 835, including an operating system 840, device drivers, executable libraries, and/or other code, such as one or more application programs 845, which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein. Merely by way of example, one or more procedures described with respect to the methods discussed above, such as those described in relation to FIG. 8, might be implemented as
code and/or instructions executable by a computer and/or a processor within a computer; in an aspect, then, such code and/or instructions can be used to configure and/or adapt a general purpose computer or other device to perform one or more operations in accordance with the described methods.
A set of these instructions and/or code may be stored on a non-transitory computer-readable storage medium, such as the storage device(s) 825 described above. In some cases, the storage medium might be incorporated within a computer system, such as computer system 800. In other embodiments, the storage medium might be separate from a computer system, e.g., a removable medium, such as a compact disc, and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon. These instructions might take the form of executable code, which is executable by the computer system 800, and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer system 800, e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc., then takes the form of executable code.
It will be apparent to those skilled in the art that substantial variations may be made in accordance with specific requirements. For example, customized hardware might also be used, and/or particular elements might be implemented in hardware, software including portable software, such as applets, etc., or both. Further, connection to other computing devices such as network input/output devices may be employed.
As mentioned above, in one aspect, some embodiments may employ a computer system such as the computer system 800 to perform methods in accordance with various embodiments of the technology. According to a set of embodiments, some or all of the procedures of such methods are performed by the computer system 800 in response to processor 810 executing one or more sequences of one or more instructions, which might be incorporated into the operating system 840 and/or other code, such as an application program 845, contained in the working memory 835. Such instructions may be read into the working memory 835 from another computer-readable medium, such as one or more of the storage device(s) 825. Merely by way of example, execution of the sequences of instructions contained in the working memory 835 might cause the processor(s) 810 to perform one or more procedures of the methods described herein.
Additionally or alternatively, portions of the methods described herein may be executed through specialized hardware.
The terms “machine-readable medium” and “computer-readable medium,” as used herein, refer to any medium that participates in providing data that causes a machine to operate in a specific fashion. In an embodiment implemented using the computer system 800, various computer-readable media might be involved in providing instructions/code to processor(s) 810 for execution and/or might be used to store and/or carry such instructions/code. In many implementations, a computer-readable medium is a physical and/or tangible storage medium. Such a medium may take the form of non-volatile media or volatile media. Non-volatile media include, for example, optical and/or magnetic disks, such as the storage device(s) 825. Volatile media include, without limitation, dynamic memory, such as the working memory 835.
Common forms of physical and/or tangible computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read instructions and/or code.
Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to the processor(s) 810 for execution. Merely by way of example, the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer. A remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer system 800.
The communications subsystem 830 and/or components thereof generally will receive signals, and the bus 805 then might carry the signals and/or the data, instructions, etc. carried by the signals to the working memory 835, from which the processor(s) 810 retrieves and executes the instructions. The instructions received by the working memory 835 may optionally be stored on a non-transitory storage device 825 either before or after execution by the processor(s) 810.
The methods, systems, and devices discussed above are examples. Various configurations may omit, substitute, or add various procedures or components as appropriate. For instance, in alternative configurations, the methods may be performed in an order different from that described, and/or various stages may be added, omitted, and/or combined. Also, features described with respect to certain configurations may be combined in various other configurations. Different aspects and elements of the configurations may be combined in a similar manner. Also, technology evolves and, thus, many of the elements are examples and do not limit the scope of the disclosure or claims.
Specific details are given in the description to provide a thorough understanding of exemplary configurations including implementations. However, configurations may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the configurations. This description provides example configurations only, and does not limit the scope, applicability, or configurations of the claims. Rather, the preceding description of the configurations will provide those skilled in the art with an enabling description for implementing described techniques. Various changes may be made in the function and arrangement of elements without departing from the spirit or scope of the disclosure.
Also, configurations may be described as a process which is depicted as a schematic flowchart or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure. Furthermore, examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.
Having described several example configurations, various modifications, alternative constructions, and equivalents may be used without departing from the spirit of the disclosure. For example, the above elements may be components of a larger system, wherein other rules may take precedence over or otherwise modify the application of the technology. Also, a number of steps may be undertaken before, during, or after the above elements are considered. Accordingly, the above description does not bind the scope of the claims.
As used herein and in the appended claims, the singular forms “a”, “an”, and “the” include plural references unless the context clearly dictates otherwise. Thus, for example, reference to “a user” includes a plurality of such users, and reference to “the processor” includes reference to one or more processors and equivalents thereof known to those skilled in the art, and so forth.
Also, the words “comprise”, “comprising”, “contains”, “containing”, “include”, “including”, and “includes”, when used in this specification and in the following claims, are intended to specify the presence of stated features, integers, components, or steps, but they do not preclude the presence or addition of one or more other features, integers, components, steps, acts, or groups.
Claims (20)
- A method for facilitating communication to a vehicle via an unmanned aerial vehicle (UAV), the method being implemented in one or more processors configured to execute programmed components, the method comprising:
  transmitting, at the UAV, a tracking signal for tracking the vehicle;
  obtaining, at the UAV, an identification of the vehicle in response to the vehicle having been tracked;
  obtaining, at the UAV, information regarding the vehicle using the identification of the vehicle; and
  based on the information regarding the vehicle, establishing a communication channel with the vehicle, wherein the establishing includes initiating, at the UAV, a request to the vehicle requesting the vehicle to accept communications from the UAV.
- The method of claim 1, wherein the tracking signal is transmitted in accordance with a predetermined tracking pattern.
- The method of claim 2, wherein the tracking pattern is a scan line tracking pattern for an area covered by the UAV.
- The method of claim 3, wherein obtaining the identification of the vehicle includes capturing an image of the vehicle using a camera aboard the UAV.
- The method of claim 1, wherein obtaining the identification of the vehicle includes acquiring the identification from a server operatively connected to the UAV.
- The method of claim 1, wherein the information regarding the vehicle includes communication hardware information specifying one or more communication components of the vehicle.
- The method of claim 1, wherein the information regarding the vehicle is obtained from a server operatively connected to the vehicle.
- The method of claim 1, wherein establishing the communication channel with the vehicle further includes error controlling, at the UAV, the communication channel.
- The method of claim 8, wherein the error controlling includes embedding checksum information in data packets transmitted to the vehicle.
- The method of claim 8, wherein the error controlling includes receiving an acknowledgement message from the vehicle indicating that a data packet transmitted to the vehicle has been received by the vehicle without error.
- A system for facilitating communication to a vehicle via an unmanned aerial vehicle (UAV), the system comprising one or more processors configured by machine-readable instructions to perform:
  transmitting a tracking signal for tracking the vehicle;
  obtaining an identification of the vehicle in response to the vehicle having been tracked;
  obtaining information regarding the vehicle using the identification of the vehicle; and
  based on the information regarding the vehicle, establishing a communication channel with the vehicle, wherein the establishing includes initiating a request to the vehicle requesting the vehicle to accept communications from the UAV.
- The system of claim 11, wherein the tracking signal is transmitted in accordance with a predetermined tracking pattern.
- The system of claim 12, wherein the tracking pattern is a scan line tracking pattern for an area covered by the UAV.
- The system of claim 13, wherein obtaining the identification of the vehicle includes capturing an image of the vehicle using a camera aboard the UAV.
- The system of claim 11, wherein obtaining the identification of the vehicle includes acquiring the identification from a server operatively connected to the UAV.
- The system of claim 11, wherein the information regarding the vehicle includes communication hardware information specifying one or more communication components of the vehicle.
- The system of claim 11, wherein the information regarding the vehicle is obtained from a server operatively connected to the vehicle.
- The system of claim 11, wherein establishing the communication channel with the vehicle further includes error controlling, at the UAV, the communication channel.
- The system of claim 18, wherein the error controlling includes embedding checksum information in data packets transmitted to the vehicle.
- The system of claim 18, wherein the error controlling includes receiving an acknowledgement message from the vehicle indicating that a data packet transmitted to the vehicle has been received by the vehicle without error.
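The error control recited in claims 8-10 and 18-20 (embedding checksum information in outgoing data packets, and the vehicle acknowledging error-free receipt) can be sketched as follows. This is a minimal illustration only: the JSON packet format, the field names, and the choice of SHA-256 as the checksum are assumptions of this sketch, not features recited in the claims or specification.

```python
import hashlib
import json

def embed_checksum(payload: bytes) -> bytes:
    """UAV side: embed checksum information in a data packet (cf. claims 9 and 19)."""
    checksum = hashlib.sha256(payload).hexdigest()
    # The JSON packet layout and field names are illustrative assumptions.
    return json.dumps({"data": payload.decode(), "checksum": checksum}).encode()

def receive_packet(packet: bytes) -> dict:
    """Vehicle side: recompute the checksum and return an acknowledgement message
    indicating whether the packet was received without error (cf. claims 10 and 20)."""
    fields = json.loads(packet.decode())
    ok = hashlib.sha256(fields["data"].encode()).hexdigest() == fields["checksum"]
    return {"ack": ok}

# Hypothetical channel-establishment request mirroring the final step of claim 1.
request = embed_checksum(b"request: accept communications from UAV")
ack = receive_packet(request)  # the vehicle would acknowledge error-free receipt
```

In a real link the acknowledgement would travel back over the established communication channel; here both ends run in one process purely to show the checksum round trip.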
Applications Claiming Priority (14)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562274112P | 2015-12-31 | 2015-12-31 | |
US62/274,112 | 2015-12-31 | ||
US15/341,831 US9786165B2 (en) | 2015-12-31 | 2016-11-02 | Facilitating location positioning service through a UAV network |
US15/341,813 US9955115B2 (en) | 2015-12-31 | 2016-11-02 | Facilitating wide view video conferencing through a drone network |
US15/341,797 | 2016-11-02 | ||
US15/341,797 US10454576B2 (en) | 2015-12-31 | 2016-11-02 | UAV network |
US15/341,818 | 2016-11-02 | ||
US15/341,824 | 2016-11-02 | ||
US15/341,818 US20170193556A1 (en) | 2015-12-31 | 2016-11-02 | Facilitating targeted information delivery through a uav network |
US15/341,809 | 2016-11-02 | ||
US15/341,809 US9800321B2 (en) | 2015-12-31 | 2016-11-02 | Facilitating communication with a vehicle via a UAV |
US15/341,831 | 2016-11-02 | ||
US15/341,824 US9826256B2 (en) | 2015-12-31 | 2016-11-02 | Facilitating multimedia information delivery through a UAV network |
US15/341,813 | 2016-11-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017114503A1 true WO2017114503A1 (en) | 2017-07-06 |
Family
ID=59165112
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/113728 WO2017114506A1 (en) | 2015-12-31 | 2016-12-30 | Facilitating multimedia information delivery through uav network |
PCT/CN2016/113726 WO2017114505A1 (en) | 2015-12-31 | 2016-12-30 | Facilitating targeted information delivery through a uav network |
PCT/CN2016/113724 WO2017114503A1 (en) | 2015-12-31 | 2016-12-30 | Facilitating communication with a vehicle via a uav |
PCT/CN2016/113592 WO2017114496A1 (en) | 2015-12-31 | 2016-12-30 | Facilitating location positioning service through a uav network |
PCT/CN2016/113725 WO2017114504A1 (en) | 2015-12-31 | 2016-12-30 | Facilitating wide-view video conferencing through a uav network |
PCT/CN2016/113718 WO2017114501A1 (en) | 2015-12-31 | 2016-12-30 | Uav network |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/113728 WO2017114506A1 (en) | 2015-12-31 | 2016-12-30 | Facilitating multimedia information delivery through uav network |
PCT/CN2016/113726 WO2017114505A1 (en) | 2015-12-31 | 2016-12-30 | Facilitating targeted information delivery through a uav network |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2016/113592 WO2017114496A1 (en) | 2015-12-31 | 2016-12-30 | Facilitating location positioning service through a uav network |
PCT/CN2016/113725 WO2017114504A1 (en) | 2015-12-31 | 2016-12-30 | Facilitating wide-view video conferencing through a uav network |
PCT/CN2016/113718 WO2017114501A1 (en) | 2015-12-31 | 2016-12-30 | Uav network |
Country Status (2)
Country | Link |
---|---|
CN (9) | CN208401845U (en) |
WO (6) | WO2017114506A1 (en) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102017119686A1 (en) * | 2017-08-28 | 2019-02-28 | Andreas Rheinländer | Surveillance, exploration and inspection system using drones |
CN107809277B (en) * | 2017-10-17 | 2020-02-14 | 安徽工业大学 | Emergency rescue communication network networking method based on unmanned aerial vehicle and wireless equipment |
EP3716688A4 (en) * | 2017-12-08 | 2020-12-09 | Beijing Xiaomi Mobile Software Co., Ltd. | Data transmission method and apparatus, and unmanned aerial vehicle |
US20190197890A1 (en) * | 2017-12-27 | 2019-06-27 | GM Global Technology Operations LLC | Methods, systems, and drones for assisting communication between a road vehicle and other road users |
CN108471327A (en) * | 2018-03-26 | 2018-08-31 | 广东工业大学 | A kind of UAV Communication system |
US20200089233A1 (en) * | 2018-09-13 | 2020-03-19 | Commscope Technologies Llc | Location of assets deployed in ceiling or floor spaces or other inconvenient spaces or equipment using an unmanned vehicle |
WO2020097103A2 (en) * | 2018-11-06 | 2020-05-14 | Battelle Energy Alliance, Llc | Systems, devices, and methods for millimeter wave communication for unmanned aerial vehicles |
CN109582036B (en) * | 2018-12-03 | 2021-04-27 | 南京航空航天大学 | Consistency formation control method for quad-rotor unmanned aerial vehicle |
CN110048762A (en) * | 2019-04-23 | 2019-07-23 | 南京工业职业技术学院 | A kind of implementation method of the air communication network based on solar energy unmanned plane |
CN110321951B (en) * | 2019-07-01 | 2021-03-16 | 青岛海科虚拟现实研究院 | VR simulated aircraft training evaluation method |
CN110944149A (en) * | 2019-11-12 | 2020-03-31 | 上海博泰悦臻电子设备制造有限公司 | Child care system and method for vehicle |
CN112906486B (en) * | 2021-01-26 | 2023-09-12 | 吉利汽车研究院(宁波)有限公司 | Passenger condition detection method, control method and system for unmanned taxi |
CN114940180A (en) * | 2021-02-10 | 2022-08-26 | 华为技术有限公司 | Control method and device |
CN112896193B (en) * | 2021-03-16 | 2022-06-24 | 四川骏驰智行科技有限公司 | Automobile remote auxiliary driving system and method |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102006574A (en) * | 2011-01-05 | 2011-04-06 | 中国人民解放军理工大学 | Wireless self-organized network-based integrated heterogeneous emergency communication network |
CN103413444A (en) * | 2013-08-26 | 2013-11-27 | 深圳市川大智胜科技发展有限公司 | Traffic flow surveying and handling method based on unmanned aerial vehicle high-definition video |
CN103914076A (en) * | 2014-03-28 | 2014-07-09 | 浙江吉利控股集团有限公司 | Cargo transferring system and method based on unmanned aerial vehicle |
CN105119650A (en) * | 2015-08-24 | 2015-12-02 | 杨珊珊 | Signal relay system based on unmanned aircraft, and signal relay method thereof |
Family Cites Families (50)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060140644A1 (en) * | 2004-12-23 | 2006-06-29 | Paolella Arthur C | High performance, high efficiency fiber optic link for analog and RF systems |
US9167195B2 (en) * | 2005-10-31 | 2015-10-20 | Invention Science Fund I, Llc | Preservation/degradation of video/audio aspects of a data stream |
US20070131822A1 (en) * | 2005-06-20 | 2007-06-14 | Kevin Leigh Taylor Stallard | Aerial and ground robotic system |
CN2917134Y (en) * | 2006-03-30 | 2007-06-27 | 哈尔滨工程大学 | DSP-based embedded real-time panoramic image acquisition and processing device |
US20080018730A1 (en) * | 2006-07-20 | 2008-01-24 | Marc Roth | For-hire vehicle interactive communication systems and methods thereof |
US7848278B2 (en) * | 2006-10-23 | 2010-12-07 | Telcordia Technologies, Inc. | Roadside network unit and method of organizing, managing and maintaining local network using local peer groups as network groups |
CN101822047A (en) * | 2007-10-05 | 2010-09-01 | 松下航空电子公司 | System and method for presenting advertisement content on a mobile platform during travel |
US20100162327A1 (en) * | 2008-12-18 | 2010-06-24 | Airvod Limited | In-Flight Entertainment System |
US8515609B2 (en) * | 2009-07-06 | 2013-08-20 | Honeywell International Inc. | Flight technical control management for an unmanned aerial vehicle |
CN101651992B (en) * | 2009-09-18 | 2011-01-05 | 北京航空航天大学 | Data chain networking method used for autonomous formation of unmanned aerial vehicle |
CN101790248B (en) * | 2009-09-28 | 2012-06-20 | 长春理工大学 | Auto-management data link of micro unmanned aerial vehicles |
US9143729B2 (en) * | 2010-05-12 | 2015-09-22 | Blue Jeans Networks, Inc. | Systems and methods for real-time virtual-reality immersive multimedia communications |
JP2012212337A (en) * | 2011-03-31 | 2012-11-01 | Daihatsu Motor Co Ltd | Inter-vehicle communication device and inter-vehicle communication system |
EP2511656A1 (en) * | 2011-04-14 | 2012-10-17 | Hexagon Technology Center GmbH | Measuring system for determining the 3D coordinates of an object surface |
US20130002484A1 (en) * | 2011-07-03 | 2013-01-03 | Daniel A. Katz | Indoor navigation with gnss receivers |
DE102011113202A1 (en) * | 2011-09-10 | 2013-03-14 | Volkswagen Ag | Method for operating a data receiver and data receiver, in particular in a vehicle |
CN102436738B (en) * | 2011-09-26 | 2014-03-05 | 同济大学 | Traffic monitoring device based on unmanned aerial vehicle (UAV) |
CN102355574B (en) * | 2011-10-17 | 2013-12-25 | 上海大学 | Image stabilizing method of airborne tripod head moving target autonomous tracking system |
US9082239B2 (en) * | 2012-03-14 | 2015-07-14 | Flextronics Ap, Llc | Intelligent vehicle for assisting vehicle occupants |
CN104350740B (en) * | 2012-03-19 | 2018-04-20 | 索尼移动通信株式会社 | Video conference is carried out using wireless peripheral video conference device |
US11328325B2 (en) * | 2012-03-23 | 2022-05-10 | Secureads, Inc. | Method and/or system for user authentication with targeted electronic advertising content through personal communication devices |
CN102654940B (en) * | 2012-05-23 | 2014-05-14 | 上海交通大学 | Processing method of traffic information acquisition system based on unmanned aerial vehicle and |
KR101393539B1 (en) * | 2012-09-17 | 2014-05-09 | 기아자동차 주식회사 | Integrated network system for vehicle |
US8971274B1 (en) * | 2012-11-09 | 2015-03-03 | Google Inc. | Valuation of and marketplace for inter-network links between balloon network and terrestrial network |
CN103116994B (en) * | 2012-12-28 | 2015-01-07 | 方科峰 | Transportation system of optical communication and transportation system management method |
WO2014179235A1 (en) * | 2013-04-29 | 2014-11-06 | Oceus Networks Inc. | Mobile cellular network backhaul |
US9070289B2 (en) * | 2013-05-10 | 2015-06-30 | Palo Alto Research Incorporated | System and method for detecting, tracking and estimating the speed of vehicles from a mobile platform |
CN103280108B (en) * | 2013-05-20 | 2015-04-22 | 中国人民解放军国防科学技术大学 | Passenger car safety pre-warning system based on visual perception and car networking |
US9085370B2 (en) * | 2013-06-03 | 2015-07-21 | General Electric Company | Systems and methods for wireless data transfer during in-flight refueling of an aircraft |
US9503694B2 (en) * | 2013-08-13 | 2016-11-22 | GM Global Technology Operations LLC | Methods and apparatus for utilizing vehicle system integrated remote wireless image capture |
CN203596823U (en) * | 2013-09-24 | 2014-05-14 | 中国航天空气动力技术研究院 | Unmanned plane high-altitude base station communication system |
US9324189B2 (en) * | 2013-09-27 | 2016-04-26 | Intel Corporation | Ambulatory system to communicate visual projections |
US20150134143A1 (en) * | 2013-10-04 | 2015-05-14 | Jim Willenborg | Novel tracking system using unmanned aerial vehicles |
US20150127460A1 (en) * | 2013-11-04 | 2015-05-07 | Vixs Systems Inc. | Targeted advertising based on physical traits and anticipated trajectory |
CN103780313A (en) * | 2014-01-21 | 2014-05-07 | 桂林航天光比特科技股份公司 | Laser energy supply communication system for air vehicle |
US20150271452A1 (en) * | 2014-03-21 | 2015-09-24 | Ford Global Technologies, Llc | Vehicle-based media content capture and remote service integration |
CN103985230B (en) * | 2014-05-14 | 2016-06-01 | 深圳市大疆创新科技有限公司 | A kind of Notification Method based on image, device and notice system |
US9334052B2 (en) * | 2014-05-20 | 2016-05-10 | Verizon Patent And Licensing Inc. | Unmanned aerial vehicle flight path determination, optimization, and management |
US20150355309A1 (en) * | 2014-06-05 | 2015-12-10 | University Of Dayton | Target tracking implementing concentric ringlets associated with target features |
EP3060966B1 (en) * | 2014-07-30 | 2021-05-05 | SZ DJI Technology Co., Ltd. | Systems and methods for target tracking |
CN104168455B (en) * | 2014-08-08 | 2018-03-09 | 北京航天控制仪器研究所 | A kind of space base large scene camera system and method |
US9170117B1 (en) * | 2014-08-21 | 2015-10-27 | International Business Machines Corporation | Unmanned aerial vehicle navigation assistance |
CN104394472B (en) * | 2014-11-21 | 2018-08-03 | 成都亿盟恒信科技有限公司 | A kind of 3G onboard wireless video-on-demand system and method |
CN104699102B (en) * | 2015-02-06 | 2017-07-18 | 东北大学 | A kind of unmanned plane and intelligent vehicle collaborative navigation and investigation monitoring system and method |
CN104796611A (en) * | 2015-04-20 | 2015-07-22 | 零度智控(北京)智能科技有限公司 | Method and system for remotely controlling unmanned aerial vehicle to implement intelligent flight shooting through mobile terminal |
CN104766481A (en) * | 2015-04-29 | 2015-07-08 | 深圳市保千里电子有限公司 | Method and system for unmanned plane to conduct vehicle tracking |
CN104881650A (en) * | 2015-05-29 | 2015-09-02 | 成都通甲优博科技有限责任公司 | Vehicle tracking method based on unmanned aerial vehicle (UAV) dynamic platform |
CN105139606B (en) * | 2015-07-29 | 2019-04-02 | 重庆赛乐威航空科技有限公司 | A kind of low flyer information interaction system |
CN105100728A (en) * | 2015-08-18 | 2015-11-25 | 零度智控(北京)智能科技有限公司 | Unmanned aerial vehicle video tracking shooting system and method |
CN204887278U (en) * | 2015-09-15 | 2015-12-16 | 成都时代星光科技有限公司 | Unmanned aerial vehicle is in air from network deployment image transmission system |
-
2016
- 2016-12-30 WO PCT/CN2016/113728 patent/WO2017114506A1/en active Application Filing
- 2016-12-30 WO PCT/CN2016/113726 patent/WO2017114505A1/en active Application Filing
- 2016-12-30 WO PCT/CN2016/113724 patent/WO2017114503A1/en active Application Filing
- 2016-12-30 CN CN201621475131.6U patent/CN208401845U/en active Active
- 2016-12-30 CN CN201611255154.0A patent/CN107046710A/en active Pending
- 2016-12-30 CN CN201621475865.4U patent/CN206517444U/en active Active
- 2016-12-30 CN CN201621475987.3U patent/CN206481394U/en active Active
- 2016-12-30 CN CN201611254398.7A patent/CN106878672A/en active Pending
- 2016-12-30 CN CN201611255052.9A patent/CN107071794A/en active Pending
- 2016-12-30 WO PCT/CN2016/113592 patent/WO2017114496A1/en active Application Filing
- 2016-12-30 WO PCT/CN2016/113725 patent/WO2017114504A1/en active Application Filing
- 2016-12-30 CN CN201611255151.7A patent/CN107070531A/en active Pending
- 2016-12-30 CN CN201611254477.8A patent/CN107040754A/en active Pending
- 2016-12-30 WO PCT/CN2016/113718 patent/WO2017114501A1/en active Application Filing
- 2016-12-30 CN CN201611254487.1A patent/CN106982345A/en active Pending
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111865395A (en) * | 2020-06-12 | 2020-10-30 | 哈尔滨工业大学(深圳)(哈尔滨工业大学深圳科技创新研究院) | Trajectory generation and tracking method and system for unmanned aerial vehicle formation communication |
CN111865395B (en) * | 2020-06-12 | 2022-07-05 | 哈尔滨工业大学(深圳)(哈尔滨工业大学深圳科技创新研究院) | Trajectory generation and tracking method and system for unmanned aerial vehicle formation communication |
Also Published As
Publication number | Publication date |
---|---|
CN206481394U (en) | 2017-09-08 |
WO2017114506A1 (en) | 2017-07-06 |
CN106982345A (en) | 2017-07-25 |
CN107040754A (en) | 2017-08-11 |
CN107046710A (en) | 2017-08-15 |
CN107070531A (en) | 2017-08-18 |
WO2017114501A1 (en) | 2017-07-06 |
WO2017114505A1 (en) | 2017-07-06 |
CN206517444U (en) | 2017-09-22 |
WO2017114504A1 (en) | 2017-07-06 |
CN106878672A (en) | 2017-06-20 |
CN107071794A (en) | 2017-08-18 |
CN208401845U (en) | 2019-01-18 |
WO2017114496A1 (en) | 2017-07-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10454564B2 (en) | Facilitating communication with a vehicle via a UAV | |
WO2017114503A1 (en) | Facilitating communication with a vehicle via a uav | |
US9955115B2 (en) | Facilitating wide view video conferencing through a drone network | |
US10454576B2 (en) | UAV network | |
CN106788677B (en) | Unmanned aerial vehicle data relay device based on mobile data network and method thereof | |
US10354521B2 (en) | Facilitating location positioning service through a UAV network | |
KR20190104016A (en) | Shooting method controlling movement of unmanned aerial robot in unmanned aerial system and apparatus for supporting same | |
WO2020149653A1 (en) | Method and apparatus for controlling radio resource for a redundant route for a dual-connecting iab-node in a wireless communication system | |
US20130169806A1 (en) | Flight system with infrared camera and communication method thereof | |
EP3796571B1 (en) | Method and device for controlling unmanned aerial vehicle to access network | |
US11044769B2 (en) | Wireless communication system, wireless relay device and wireless communication method | |
WO2022019646A1 (en) | Method and apparatus for resource allocation in wireless communication system | |
WO2020166872A1 (en) | Method and apparatus for controlling early data transmission procedure in a wireless communication system | |
WO2020251314A1 (en) | Method by which rsus transmit and receive signals in wireless communication system | |
WO2021045266A1 (en) | Method and communication apparatus for transmitting and receiving data | |
WO2020243929A1 (en) | Method and apparatus for application services over a cellular network | |
WO2020251335A1 (en) | Method for terminal to transmit and receive signal in wireless communication system | |
WO2021040069A1 (en) | Method and communication device for transmitting and receiving camera data and sensor data | |
WO2023244013A1 (en) | Method and device for misbehavior detection through integration of surrounding information | |
WO2024035118A1 (en) | Method and device for converting and transmitting sensor information | |
WO2022220377A1 (en) | Method and apparatus for handling qoe management in dual connectivity in a wireless communication system | |
WO2023043200A1 (en) | Operation method of relay ue related to sidelink connection establishment in wireless communication system | |
WO2023113522A1 (en) | Communication related to handover | |
US20230189190A1 (en) | Method and apparatus for communication services | |
WO2024029967A1 (en) | Operation method of target relay ue associated with end-to-end bearer configuration in ue-to-ue relay in wireless communication system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16881292 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 15/10/2018) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16881292 Country of ref document: EP Kind code of ref document: A1 |