WO2017114505A1 - Facilitating targeted information delivery through a UAV network - Google Patents

Facilitating targeted information delivery through a UAV network

Info

Publication number
WO2017114505A1
Authority
WO
WIPO (PCT)
Prior art keywords
transportation apparatus
information
passenger
uav
items
Prior art date
Application number
PCT/CN2016/113726
Other languages
French (fr)
Inventor
Wellen Sham
Original Assignee
Wellen Sham
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US15/341,809 external-priority patent/US9800321B2/en
Priority claimed from US15/341,813 external-priority patent/US9955115B2/en
Priority claimed from US15/341,824 external-priority patent/US9826256B2/en
Priority claimed from US15/341,831 external-priority patent/US9786165B2/en
Priority claimed from US15/341,797 external-priority patent/US10454576B2/en
Priority claimed from US15/341,818 external-priority patent/US20170193556A1/en
Application filed by Wellen Sham filed Critical Wellen Sham
Publication of WO2017114505A1 publication Critical patent/WO2017114505A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/104Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircrafts, e.g. formation flying
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B10/00Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B10/11Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B7/00Radio transmission systems, i.e. using radiation field
    • H04B7/14Relay systems
    • H04B7/15Active relay systems
    • H04B7/185Space-based or airborne stations; Stations for satellite systems
    • H04B7/18502Airborne stations
    • H04B7/18504Aircraft used as relay or high altitude atmospheric platform
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B7/00Radio transmission systems, i.e. using radiation field
    • H04B7/14Relay systems
    • H04B7/15Active relay systems
    • H04B7/185Space-based or airborne stations; Stations for satellite systems
    • H04B7/18502Airborne stations
    • H04B7/18506Communications with or from aircraft, i.e. aeronautical mobile service
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/214Specialised server platform, e.g. server located in an airplane, hotel, hospital
    • H04N21/2146Specialised server platform, e.g. server located in an airplane, hotel, hospital located in mass transportation means, e.g. aircraft, train or bus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/258Client or end-user data management, e.g. managing client capabilities, user preferences or demographics, processing of multiple end-users preferences to derive collaborative data
    • H04N21/25808Management of client data
    • H04N21/25841Management of client data involving the geographical location of the client
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41422Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance located in transportation means, e.g. personal vehicle
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213Monitoring of end-user related data
    • H04N21/44218Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W16/00Network planning, e.g. coverage or traffic planning tools; Network deployment, e.g. resource partitioning or cells structures
    • H04W16/24Cell structures
    • H04W16/26Cell enhancers or enhancement, e.g. for tunnels, building shadow
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W40/00Communication routing or communication path finding
    • H04W40/24Connectivity information management, e.g. connectivity discovery or connectivity update
    • H04W40/248Connectivity information update
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W52/00Power management, e.g. TPC [Transmission Power Control], power saving or power classes
    • H04W52/02Power saving arrangements
    • H04W52/0203Power saving arrangements in the radio access network or backbone network of wireless communication networks
    • H04W52/0206Power saving arrangements in the radio access network or backbone network of wireless communication networks in access points, e.g. base stations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W64/00Locating users or terminals or network equipment for network management purposes, e.g. mobility management
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W76/00Connection management
    • H04W76/10Connection setup
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00Type of UAV
    • B64U10/10Rotorcrafts
    • B64U10/13Flying platforms
    • B64U10/14Flying platforms with four distinct rotor axes, e.g. quadcopters
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/20UAVs specially adapted for particular uses or applications for use as communications relays, e.g. high-altitude platforms
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography

Definitions

  • the present disclosure relates to targeted delivery of information, in particular targeted delivery of information to a transportation apparatus via a UAV network.
  • An unmanned aerial vehicle (UAV), commonly known as a drone and also referred to by several other names, is an aircraft without a human pilot aboard.
  • the flight of UAVs may be controlled either autonomously by onboard computers or by the remote control of a pilot on the ground or in another vehicle.
  • UAVs have mostly found military and special operation applications, but also are increasingly finding uses in civil applications, such as policing, surveillance and firefighting, and nonmilitary security work, such as inspection of power or pipelines.
  • UAVs are adept at gathering an immense amount of visual information and displaying it to human operators. However, it can take a great deal of time and manpower to interpret the information gathered by UAVs. In many cases, the information gathered by UAVs is misinterpreted by human operators and analysts who have a limited time window in which to interpret the information.
  • Targeted delivery of information to a computing device is generally known in the art.
  • the conventional technologies typically can deliver specific information to the computing device based on a set of user activities and/or a user profile stored on the computing device. For example, user interest toward certain subjects may be assessed by the conventional technologies based on the user activities and user profile stored on the computing device. Based on the assessed user interest, the conventional technologies can then “push” news stories or web items to the computing device for presentation on that computing device.
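The conventional approach described above can be sketched as a small scoring-and-selection routine. This is an illustrative assumption about how such systems typically work, not the disclosure's own method; all function names, fields, and weights are hypothetical.

```python
# Hypothetical sketch of conventional targeted delivery: score a user's
# interest in subjects from stored activity records, then choose items
# to "push" to the device. All names and data are illustrative.
from collections import Counter

def score_interests(activities):
    """Count how often each subject appears in the user's activity log."""
    return Counter(subject for _action, subject in activities)

def select_items(interest_scores, catalog, top_n=2):
    """Pick catalog items whose subject ranks among the user's top interests."""
    top_subjects = {s for s, _ in interest_scores.most_common(top_n)}
    return [item for item in catalog if item["subject"] in top_subjects]

activities = [("click", "sports"), ("read", "sports"), ("read", "finance")]
catalog = [
    {"title": "Local match recap", "subject": "sports"},
    {"title": "Garden tips", "subject": "gardening"},
]
pushed = select_items(score_interests(activities), catalog)
```

The key limitation the disclosure points to is that this pipeline relies entirely on data already stored on the device, rather than on observations of the user's current context.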
  • Embodiments are provided for delivering information to a transportation apparatus via a UAV network.
  • one or more UAVs may be configured to identify and track the transportation apparatus.
  • Information regarding the transportation apparatus can be collected by the UAV(s).
  • the information can include an identification of the transportation apparatus, one or more drivers and/or passengers identified in the transportation apparatus, a direction and/or a road the transportation apparatus is traveling in, and/or any other information regarding the transportation apparatus.
  • Such information regarding the transportation apparatus as captured by the UAV(s) can be transmitted to a processing center.
  • the processing center can be configured to analyze the information regarding the transportation apparatus to determine one or more categories of information to be transmitted to the transportation apparatus for presentation.
  • the one or more categories of information can include local information, such as local news, local commercial information, local site attraction information related to the area the transportation apparatus is traveling in, local marketing information that might be of interest to the drivers and/or passengers identified in the transportation apparatus, and/or any other information.
  • the UAVs may be configured to capture one or more images of an interior of the transportation apparatus and transmit the images to the processing center.
  • the processing center may be configured to process the images to identify one or more drivers and/or passengers in the transportation apparatus.
  • image processing by the processing center may involve identifying the number of drivers and/or passengers in the transportation apparatus, their gender, their specific identity (e.g., names) , their positions in the transportation apparatus, and/or any other information regarding the drivers and/or passengers in the transportation apparatus.
  • the processing center can be configured to determine one or more items to be transmitted to the transportation apparatus and presented in the transportation apparatus.
  • the processing center may determine there are four people in the transportation apparatus, two in the front row, and two in the back row.
  • the processing center may be further configured to determine an identity of the two people in the front row, i.e., their names, and the gender and age of the two people in the back row.
  • the processing center may be configured to deliver local site attraction, weather and driving condition information to the two people in the front row for presentation on a screen located on a dashboard of the transportation apparatus.
  • the processing center may be configured to deliver local commercial or marketing information for presentation on one or more screens located in the back row.
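The seat-by-seat delivery decision in the example above can be sketched as a simple routing rule from occupant position to content categories. This is a minimal illustration of the described behavior, assuming hypothetical field and category names; the actual decision logic of the processing center is not specified at this level of detail.

```python
# Illustrative sketch: route content categories to in-cabin screens based
# on where each occupant sits. Field names and categories are assumptions.
def route_content(occupants):
    """Map each occupant's screen to content categories based on seating row."""
    plan = {}
    for person in occupants:
        if person["row"] == "front":
            # front row: site attractions, weather, driving conditions
            plan[person["seat"]] = ["site_attractions", "weather", "driving_conditions"]
        else:
            # back rows: local commercial and marketing information
            plan[person["seat"]] = ["local_commercial", "local_marketing"]
    return plan

occupants = [
    {"seat": "front-left", "row": "front", "name": "driver"},
    {"seat": "back-left", "row": "back", "age_group": "20-30"},
]
plan = route_content(occupants)
```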
  • one or more displays in the transportation apparatus may be equipped with a network connection.
  • the displays can receive multimedia data from the processing center through the network connection.
  • the displays may be operatively connected to a computing device and the computing device may be configured to receive the multimedia data.
  • the transportation apparatus is a vehicle.
  • the vehicle may have at least one cabin.
  • the transportation apparatus is equipped with a wide-view display, such as a dashboard covered by an LCD screen, and individual displays mounted on seats for passengers in one or more back rows.
  • FIG. 1 illustrates an exemplary UAV network in accordance with the disclosure.
  • FIG. 2 conceptually illustrates facilitating targeted delivery of information to a transportation apparatus using UAVs in accordance with the disclosure.
  • FIG. 3 illustrates an example of a processing center shown in FIG. 2.
  • FIG. 4 illustrates an exemplary method for facilitating targeted delivery of information to a transportation apparatus
  • FIG. 5 illustrates a simplified computer system that can be used to implement various embodiments described and illustrated herein.
  • UAVs are well suited for applications where the payload consists of optical image sensors such as cameras with powerful lightweight sensors suited for a variety of commercial applications such as surveillance, video conferencing, vehicle positioning, and/or any other applications.
  • a UAV in accordance with the disclosure can collect multi-spectral imagery of any object in an area covered by the UAV.
  • the UAV in accordance with the disclosure can fly up to 65,000 feet and can cover as much as 500 km in range.
  • One motivation of the present disclosure is to employ UAVs to facilitate video-conferencing involving at least one transportation apparatus, such as an automobile, a bus, or a train.
  • One or more UAVs can be employed to capture video images of an interior of the transportation apparatus, such as a cabin of the transportation apparatus. Since a UAV can be configured to move above the transportation apparatus at a speed consistent with the speed of the transportation apparatus, video images of the transportation apparatus can be relatively simply captured by the UAV as the transportation apparatus moves.
  • the UAV equipped with a wide-view, e.g., 360-degree, camera can be used to capture wide-view video images of an interior of the transportation apparatus so long as there is a clear view of the interior of the transportation apparatus from the UAV.
  • the images can be transmitted from the UAV to a processing center via the UAV network.
  • the processing center can be configured to obtain information regarding the transportation apparatus, such as the make of the transportation apparatus and one or more registration numbers of the transportation apparatus, in response to receiving the images of the transportation apparatus.
  • the processing center can be further configured to analyze the images to obtain passenger information and/or driver information regarding one or more passengers and/or drivers in the transportation apparatus.
  • the passenger information can include information indicating a gender of each passenger, an age group of each passenger, an identity of each passenger, a position of each passenger within the transportation apparatus, and/or any other passenger information.
  • the driver information can include similar information regarding the driver.
  • the processing center can be configured to determine one or more items to be presented to the passengers and/or the driver within the transportation apparatus. For example, based on the passenger information, the processing center can determine an age group of the passengers sitting in a back row of the transportation apparatus and determine to present local marketing items that might be of interest to the passengers.
  • a transportation apparatus may refer to an apparatus capable of moving over distance for the transportation of people and/or goods.
  • Examples of a transportation apparatus may include a vehicle (e.g., a car or truck), a bike, a motorcycle, a train, a ship, a plane, or a spaceship, just to name a few. It should be understood that, although a vehicle is used in the examples given below, this is not intended to be limiting. Other types of transportation apparatus may also be used in those examples in some embodiments.
  • FIG. 1 illustrates an exemplary UAV network 100 for facilitating communications for a vehicle in accordance with the disclosure.
  • the UAV network 100 can comprise multiple UAVs 102, such as UAVs 102a-f.
  • the UAV network 100 in certain embodiments, can comprise hundreds, thousands, or even tens of thousands of UAVs 102.
  • the individual UAVs 102 in UAV network 100 can fly above the ground at altitudes between 50,000 and 65,000 feet. However, this is not intended to be limiting. In some examples, some or all of the UAVs 102 in the UAV network 100 can fly at hundreds or thousands of feet above the ground.
  • the individual UAVs 102 in the UAV network 100 can communicate with each other through communication hardware carried by or installed on UAVs 102.
  • the communication hardware onboard a UAV 102 can include an antenna, a high frequency radio transceiver, an optical transceiver, and/or any other communication components for long range communications.
  • a communication channel between any two given UAVs 102 in UAV network 100, for example, UAV 102c and UAV 102d, can be established.
  • UAVs 102a, 102b and 102c are neighboring UAVs such that they cover neighboring areas 104a, 104b, and 104c respectively. They can be configured to communicate with each other once they are within a threshold distance.
  • the threshold distance can be the maximum communications range of the transceivers onboard the UAVs 102a, 102b, and 102c. In this way, UAVs 102a, 102b, and 102c can send data to each other without an access point.
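The neighbor rule just described — two UAVs can exchange data directly, without an access point, once they are within the transceivers' maximum range — can be sketched as follows. Positions, identifiers, and the range value are illustrative assumptions.

```python
# A minimal sketch of direct-link discovery among UAVs: two UAVs may talk
# once their separation is within the transceivers' maximum range.
import math

def within_range(pos_a, pos_b, max_range_km):
    """True if two UAVs are close enough for a direct link."""
    dx, dy = pos_a[0] - pos_b[0], pos_a[1] - pos_b[1]
    return math.hypot(dx, dy) <= max_range_km

def neighbors(positions, max_range_km):
    """Build the direct-link adjacency among all UAVs (no access point)."""
    ids = list(positions)
    return {
        a: [b for b in ids
            if b != a and within_range(positions[a], positions[b], max_range_km)]
        for a in ids
    }

# Illustrative 2-D positions in km for three neighboring UAVs
positions = {"102a": (0, 0), "102b": (40, 0), "102c": (120, 0)}
links = neighbors(positions, max_range_km=50)
```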
  • a controller may be referred to as a piece of hardware and/or software configured to control communications within UAV network 100.
  • the controller can be provided by a ground processing station, such as ground controller 110a, 110b, or 110c.
  • the controller 110 can be implemented by a computer server housed in a ground processing station.
  • the controller 110 can be provided by a UAV 102 in the UAV network 100.
  • a given UAV 102, such as an unmanned helicopter or a balloon, in the UAV network 100 can carry payloads including a processor configured to implement the controller 110.
  • the controller 110 can be configured to determine network requirements based on an application supported by UAV network 100, and/or to perform any other operations.
  • control signals can be transmitted via a control link from the controller 110 to the UAVs 102 shown in FIG. 1.
  • an important criterion for a UAV 102 in the network is altitude.
  • as the altitude of a UAV 102 increases, the signals emitted by the UAV 102 become weaker.
  • a UAV 102 flying at an altitude of 65,000 feet can cover an area up to 100 kilometers on the ground, but the signal loss can be significantly higher than would occur for a terrestrial network.
  • Radio signals typically require a large amount of power for transmission over long distances.
  • the payload that can be carried by a UAV 102 that stays in the air for an extended period of time is limited.
  • solar energy can be used to power the UAV 102. However, this limits the weight of payloads that can be carried by a UAV 102 due to the limited rate at which solar irradiation can be absorbed and converted to electricity.
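The altitude-versus-signal-loss trade-off above can be made concrete with the standard free-space path loss formula. The frequency and distances below are illustrative assumptions chosen only to compare a high-altitude UAV link with a short terrestrial hop; they are not figures from the disclosure.

```python
# Back-of-the-envelope free-space path loss (FSPL), illustrating why signal
# loss from a high-altitude UAV exceeds that of a short terrestrial hop.
import math

def fspl_db(distance_m, freq_hz):
    """FSPL in dB: 20*log10(d) + 20*log10(f) - 20*log10(c / (4*pi))."""
    c = 299_792_458.0  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            - 20 * math.log10(c / (4 * math.pi)))

# 65,000 ft is roughly a 19.8 km slant range; compare with a 1 km ground cell,
# both at an assumed 2.4 GHz carrier.
loss_uav = fspl_db(19_800, 2.4e9)
loss_ground = fspl_db(1_000, 2.4e9)
extra_loss = loss_uav - loss_ground  # roughly 26 dB more loss from altitude
```

The roughly 26 dB gap is why long-range UAV links need high transmit power, which in turn collides with the limited payload and solar-power budget noted above.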
  • Free-space optical communication is an optical communication technology that transmits light in free space to wirelessly transmit data for telecommunications.
  • Commercially available FSO systems use wavelengths close to the visible spectrum, around 850 to 1550 nm.
  • two FSO transceivers can be placed at both ends of a transmission path that has an unobstructed line of sight between the two FSO transceivers.
  • a variety of light sources can be used for the transmission of data using FSO transceivers. For example, LED and laser can be used to transmit data in a FSO system.
  • a FSO unit can be included in the payloads of a UAV 102 for communication.
  • the FSO unit can include an optical transceiver with a laser transmitter and a receiver to provide full duplex bi-directional capability.
  • the FSO unit can use a high-power optical source, i.e., laser, and a lens to transmit the laser beam through the atmosphere to another lens receiving the information embodied in the laser beam.
  • the receiving lens can connect to a high-sensitivity receiver via optical fiber.
  • the FSO unit included in a UAV 102 in accordance with the disclosure can enable optical transmission at speeds up to 10 Gbps.
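Some quick arithmetic on the quoted 10 Gbps figure shows why an FSO link comfortably supports the image relay described later. The frame size below is an illustrative assumption, not a figure from the disclosure.

```python
# Time to move one captured cabin image over the quoted 10 Gbps FSO link.
def transfer_time_s(payload_bytes, link_bps):
    """Seconds to move a payload over a link at the given bit rate."""
    return payload_bytes * 8 / link_bps

image_bytes = 5 * 1024 * 1024           # assumed 5 MiB wide-view frame
t = transfer_time_s(image_bytes, 10e9)  # well under 10 ms per frame
```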
  • a given vehicle 106 can be equipped with communication hardware.
  • the communication hardware in the given vehicle 106 can include a FSO unit described above, a radio transceiver, and/or any other type of communication hardware.
  • the communication hardware included in the vehicle 106 can be used to establish a communication channel between the vehicles 106 via the UAVs 102.
  • a controller 110 can include a FSO unit configured to establish a communication channel with the FSO unit of a UAV 102 through a laser beam. Through the communication channel, UAV 102 can be configured to communicate its geo-location to controller 110. Since ground controller 110 is stationary, the geo-location of ground controller 110 can be preconfigured into an onboard computer in UAVs 102.
  • the ground controller 110 can be connected to a wired or wireless network.
  • Information intended for vehicle 106 can be communicated through the wired or wireless network from or to another entity connected to the wired or wireless network.
  • the information intended for vehicle 106 can be first communicated to the UAV 102 through laser beam, and the UAV 102 can forward the information to vehicle 106 through laser beam 204a.
  • a tracking signal can be transmitted from UAV 102 for tracking vehicle 106.
  • the tracking signal can be in various forms.
  • the UAV 102 may scan the covered area 104 with a camera onboard UAV 102 in a pre-determined pattern.
  • the UAV 102 may scan the covered area 104 in a scan-line fashion from one corner of the covered area 104 to the opposite corner of the covered area 104.
  • the UAV 102 may scan the covered area 104 in a concentric sphere fashion starting from an outer sphere within the covered area 104, gradually into inner spheres within the covered area 104 until the center of the covered area 104.
  • the UAV 102 may scan the covered area along predefined lines within area 104, for example a portion of a road that enters area 104 and another portion of the road that exits area 104.
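The scan-line pattern described above can be sketched as a corner-to-corner "lawnmower" sweep over the covered area. The grid dimensions and step are illustrative assumptions; the disclosure does not fix a particular waypoint spacing.

```python
# Sketch of the scan-line pattern: sweep a rectangular covered area from
# one corner toward the opposite corner, reversing direction on each row.
def scan_line_waypoints(width, height, step):
    """Corner-to-corner boustrophedon sweep over a rectangular area."""
    waypoints = []
    for row, y in enumerate(range(0, height + 1, step)):
        xs = list(range(0, width + 1, step))
        if row % 2:            # reverse every other row to avoid dead transits
            xs.reverse()
        waypoints.extend((x, y) for x in xs)
    return waypoints

path = scan_line_waypoints(width=4, height=2, step=2)
```

The concentric variant mentioned above would instead generate rings of waypoints shrinking toward the center of area 104.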
  • the UAV 102 may carry a radio transmitter configured to broadcast radio signals within the covered area 104.
  • the broadcast radio signals can serve as tracking signals such that once they are intercepted by a vehicle 106 passing through the covered area 104, the UAV 102 can be configured to locate a position of the vehicle 106 within the covered area 104.
  • An identification of the vehicle 106 can be captured after the vehicle 106 has been tracked by UAV 102.
  • the identification of the vehicle 106 can be captured by a camera carried by the UAV 102.
  • the UAV 102 may be configured to capture a picture of a license plate of vehicle 106 once it has been tracked.
  • the UAV 102 may be configured to transmit a request to vehicle 106 to inquire about its identification, and the vehicle 106 can send its identification to the UAV 102 in response to the request.
  • any one of the UAVs 102 shown in FIG. 1 may be instructed to “monitor” or “zoom-in onto” a corresponding vehicle 106.
  • the UAV 102a may receive location information regarding vehicle 106a and instructions to zoom-in onto vehicle 106a.
  • the UAV 102a may be configured to track vehicle 106a based on the received location information. This may involve moving the UAV 102a into a vicinity of vehicle 106a such that UAV 102a has a clear view of vehicle 106a.
  • the instructions received by UAV 102a may include capturing one or more images of an interior of vehicle 106a.
  • UAV 102a may be equipped with one or more cameras.
  • the camera (s) carried by UAV 102a may include a wide-view camera capable of capturing a wide field of view.
  • the wide-view camera carried by UAV 102a is an omnidirectional camera with a 360-degree field of view in a horizontal plane, or with a visual field that covers (approximately) the entire sphere.
  • the cameras carried by UAV 102a may include multiple cameras fixed at corresponding locations on an underbody of UAV 102a. In one embodiment, the multiple cameras may be arranged on the underbody of UAV 102a to form a ring. In one configuration, 8 cameras are used to form such a ring. One or more of those cameras can be employed to capture the interior of vehicle 106a depending on a distance between UAV 102a and vehicle 106a, an angle between the two, and/or any other factors. For example, three cameras in the ring may be employed by UAV 102a to capture images of the interior of vehicle 106a from different angles. In some implementations, individual cameras carried by UAV 102a may have panoramic view capability. For example, various types of panoramic view cameras may be carried by UAV 102a, including short rotation, full rotation, fixed lens, and any other types of panoramic view cameras.
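Selecting three cameras from the 8-camera ring, as in the example above, could work by picking the camera best aligned with the vehicle's bearing plus its two ring neighbors. The evenly spaced ring layout and the selection rule are assumptions for illustration; the disclosure only says the choice depends on distance, angle, and other factors.

```python
# Illustrative camera selection from an assumed 8-camera ring on the
# UAV's underbody: best-aligned camera plus its two neighbors.
RING = [i * 45.0 for i in range(8)]  # assumed facing angles, every 45 degrees

def pick_cameras(bearing_deg):
    """Indices of the best-aligned camera and its two ring neighbors."""
    # smallest absolute angular difference, wrapped to [-180, 180)
    diffs = [abs((bearing_deg - a + 180) % 360 - 180) for a in RING]
    best = diffs.index(min(diffs))
    return [(best - 1) % 8, best, (best + 1) % 8]

chosen = pick_cameras(bearing_deg=100.0)  # vehicle roughly east-southeast
```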
  • FIG. 2 conceptually illustrates facilitating targeted delivery of information to a transportation apparatus using UAVs in accordance with the disclosure.
  • individual UAVs 102 in the UAV network 100 can be instructed to capture one or more images of an interior of a vehicle 106 as described above.
  • UAV 102a, on a request, can be positioned such that it captures the one or more images of the interior of vehicle 106a.
  • the UAV 102a may be configured to detect the vehicle 106a when vehicle 106a enters the area 104a covered by UAV 102a.
  • In response to detecting that vehicle 106a has entered area 104a, UAV 102a can be configured to position itself such that UAV 102a has a clear line of sight with respect to vehicle 106a. In some implementations, the position of UAV 102a with respect to vehicle 106a can be adjusted based on the images of vehicle 106a as captured by UAV 102a. For instance, UAV 102a, controller 110a and/or processing center 202 can be configured to determine a quality of the images captured by UAV 102a. In that instance, when the images are determined not to show a good view of the interior of vehicle 106a, the UAV 102a can be instructed to reposition itself until acceptable images of the interior of vehicle 106a are received.
  • This may involve instructing the UAV 102a to adjust its angle, distance, speed, and/or any other aspects with respect to vehicle 106a.
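The capture-assess-reposition cycle described above amounts to a feedback loop. The sketch below uses a stand-in quality score and adjustment callback; the disclosure does not specify how image quality is scored or which of angle, distance, or speed is adjusted first.

```python
# Schematic repositioning loop: keep adjusting the UAV's standoff until the
# captured frame scores above a quality threshold. The quality model and
# threshold are stand-in assumptions.
def reposition_until_clear(capture, adjust, threshold=0.8, max_tries=10):
    """Capture, score, and nudge the UAV until the cabin view is acceptable."""
    for attempt in range(max_tries):
        frame_quality = capture()
        if frame_quality >= threshold:
            return attempt        # number of adjustments that were needed
        adjust()                  # change angle/distance/speed and retry
    raise RuntimeError("no clear view of the cabin within max_tries")

# Toy stand-ins: quality starts at 0.4 and improves by 0.3 per adjustment
state = {"q": 0.4}
tries = reposition_until_clear(
    capture=lambda: state["q"],
    adjust=lambda: state.__setitem__("q", state["q"] + 0.3),
)
```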
  • an instruction may be generated by the processing center 202 and transmitted to UAV 102a through the UAV network 100 via the controller 110a.
  • UAV 102a can be configured to transmit the captured images of vehicle 106a to processing center 202 through UAV network 100.
  • the images of the vehicle 106a may be first transmitted to controller 110a on the ground.
  • the image transmission from UAV 102a to the controller 110 may vary.
  • the image data may be first transmitted from UAV 102a to another UAV in the UAV network 100.
  • the other UAV may have more computing power or capability than UAV 102a, which may be a lightweight UAV configured to follow moving vehicles and to capture images of interiors of moving vehicles.
  • the UAV with more computing power can be used as a relay station to relay image data from UAV 102a to controller 110a.
  • the image data may be transmitted to more than one UAV in the network 100 before it reaches the controller 110a.
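Relaying image data across one or more intermediate UAVs to the ground controller, as described above, is a shortest-path problem over the direct-link graph. The breadth-first search below is one plausible way to pick the hop sequence; the node names mirror the figure's labels, and the link topology is an illustrative assumption.

```python
# Minimal multi-hop relay sketch: breadth-first search over the UAV
# direct-link graph yields a hop path from the capturing UAV to the
# ground controller.
from collections import deque

def relay_path(links, src, dst):
    """Shortest hop sequence from a capturing UAV to the ground controller."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no route to the controller

links = {
    "102a": ["102b"],            # lightweight capture UAV
    "102b": ["102a", "102c"],    # higher-capability relay UAV
    "102c": ["102b", "110a"],    # relay within reach of the controller
    "110a": ["102c"],            # ground controller 110a
}
path = relay_path(links, "102a", "110a")
```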
  • the controller 110a may be configured to 1) communicate control instructions with processing center 202 and with the UAV 102a; 2) receive image data from UAV 102a; 3) transmit the image data from UAV 102a to the processing center 202; and/or to perform any other operations. However, it should be understood that in some other embodiments, transmitting image data through controller 110a may not be necessary. In those embodiments, the image data can be transmitted from UAV 102a to the processing center 202 via the UAV network 100 without going through controller 110a.
  • the processing center 202 can be configured to process the vehicle images received from UAV 102a to obtain vehicle information related to the vehicle 106a. For example, in response to receiving the vehicle images captured by the UAV 102a, the processing center 202 can be configured to obtain information regarding the vehicle 106a as captured in the images. For example, the images can contain license plate information indicating a license number of vehicle 106a.
  • the processing center 202 can obtain certain information regarding the vehicle 106a, such as the make of vehicle 106a, one or more presentation capabilities of vehicle 106a (e.g., audio, video, multimedia presentation capabilities: does the vehicle 106a have a display device, how many display devices does vehicle 106a have, what type of display devices does vehicle 106a have and/or any other capability information) , one or more communication channels with the vehicle 106a (e.g., an internet address of the one or more display devices equipped within vehicle 106a, a telephone number of vehicle 106a) , and/or any other information related to vehicle 106a.
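One plausible sketch of this lookup step is a query keyed by the license plate number recognized in the images; the registry contents and field names here are hypothetical:

```python
# Sketch of a license-plate lookup against a vehicle registration store.
# The registry contents and field names are illustrative assumptions.

VEHICLE_REGISTRY = {
    "ABC-1234": {
        "make": "Toyota Corolla 2014",
        "displays": [{"type": "video", "address": "10.0.0.21"}],
        "phone": "+1-555-0100",
    },
}

def lookup_vehicle(plate):
    """Return presentation capabilities and communication channels for a plate,
    or None when the plate is not registered."""
    record = VEHICLE_REGISTRY.get(plate)
    if record is None:
        return None
    return {
        "make": record["make"],
        "display_count": len(record["displays"]),
        "channels": [d["address"] for d in record["displays"]] + [record["phone"]],
    }
```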
  • the processing center 202 can be configured to analyze the images and to obtain passenger information and/or driver information related to one or more passengers and/or drivers in vehicle 106a.
  • in response to receiving the images, processing center 202 can be configured to analyze the images by employing image analysis algorithms.
  • the image analysis performed by the processing center 202, in that example, can include analyzing the images to identify the one or more passengers and/or drivers.
  • facial feature analysis may be employed by processing center 202 to extract one or more facial features for each passenger and/or driver in vehicle 106a.
  • the extracted features can be used to match one or more passengers and/or drivers registered for vehicle 106a. Once a match is found, the identity of the passenger and/or driver can be determined, and other information, such as gender, age, user interest, and user experience, can be obtained for the identified driver and/or passenger.
  • the facial features extracted for each passenger can be used to determine a gender of the passenger, an age group of the passenger, and/or any other characteristic information regarding the one or more passengers. For instance, in certain situations, the exact identity of a particular passenger in vehicle 106a may not be readily determinable based on the images received. In that situation, certain characteristic information can still be determined using the facial features, such as that the passenger is a male in the teen age group.
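A minimal sketch of this identify-or-estimate behavior, assuming Euclidean distance over hypothetical feature vectors and an arbitrary matching threshold:

```python
# Sketch: match extracted facial features against occupants registered for the
# vehicle; when no match clears the threshold, fall back to coarse demographics.
# Feature vectors, the distance threshold, and the demographic fields are all
# illustrative assumptions.

def identify_or_estimate(features, registered, threshold=0.5):
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    best_name, best_dist = None, float("inf")
    for name, profile in registered.items():
        d = distance(features["vector"], profile["vector"])
        if d < best_dist:
            best_name, best_dist = name, d
    if best_dist <= threshold:
        return {"identity": best_name, **registered[best_name]["attributes"]}
    # No reliable identity: report only characteristics inferable from features.
    return {"identity": None,
            "gender": features["estimated_gender"],
            "age_group": features["estimated_age_group"]}
```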
  • the processing center 202 can be configured to determine positions of the passengers within vehicle 106a. For example, a position of each passenger with respect to front rows or back rows in vehicle 106a can be determined by analyzing the images.
  • such image analysis may include obtaining information regarding vehicle 106a, such as the number of rows of seats vehicle 106a has, a size of the interior of vehicle 106a, and/or any other information regarding a specification of vehicle 106a.
  • a particular passenger’s position (for example, that passenger A is sitting in the left rear seat) can be determined.
  • the processing center 202 can be configured to determine one or more items for presentation to the one or more passengers and/or drivers based on the information related to the vehicle 106a, and/or the passenger information and/or driver information.
  • the passenger information may indicate vehicle 106a has a particular passenger that is male in his teens.
  • the information related to vehicle 106a may indicate that vehicle 106a has entered area 104a and has travelled within area 104a for a certain time period.
  • the processing center 202 can be configured to determine to push one or more local marketing items such as fast food restaurants available, local youth events (e.g., a county fair, or amusement park) to be presented to that passenger.
  • the processing center 202 can be configured to transmit the items to vehicle 106a for presentation on a display device appropriate for that passenger.
  • the image analysis mentioned above may indicate that the passenger is sitting in the rear left seat, and the information related to vehicle 106a may indicate that the rear left seat has a display device with a specific internet address.
  • the processing center 202 can be configured to transmit the items to the display device through the specific internet address.
  • the transmission of the items by the processing center 202 can be through the UAV network 100.
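The push step might be sketched as addressing the determined items to the display that serves the target passenger's seat; the seat-to-address map and the send hand-off below are assumptions:

```python
# Sketch: route determined items to the display device that serves the target
# passenger's seat. The seat-to-address map and the send stub are assumptions;
# a real system would hand the payload to the UAV-network transport.

SEAT_DISPLAYS = {"rear_left": "10.0.0.21", "rear_right": "10.0.0.22"}

def push_items(seat, items, send=None):
    address = SEAT_DISPLAYS.get(seat)
    if address is None:
        return None  # no display serves that seat
    payload = {"address": address, "items": list(items)}
    if send is not None:
        send(payload)  # hand off to the (hypothetical) transport layer
    return payload
```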
  • the processing center 202 may include one or more of a processor 302 configured to execute program components.
  • the program components may include a transportation apparatus image component 304, a transportation apparatus information component 306, an image analysis component 308, a targeted information component 310, a transmission component 312 and/or any other components.
  • the transportation apparatus image component 304 can be configured to receive one or more images of a transportation apparatus, such as vehicle 106a.
  • the images received by the transportation apparatus image component 304 can include images of an interior of the vehicle 106a captured from different angles by a UAV, such as UAV 102a.
  • the images received by the transportation apparatus image component 304 can include information readily indicating an identity of the vehicle. For example, one or more of the images may indicate a license plate number of vehicle 106a. However, this is not necessarily always the case. In certain situations, the images received by transportation apparatus image component 304 may not contain such information. To address such a situation, the transportation apparatus image component 304 may be configured to generate a control instruction to instruct the UAV, e.g., UAV 102, to recapture the images, and to transmit the control instruction to the UAV 102 via the UAV network 100.
  • the transportation apparatus information component 306 can be configured to obtain information related to the transportation apparatus based on the images received by transportation apparatus image component 304.
  • the images received by transportation apparatus image component 304 may contain information indicating a license number of vehicle 106a.
  • the transportation apparatus information component 306 can be configured to obtain information regarding vehicle 106a based on such license information.
  • the transportation apparatus information component 306 can be configured to make an inquiry for vehicle 106a to a vehicle registration database using the license plate number of vehicle 106a.
  • the information related to vehicle 106a as obtained by transportation apparatus information component 306 may include a make of vehicle 106a (e.g., Toyota Corolla 2014, Hyundai Accord 2016, etc.), one or more presentation capabilities of vehicle 106a (e.g., audio, video, multimedia presentation capabilities: does the vehicle 106a have a display device, how many display devices does vehicle 106a have, what type of display devices does vehicle 106a have, where is each display located within vehicle 106a if vehicle 106a has more than one display, and/or any other capability information), one or more communication channels supported by vehicle 106a, one or more multimedia formats supported by vehicle 106a, and/or any other information related to the vehicle 106a.
  • the information related to vehicle 106a may include information indicating that vehicle 106a has 3 display devices capable of presenting audio, video and animation, with the first display device being located on a dashboard of vehicle 106, the second display device being located on the back of a left front seat and the third display device being located on the back of a right front seat.
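The capability record described above might be represented roughly as follows; the field names are illustrative, not a disclosed schema:

```python
# Illustrative structure for the capability record described above: three
# display devices with locations and supported media. Field names are assumed.

VEHICLE_106A = {
    "displays": [
        {"location": "dashboard", "formats": ["audio", "video", "animation"]},
        {"location": "back_of_left_front_seat", "formats": ["audio", "video", "animation"]},
        {"location": "back_of_right_front_seat", "formats": ["audio", "video", "animation"]},
    ]
}

def displays_supporting(vehicle, fmt):
    """List the locations of displays that can present the given format."""
    return [d["location"] for d in vehicle["displays"] if fmt in d["formats"]]
```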
  • the information related to vehicle 106a as obtained by transportation apparatus information component 306 can include information indicating various statistics about vehicle 106a.
  • the information may indicate an area in which vehicle 106a is traveling, such as area 104a, how long vehicle 106a has been traveling within the area (e.g., 5 minutes), on which road vehicle 106a is travelling, a speed of vehicle 106a, towards which area vehicle 106a is traveling (e.g., area 104b), a size of vehicle 106a, and/or any other statistical information about vehicle 106a.
  • the image analysis component 308 can be configured to analyze the images received by transportation apparatus image component 304 and to obtain passenger information and/or driver information related to one or more passengers and/or drivers in vehicle 106a.
  • in response to the images being received by transportation apparatus image component 304, image analysis component 308 can be configured to analyze the images by employing image analysis algorithms.
  • the image analysis performed by image analysis component 308 can include analyzing the images to identify the one or more passengers and/or drivers in vehicle 106a.
  • facial feature analysis may be employed to extract one or more facial features for each passenger and/or driver in vehicle 106a.
  • the extracted features can be used to match one or more passengers and/or drivers registered for vehicle 106a.
  • the identity of the passenger and/or driver can be determined by image analysis component 308 and other information such as gender, age, user interest, user experience can be obtained for the identified driver and/or passenger.
  • the facial features extracted by image analysis component 308 for each passenger can be used to determine a gender of the passenger, an age group of the passenger, and/or any other characteristic information regarding the one or more passengers. For instance, in certain situations, the exact identity of a particular passenger in vehicle 106a may not be readily determinable by image analysis component 308. In that situation, certain characteristic information can still be determined by image analysis component 308 using the facial features, such as that the passenger is a male in the teen age group. In some implementations, the image analysis component 308 can be configured to determine positions of the passengers within vehicle 106a.
  • a position of each passenger with respect to front rows or back rows in vehicle 106a can be determined by image analysis component 308 by analyzing and correlating contents in the images.
  • image analysis may include obtaining information regarding vehicle 106a, such as the number of rows of seats vehicle 106a has, a size of the interior of vehicle 106a, and/or any other information regarding a specification of vehicle 106a as obtained by transportation apparatus information component 306.
  • a particular passenger’s position (for example, that passenger A is sitting in the left rear seat) may be determined.
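One way to sketch this seat inference is to quantize a passenger's detected cabin coordinates against the vehicle's row layout; the normalized coordinate convention and the two-row default are assumptions:

```python
# Sketch: map a passenger's detected position in the cabin image to a named
# seat using the vehicle's seat layout. Normalized coordinates and the default
# two-row layout are illustrative assumptions.

def seat_for_position(x, y, rows=2):
    """x, y are normalized cabin coordinates: x in [0, 1] left-to-right,
    y in [0, 1] front-to-rear. Returns a seat name such as 'rear_left'."""
    row = "front" if y < 1.0 / rows else "rear"
    side = "left" if x < 0.5 else "right"
    return f"{row}_{side}"
```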
  • the targeted information component 310 can be configured to determine one or more items for presentation to the one or more passengers and/or drivers based on the information related to the vehicle 106a, and/or the passenger information and/or driver information.
  • the passenger information may indicate vehicle 106a has a particular passenger that is male in his 20s.
  • the information related to vehicle 106a may indicate that vehicle 106a has entered area 104a and has travelled within area 104a for a certain time period.
  • the targeted information component 310 can be configured to determine to push one or more local marketing items such as fast food restaurants available, local youth events (e.g., a county fair, or amusement park) to be presented to that passenger based on a general interest manifested by males in that age group.
  • the targeted information component 310 can be configured to obtain general interests for various age groups and determine the one or more items for presentation to the passenger (s) and/or driver in vehicle 106a based on the obtained general interests for the age groups.
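A hedged sketch of selecting items from general age-group interests follows; both tables are made-up examples, not data from the disclosure:

```python
# Sketch: select push items by intersecting a general-interest table for age
# groups with what is locally available. Both tables are illustrative.

AGE_GROUP_INTERESTS = {
    "teens": {"fast_food", "amusement_park", "county_fair"},
    "20s": {"fast_food", "music", "county_fair"},
}

def items_for(age_group, local_items):
    """local_items: iterable of (item, category) pairs for the current area."""
    interests = AGE_GROUP_INTERESTS.get(age_group, set())
    return sorted(item for item, category in local_items if category in interests)
```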
  • the targeted information component 310 can be configured to retrieve local information in response to the location of vehicle 106a being obtained.
  • the information related to vehicle 106a as obtained by transportation apparatus information component 306 can include information indicating vehicle 106a is in area 104a.
  • the targeted information component 310 can obtain local information related to area 104a, such as local attraction information, local commercial information, local events information, local traffic condition information, and/or any other local information.
  • the targeted information component 310 can be configured to select a subset of the local information for presentation to the passenger (s) and/or driver (s) in the vehicle 106a.
  • the driver information determined by image analysis component 308 may identify a specific driver registered with vehicle 106a. Based on that information, the specific driver’s interest towards certain activities can be obtained by targeted information component 310, for example from a database storing such information. Once obtaining user interest information, targeted information component 310 can be configured to determine one or more items to be pushed to the driver of vehicle 106a for presentation.
  • the items may include one or more local events of interest to the driver (for example, if the driver has a known interest in music, a local music festival taking place in area 104a can be pushed to the driver).
  • the items may include one or more local attractions of interest to the driver (for example, if the driver has a known interest in wine, one or more local wineries may be pushed to the driver) .
  • certain local Italian restaurants may be pushed to the driver if the driver has a known interest in Italian food.
  • Other items that may be pushed to the driver and/or passenger (s) by targeted information component 310 may include local news stories, weather information, relevant traffic condition information, local commercial information (e.g., locations and hours of operation of retail stores and/or malls), local ongoing events, and/or any other items.
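The driver-interest selection described above might be sketched as filtering an area's local catalog by a stored interest profile; the profile store and the catalog for area 104a are hypothetical:

```python
# Sketch: once a registered driver is identified, pull their stored interests
# and filter the area's local information down to matching items. The profile
# store and the local catalog are illustrative assumptions.

DRIVER_PROFILES = {"driver_42": {"interests": {"music", "wine"}}}

AREA_LOCAL_INFO = {
    "104a": [
        {"item": "music festival", "tags": {"music", "event"}},
        {"item": "winery tour", "tags": {"wine", "attraction"}},
        {"item": "hardware expo", "tags": {"trade", "event"}},
    ],
}

def select_local_items(driver_id, area):
    interests = DRIVER_PROFILES.get(driver_id, {}).get("interests", set())
    # Keep entries whose tags intersect the driver's interests.
    return [e["item"] for e in AREA_LOCAL_INFO.get(area, []) if e["tags"] & interests]
```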
  • the transmission component 312 can be configured to transmit the items to vehicle 106a for presentation on a display device appropriate for a passenger.
  • the transmission component 312 can be configured to determine a format in which the items are to be presented on the display device.
  • the passenger information as determined by image analysis component 308 may indicate that the passenger is sitting in the rear left seat, and the information related to vehicle 106a may indicate that the rear left seat has a display device with a specific internet address that is capable of presenting video.
  • the transmission component 312 can be configured to transmit the items to that display device through the specific internet address and generate a control instruction instructing that display device to present the items in video.
  • the transmission of the items by transmission component 312 can be through the UAV network 100.
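Format determination plus the accompanying control instruction might look roughly like the following; the capability fields and the format preference order are assumptions:

```python
# Sketch: pick a presentation format the target display supports and wrap the
# items with a control instruction for that display. Capability fields and the
# preference order are illustrative assumptions.

FORMAT_PREFERENCE = ["video", "animation", "audio", "text"]

def build_transmission(items, display):
    """display: {'address': ..., 'formats': [...]}. Returns the message to send
    over the UAV network, or None when no supported format exists."""
    fmt = next((f for f in FORMAT_PREFERENCE if f in display["formats"]), None)
    if fmt is None:
        return None
    return {"address": display["address"],
            "control": {"present_as": fmt},
            "items": list(items)}
```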
  • FIG. 4 illustrates an exemplary method 400 for facilitating targeted delivery of information to a transportation apparatus.
  • the particular series of processing steps depicted in FIG. 4 is not intended to be limiting. It is appreciated that the processing steps may be performed in an order different from that depicted in FIG. 4 and that not all the steps depicted in FIG. 4 need be performed.
  • the method 400 may be implemented by a video processing center, such as the video processing center shown in FIG. 5.
  • the method depicted in method 400 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information) .
  • the one or more processing devices may include one or more devices executing some or all of the operations of method 400 in response to instructions stored electronically on an electronic storage medium.
  • the one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 400.
  • one or more images of an interior of a transportation apparatus can be received.
  • the images received at 402 can include images of the interior of the transportation apparatus captured from different angles by a UAV, such as UAV 102a.
  • the images received at 402 can include information readily indicating an identity of the transportation apparatus.
  • operations involved in 402 can be implemented by a transportation apparatus image component the same as or substantially similar to transportation apparatus image component 304 illustrated and described herein.
  • information related to the transportation apparatus can be obtained based on the images received at 402.
  • the images received at 402 may contain information indicating a license number of transportation apparatus.
  • the information regarding transportation apparatus can be obtained.
  • the information obtained at 404 may include a make of the transportation apparatus (e.g., Toyota Corolla 2014, Hyundai Accord 2016, etc.), one or more presentation capabilities of the transportation apparatus (e.g., audio, video, multimedia presentation capabilities: does the transportation apparatus have a display device, how many display devices does the transportation apparatus have, what type of display devices does the transportation apparatus have, where is each display located within the transportation apparatus if the transportation apparatus has more than one display, and/or any other capability information), one or more communication channels supported by the transportation apparatus, one or more multimedia formats supported by the transportation apparatus, and/or any other information related to the transportation apparatus.
  • the information related to the transportation apparatus as obtained at 404 can include information indicating various statistics about the transportation apparatus.
  • the information may indicate an area in which the transportation apparatus is traveling, such as area 104a, how long the transportation apparatus has been traveling within the area (e.g., 5 minutes), on which road the transportation apparatus is travelling, a speed of the transportation apparatus, towards which area the transportation apparatus is traveling (e.g., area 104b), a size of the transportation apparatus, and/or any other statistical information about the transportation apparatus.
  • operations involved in 404 can be implemented by a transportation apparatus information component the same as or substantially similar to transportation apparatus information component 306 illustrated and described herein.
  • the images received at 402 can be analyzed to obtain passenger information regarding one or more passengers and/or driver information regarding one or more drivers in the transportation apparatus.
  • the image analysis performed at 406 can include analyzing the images to identify the one or more passengers and/or drivers in the transportation apparatus.
  • facial feature analysis may be employed at 406 to extract one or more facial features for each passenger and/or driver in transportation apparatus.
  • the extracted features can be used to match one or more passengers and/or drivers registered for the transportation apparatus. Once a match is found, the identity of the passenger and/or driver can be determined, and other information, such as gender, age, user interest, and user experience, can be obtained for the identified driver and/or passenger.
  • the facial features extracted at 406 for each passenger can be used to determine a gender of the passenger, an age group of the passenger, and/or any other characteristic information regarding the one or more passengers.
  • operations involved in 406 can be implemented by an image analysis component the same as or substantially similar to the image analysis component 308 illustrated and described herein.
  • one or more items can be determined for presentation to the passenger (s) and/or driver in the transportation apparatus based on the information related to the transportation apparatus as obtained at 404, and the passenger and/or driver information obtained at 406.
  • operations involved in 408 can be implemented by a targeted information component the same as or substantially similar to targeted information component 310 illustrated and described herein.
  • the one or more items determined at 408 can be transmitted to the transportation apparatus for presentation to the passenger (s) and/or driver in the transportation apparatus.
  • operations involved in 410 can be implemented by a transmission component the same as or substantially similar to transmission component 312 illustrated and described herein.
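The five steps of method 400 can be sketched as a pipeline of pluggable stages; the stage callables below are stubs supplied by the caller, not disclosed implementations:

```python
# Sketch of method 400 end to end: receive images (402), obtain vehicle info
# (404), analyze occupants (406), determine items (408), transmit (410).
# Each stage is a pluggable callable; the wiring below is illustrative.

def method_400(images, get_vehicle_info, analyze_occupants, determine_items, transmit):
    vehicle_info = get_vehicle_info(images)           # 404
    occupants = analyze_occupants(images)             # 406
    items = determine_items(vehicle_info, occupants)  # 408
    return transmit(vehicle_info, items)              # 410
```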
  • FIG. 5 illustrates a simplified computer system that can be used to implement various embodiments described and illustrated herein.
  • a computer system 500 as illustrated in FIG. 5 may be incorporated into devices such as a portable electronic device, mobile phone, or other device as described herein.
  • FIG. 5 provides a schematic illustration of one embodiment of a computer system 500 that can perform some or all of the steps of the methods provided by various embodiments. It should be noted that FIG. 5 is meant only to provide a generalized illustration of various components, any or all of which may be utilized as appropriate. FIG. 5, therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
  • the computer system 500 is shown comprising hardware elements that can be electrically coupled via a bus 505, or may otherwise be in communication, as appropriate.
  • the hardware elements may include one or more processors 510, including without limitation one or more general-purpose processors and/or one or more special-purpose processors such as digital signal processing chips, graphics acceleration processors, and/or the like; one or more input devices 515, which can include without limitation a mouse, a keyboard, a camera, and/or the like; and one or more output devices 520, which can include without limitation a display device, a printer, and/or the like.
  • the computer system 500 may further include and/or be in communication with one or more non-transitory storage devices 525, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory ( “RAM” ) , and/or a read-only memory ( “ROM” ) , which can be programmable, flash-updateable, and/or the like.
  • RAM random access memory
  • ROM read-only memory
  • Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like.
  • the computer system 500 might also include a communications subsystem 530, which can include without limitation a modem, a network card (wireless or wired), an infrared communication device, a wireless communication device, and/or a chipset such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, cellular communication facilities, etc., and/or the like.
  • the communications subsystem 530 may include one or more input and/or output communication interfaces to permit data to be exchanged with a network such as the network described below to name one example, other computer systems, television, and/or any other devices described herein.
  • a portable electronic device or similar device may communicate image and/or other information via the communications subsystem 530.
  • a portable electronic device e.g. the first electronic device
  • the computer system 500 may further comprise a working memory 535, which can include a RAM or ROM device, as described above.
  • the computer system 500 also can include software elements, shown as being currently located within the working memory 535, including an operating system 540, device drivers, executable libraries, and/or other code, such as one or more application programs 545, which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein.
  • one or more procedures described with respect to the methods discussed above, such as those described in relation to FIG. 4, might be implemented as code and/or instructions executable by a computer and/or a processor within a computer; in an aspect, then, such code and/or instructions can be used to configure and/or adapt a general purpose computer or other device to perform one or more operations in accordance with the described methods.
  • a set of these instructions and/or code may be stored on a non-transitory computer-readable storage medium, such as the storage device (s) 525 described above.
  • the storage medium might be incorporated within a computer system, such as computer system 500.
  • the storage medium might be separate from a computer system e.g., a removable medium, such as a compact disc, and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon.
  • These instructions might take the form of executable code, which is executable by the computer system 500, and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer system 500 (e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc.), then takes the form of executable code.
  • some embodiments may employ a computer system such as the computer system 500 to perform methods in accordance with various embodiments of the technology. According to a set of embodiments, some or all of the procedures of such methods are performed by the computer system 500 in response to processor 510 executing one or more sequences of one or more instructions, which might be incorporated into the operating system 540 and/or other code, such as an application program 545, contained in the working memory 535. Such instructions may be read into the working memory 535 from another computer-readable medium, such as one or more of the storage device (s) 525. Merely by way of example, execution of the sequences of instructions contained in the working memory 535 might cause the processor (s) 510 to perform one or more procedures of the methods described herein. Additionally or alternatively, portions of the methods described herein may be executed through specialized hardware.
  • machine-readable medium and “computer-readable medium, ” as used herein, refer to any medium that participates in providing data that causes a machine to operate in a specific fashion.
  • various computer-readable media might be involved in providing instructions/code to processor (s) 510 for execution and/or might be used to store and/or carry such instructions/code.
  • a computer-readable medium is a physical and/or tangible storage medium.
  • Such a medium may take the form of a non-volatile media or volatile media.
  • Non-volatile media include, for example, optical and/or magnetic disks, such as the storage device (s) 525.
  • Volatile media include, without limitation, dynamic memory, such as the working memory 535.
  • Common forms of physical and/or tangible computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read instructions and/or code.
  • Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to the processor (s) 510 for execution.
  • the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer.
  • a remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer system 500.
  • the communications subsystem 530 and/or components thereof generally will receive signals, and the bus 505 then might carry the signals and/or the data, instructions, etc. carried by the signals to the working memory 535, from which the processor (s) 510 retrieves and executes the instructions.
  • the instructions received by the working memory 535 may optionally be stored on a non-transitory storage device 525 either before or after execution by the processor (s) 510.
  • configurations may be described as a process which is depicted as a schematic flowchart or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure.
  • examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.

Abstract

Embodiments are provided for delivering information to a transportation apparatus via a UAV network. After the transportation apparatus enters an area, one or more UAVs may be configured to capture one or more images of an interior of the transportation apparatus. In response to the images being received, information regarding the transportation apparatus, such as the make and multimedia presentation capabilities of the transportation apparatus, can be obtained. Image analysis may be employed to analyze the images to obtain passenger and/or driver information. Based on the information regarding the transportation apparatus and the passenger and/or driver information, one or more items can be determined for presentation to the passenger (s) and/or driver (s) in the transportation apparatus. The one or more items may include local information of interest to the passenger (s) and driver (s). The one or more items can be transmitted to the transportation apparatus for presentation.

Description

FACILITATING TARGETED INFORMATION DELIVERY THROUGH A UAV NETWORK
CROSS-REFERENCE TO RELATED APPLICATIONS
The present application claims priority to U.S. Provisional Patent Application No. 62/274,112, filed on December 31, 2015, the disclosure of which is hereby incorporated by reference in its entirety for all purposes.
The present application is related to the following co-pending U.S. Nonprovisional Patent Applications: U.S. Nonprovisional Application No. 15/341,809 (Attorney Docket No. 101534-0969605 (004920US) filed concurrently herewith; U.S. Nonprovisional Application No. 15/341,813 (Attorney Docket No. 101534-0969607 (004930US) filed concurrently herewith; U.S. Nonprovisional Application No. 15/341,824 (Attorney Docket No. 101534-0969608 (004950US) filed concurrently herewith; and U.S. Nonprovisional Application No. 15/341,831 (Attorney Docket No. 101534-0969609 (004960US) filed concurrently herewith. The entire disclosures of each of these applications are hereby incorporated by reference in their entireties for all purposes.
BACKGROUND
The present disclosure relates to targeted delivery of information, in particular targeted delivery of information to a transportation apparatus via a UAV network.
An unmanned aerial vehicle (UAV), commonly known as a drone and also referred to by several other names, is an aircraft without a human pilot aboard. The flight of UAVs may be controlled either autonomously by onboard computers or by the remote control of a pilot on the ground or in another vehicle. UAVs have mostly found military and special-operation applications, but they are also increasingly finding uses in civil applications, such as policing, surveillance and firefighting, and in nonmilitary security work, such as inspection of power lines or pipelines. UAVs are adept at gathering an immense amount of visual information and displaying it to human operators. However, it can take a great deal of time and manpower to interpret the information gathered by UAVs. In many cases, the information gathered by UAVs is misinterpreted by human operators and analysts who have a limited time window in which to interpret it.
Targeted delivery of information to a computing device is generally known in the art. Conventional technologies typically deliver specific information to the computing device based on a set of user activities and/or a user profile stored on the computing device. For example, a user's interest toward a certain subject may be assessed by the conventional technologies based on the user activities and the user profile stored on the computing device. Based on the assessed user interest, the conventional technologies can then “push” news stories or web items to the computing device for presentation on that computing device.
SUMMARY
Embodiments are provided for delivering information to a transportation apparatus via a UAV network. After the transportation apparatus enters an area, one or more UAVs may be configured to identify and track the transportation apparatus. Information regarding the transportation apparatus can be collected by the UAV(s). The information can include an identification of the transportation apparatus, one or more drivers and/or passengers identified in the transportation apparatus, a direction and/or a road the transportation apparatus is traveling in, and/or any other information regarding the transportation apparatus. Such information regarding the transportation apparatus as captured by the UAV(s) can be transmitted to a processing center. The processing center can be configured to analyze the information regarding the transportation apparatus to determine one or more categories of information to be transmitted to the transportation apparatus for presentation. The one or more categories of information can include local information, such as local news, local commercial information, local site attraction information related to the area the transportation apparatus is traveling in, local marketing information that might be of interest to the drivers and/or passengers identified in the transportation apparatus, and/or any other information.
In some implementations, the UAVs may be configured to capture one or more images of an interior of the transportation apparatus and transmit the images to the processing center. The processing center may be configured to process the images to identify one or more drivers and/or passengers in the transportation apparatus. Such image processing by the processing center may involve identifying the number of drivers and/or passengers in the transportation apparatus, their gender, their specific identity (e.g., names), their positions in the transportation apparatus, and/or any other information regarding the drivers and/or passengers in the transportation apparatus. In some implementations, based on such information, the processing center can be configured to determine one or more items to be transmitted to the transportation apparatus and presented in the transportation apparatus. For example, based on analysis of the images, the processing center may determine there are four people in the transportation apparatus, two in the front row and two in the back row. In that example, the processing center may be further configured to determine an identity of the two people in the front row, i.e., their names, and the gender and age of the two people in the back row. Based on the identified information about the two people in the front row, the processing center may be configured to deliver local site attraction, weather and driving-condition information to the two people in the front row for presentation on a screen located on a dashboard of the transportation apparatus. Based on the identified information about the two people in the back row, the processing center may be configured to deliver local commercial or marketing information for presentation on one or more screens located in the back row.
For presenting the targeted information as determined by the processing center in the transportation apparatus, one or more displays in the transportation apparatus may be equipped with a network connection. For example, the displays can receive multimedia data from the processing center through the network connection. In some implementations, the displays may be operatively connected to a computing device, and the computing device may be configured to receive the multimedia data. In one embodiment, the transportation apparatus is a vehicle. The vehicle may have at least one cabin. In that embodiment, the transportation apparatus is equipped with a wide-view display, such as a dashboard covered by an LCD screen, and individual displays mounted on seats for passengers in one or more back rows.
Other objects and advantages of the invention will be apparent to those skilled in the art based on the following drawings and detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are included to provide a further understanding of the invention, are incorporated in and constitute a part of this specification, illustrate  embodiments of the invention and together with the detailed description serve to explain the principles of the invention. No attempt is made to show structural details of the invention in more detail than may be necessary for a fundamental understanding of the invention and various ways in which it may be practiced.
FIG. 1 illustrates an exemplary UAV network in accordance with the disclosure.
FIG. 2 conceptually illustrates facilitating targeted delivery of information to a transportation apparatus using UAVs in accordance with the disclosure.
FIG. 3 illustrates an example of the processing center shown in FIG. 2.
FIG. 4 illustrates an exemplary method for facilitating targeted delivery of information to a transportation apparatus.
FIG. 5 illustrates a simplified computer system that can be used to implement various embodiments described and illustrated herein.
In the appended figures, similar components and/or features may have the same numerical reference label. Further, various components of the same type may be distinguished by following the reference label by a letter that distinguishes among the similar components and/or features. If only the first numerical reference label is used in the specification, the description is applicable to any one of the similar components and/or features having the same first numerical reference label irrespective of the letter suffix.
DETAILED DESCRIPTION OF THE INVENTION
Various specific embodiments of the present disclosure will be described below with reference to the accompanying drawings constituting a part of this specification. It should be understood that, although structural parts and components of various examples of the present disclosure are described by using terms expressing directions, e.g., “front”, “back”, “upper”, “lower”, “left”, “right” and the like in the present disclosure, these terms are merely used for the purpose of convenient description and are determined on the basis of exemplary directions displayed in the accompanying drawings. Since the embodiments disclosed by the present disclosure may be set according to different directions, these terms expressing directions are merely used for describing rather than limiting. Under possible conditions, identical or similar reference numbers used in the present disclosure indicate identical components.
UAVs are well suited for applications where the payload consists of optical image sensors, such as cameras with powerful lightweight sensors, suited for a variety of commercial applications such as surveillance, video conferencing, vehicle positioning, and/or any other applications. A UAV in accordance with the disclosure can collect multi-spectral imagery of any object in an area covered by the UAV. In certain embodiments, the UAV in accordance with the disclosure can fly up to 65,000 feet and can cover as much as 500 km in range. One motivation of the present disclosure is to employ UAVs to facilitate targeted delivery of information to at least one transportation apparatus, such as an automobile, a bus, or a train. One or more UAVs can be employed to capture video images of an interior of the transportation apparatus, such as a cabin of the transportation apparatus. Since a UAV can be configured to move above the transportation apparatus at a speed consistent with the speed of the transportation apparatus, video images of the transportation apparatus can be captured relatively simply by the UAV while the transportation apparatus moves.
Another advantage of using the UAV to capture video images of a moving transportation apparatus is that a UAV equipped with a wide-view, e.g., 360-degree, camera can be used to capture wide-view video images of an interior of the transportation apparatus so long as there is a clear view of the interior of the transportation apparatus from the UAV. The images can be transmitted from the UAV to a processing center via the UAV network. The processing center can be configured to obtain information regarding the transportation apparatus, such as the make of the transportation apparatus and one or more registration numbers of the transportation apparatus, in response to receiving the images of the transportation apparatus. In some implementations, the processing center can be further configured to analyze the images to obtain passenger information and/or driver information regarding one or more passengers and/or drivers in the transportation apparatus. The passenger information can include information indicating a gender of each passenger, an age group of each passenger, an identity of each passenger, a position of each passenger within the transportation apparatus, and/or any other passenger information. The driver information can include similar information regarding the driver. Based on the passenger information and/or driver information, and the information related to the transportation apparatus, the processing center can be configured to determine one or more items to be presented to the passengers and/or the driver within the transportation apparatus. For example, based on the passenger information, the processing center can determine an age group of the passengers sitting in a back row of the transportation apparatus and determine to present local marketing items that might be of interest to those passengers.
As used herein, a transportation apparatus may be referred to as an apparatus capable of moving across distances for transportation of people and/or goods. Examples of a transportation apparatus may include a vehicle (e.g., a car or truck), a bike, a motorcycle, a train, a ship, a plane or a spaceship, just to name a few. It should be understood that, although a vehicle is used in the examples given below, this is not intended to be limiting. Other types of transportation apparatus may also be used in those examples in some embodiments.
FIG. 1 illustrates an exemplary UAV network 100 for facilitating communications for a vehicle in accordance with the disclosure. As shown, the UAV network 100 can comprise multiple UAVs 102, such as UAVs 102a-f. It should be understood that the UAV network 100, in certain embodiments, can comprise hundreds, thousands, or even tens of thousands of UAVs 102. The individual UAVs 102 in UAV network 100, such as UAV 102a, can fly above the ground at altitudes between 50,000 and 65,000 feet. However, this is not intended to be limiting. In some examples, some or all of the UAVs 102 in the UAV network 100 can fly at hundreds or thousands of feet above the ground. As shown, the individual UAVs 102 in the UAV network 100 can communicate with each other through communication hardware carried by or installed on the UAVs 102. For example, the communication hardware onboard a UAV 102 can include an antenna, a high-frequency radio transceiver, an optical transceiver, and/or any other communication components for long-range communications. A communication channel between any two given UAVs 102 in UAV network 100, for example UAV 102c and UAV 102d, can be established.
One way of establishing a communication channel between any two given UAVs is to have them autonomously establish the communication channel through the communication hardware onboard the two given UAVs 102. In this example, UAVs 102a, 102b and 102c are neighboring UAVs such that they cover neighboring areas 104a, 104b, and 104c, respectively. They can be configured to communicate with each other once they are within a threshold distance. The threshold distance can be the maximum communication range of the transceivers onboard the UAVs 102a, 102b, and 102c. In this way, UAVs 102a, 102b, and 102c can send data to each other without an access point.
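The distance-threshold peering described above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: planar kilometer coordinates and the 100 km range constant are assumptions for the example.

```python
import math

COMM_RANGE_KM = 100.0  # assumed maximum transceiver range; illustrative only


def within_range(uav_a, uav_b, threshold_km=COMM_RANGE_KM):
    """Return True when two UAVs, given as (x, y) positions in km,
    are within the threshold distance and can peer directly."""
    dx = uav_a[0] - uav_b[0]
    dy = uav_a[1] - uav_b[1]
    return math.hypot(dx, dy) <= threshold_km


def discover_peers(positions):
    """Build the set of direct communication links among neighboring UAVs,
    without any access point, by pairwise distance checks."""
    links = set()
    ids = sorted(positions)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            if within_range(positions[a], positions[b]):
                links.add((a, b))
    return links
```

For instance, UAVs 102a and 102b covering adjacent areas would form a link once their separation drops below the transceiver range, while a distant UAV 102c would not.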
Another way of establishing a communication channel between any two given UAVs 102 in UAV network 100 is to have them establish the communication channel through a controller. As used herein, a controller may be referred to as a piece of hardware and/or software configured to control communications within UAV network 100. The controller can be provided by a ground processing station, such as ground controller 110a, 110b, or 110c. For instance, the controller 110 can be implemented by a computer server housed in a ground processing station. In certain embodiments, the controller 110 can be provided by a UAV 102 in the UAV network 100. For instance, a given UAV 102, such as an unmanned helicopter or a balloon, in the UAV network 100 can carry payloads including a processor configured to implement the controller 110. In any case, the controller 110 can be configured to determine network requirements based on an application supported by UAV network 100, and/or to perform any other operations. In implementations, control signals can be transmitted via a control link from the controller 110 to the UAVs 102 shown in FIG. 1.
As mentioned above, an important criterion for a UAV 102 in the network is altitude. However, as the altitude of a UAV 102 increases, the signals emitted by the UAV 102 become weaker. A UAV 102 flying at an altitude of 65,000 feet can cover an area up to 100 kilometers on the ground, but the signal loss can be significantly higher than would occur for a terrestrial network. Radio signals typically require a large amount of power for long-distance transmission. On the other hand, the payload that can be carried by a UAV 102 that stays in the air for an extended period of time is limited. As mentioned above, solar energy can be used to power the UAV 102. However, this limits the weight of the payload that can be carried by a UAV 102 due to the limited rate at which solar irradiation can be absorbed and converted to electricity.
Free-space optical communication (FSO) is an optical communication technology that transmits light in free space to wirelessly transmit data for telecommunications. Commercially available FSO systems use wavelengths close to the visible spectrum, around 850 to 1550 nm. In a basic point-to-point FSO system, two FSO transceivers can be placed at both ends of a transmission path that has an unobstructed line of sight between the two FSO transceivers. A variety of light sources can be used for the transmission of data using FSO transceivers. For example, LEDs and lasers can be used to transmit data in an FSO system.
Lasers used in FSO systems provide extremely high bandwidths and capacity, on par with terrestrial fiber-optic networks, but they also consume much less power than microwave systems. An FSO unit can be included in the payload of a UAV 102 for communication. The FSO unit can include an optical transceiver with a laser transmitter and a receiver to provide full-duplex bi-directional capability. The FSO unit can use a high-power optical source, i.e., a laser, and a lens to transmit the laser beam through the atmosphere to another lens receiving the information embodied in the laser beam. The receiving lens can connect to a high-sensitivity receiver via optical fiber. The FSO unit included in a UAV 102 in accordance with the disclosure can enable optical transmission at speeds up to 10 Gbps.
Also shown in FIG. 1 are vehicles 106a-f. A given vehicle 106 can be equipped with communication hardware. The communication hardware in the given vehicle 106 can include an FSO unit described above, a radio transceiver, and/or any other type of communication hardware. The communication hardware included in the vehicle 106 can be used to establish a communication channel between the vehicles 106 via the UAVs 102. A controller 110 can include an FSO unit configured to establish a communication channel with another FSO unit through a laser beam. Through the communication channel, a UAV 102 can be configured to communicate its geo-location to the controller 110. Since the ground controller 110 is stationary, the geo-location of the ground controller 110 can be preconfigured into an onboard computer in the UAVs 102. Through the ground controller 110, information intended for a vehicle 106 can be forwarded to the vehicle 106. The ground controller 110 can be connected to a wired or wireless network. Information intended for the vehicle 106 can be communicated through the wired or wireless network from or to another entity connected to the wired or wireless network. The information intended for the vehicle 106 can be first communicated to the UAV 102 through a laser beam, and the UAV 102 can forward the information to the vehicle 106 through a laser beam 204a.
In implementations, for locating a vehicle 106, a tracking signal can be transmitted from a UAV 102 for tracking the vehicle 106. The tracking signal can take various forms. For example, the UAV 102 may scan the covered area 104 with a camera onboard the UAV 102 in a predetermined pattern. For example, the UAV 102 may scan the covered area 104 in a scan-line fashion from one corner of the covered area 104 to the opposite corner of the covered area 104. As another example, the UAV 102 may scan the covered area 104 in a concentric-sphere fashion, starting from an outer sphere within the covered area 104 and gradually moving into inner spheres within the covered area 104 until reaching the center of the covered area 104. Still as another example, the UAV 102 may scan the covered area along predefined lines of the area 104, for example a portion of a road that enters the area 104 and another portion of the road that exits the area 104. In certain embodiments, the UAV 102 may carry a radio transmitter configured to broadcast radio signals within the covered area 104. In those examples, the broadcast radio signals can serve as tracking signals such that, once they are intercepted by a vehicle 106 passing through the covered area 104, the UAV 102 can be configured to locate a position of the vehicle 106 within the covered area 104.
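The corner-to-corner scan-line pattern mentioned above can be sketched as a simple waypoint generator. This is an illustrative approximation over a rectangular area; the function name, the flat coordinate model and the swath spacing are assumptions, not the disclosed flight planner.

```python
def scan_line_waypoints(width_km, height_km, swath_km):
    """Generate back-and-forth (scan-line) waypoints covering a rectangular
    area 104, sweeping from one corner toward the opposite corner.

    Each sweep is one camera swath wide; alternate rows reverse direction
    so the UAV 102 traverses the area without doubling back."""
    waypoints = []
    y = 0.0
    left_to_right = True
    while y <= height_km:
        row = [(0.0, y), (width_km, y)]
        if not left_to_right:
            row.reverse()
        waypoints.extend(row)
        left_to_right = not left_to_right
        y += swath_km
    return waypoints
```

A 10 km by 10 km area with a 5 km swath yields three sweeps, starting at one corner and ending at the opposite corner.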
An identification of the vehicle 106 can be captured after the vehicle 106 has been tracked by UAV 102. In certain implementations, the identification of the vehicle 106 can be captured by a camera carried by the UAV 102. For example, the UAV 102 may be configured to capture a picture of a license plate of vehicle 106 once it has been tracked. As another example, the UAV 102 may be configured to transmit a request to vehicle 106 to inquire about its identification, and the vehicle 106 can send its identification to the UAV 102 in response to the request.
Any one of the UAVs 102 shown in FIG. 1 may be instructed to “monitor” or “zoom in onto” a corresponding vehicle 106. For example, the UAV 102a may receive location information regarding vehicle 106a and instructions to zoom in onto vehicle 106a. In that example, in response to receiving such location information and instructions, the UAV 102a may be configured to track vehicle 106a based on the received location information. This may involve moving the UAV 102a into a vicinity of vehicle 106a such that UAV 102a has a clear view of vehicle 106a. As will be discussed below, the instructions received by UAV 102a may include capturing one or more images of the interior of vehicle 106a. For achieving this, UAV 102a may be equipped with one or more cameras. In some embodiments, the camera(s) carried by UAV 102a may include a wide-view camera capable of capturing a wide field of view. In one embodiment, the wide-view camera carried by UAV 102a is an omnidirectional camera with a 360-degree field of view in a horizontal plane, or with a visual field that covers (approximately) the entire sphere.
In some embodiments, the cameras carried by UAV 102a may include multiple cameras fixed at corresponding locations on an underbody of UAV 102a. In one embodiment, the multiple cameras may be arranged on the underbody of UAV 102a to form a ring. In one configuration, 8 cameras are used to form such a ring. One or more of those cameras can be employed to capture the interior of vehicle 106a depending on a distance between UAV 102a and vehicle 106a, an angle between the two, and/or any other factors. For example, three cameras in the ring may be employed by UAV 102a to capture images of the interior of vehicle 106a from different angles. In some implementations, individual cameras carried by UAV 102a may have panoramic-view capability. For example, various types of panoramic-view cameras may be carried by UAV 102a, including short-rotation, full-rotation, fixed-lens, and any other types of panoramic-view cameras.
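Selecting which ring cameras face the vehicle can be illustrated as follows. This sketch assumes the 8 cameras are evenly spaced around the ring and that the bearing from UAV 102a to vehicle 106a is known; both the even spacing and the function names are assumptions made for the example.

```python
import math

RING_SIZE = 8  # cameras assumed evenly spaced around the underbody ring


def cameras_facing(bearing_deg, count=3, ring_size=RING_SIZE):
    """Pick the `count` ring cameras whose headings are closest to the
    bearing from the UAV to the vehicle, so the selected cameras can image
    the cabin from several nearby angles."""
    headings = [i * 360.0 / ring_size for i in range(ring_size)]

    def angular_gap(h):
        # Smallest rotation between a camera heading and the target bearing.
        gap = abs(h - bearing_deg) % 360.0
        return min(gap, 360.0 - gap)

    return sorted(range(ring_size), key=lambda i: angular_gap(headings[i]))[:count]
```

With the vehicle directly at bearing 0 degrees, the selected trio is the forward camera plus its two neighbors, matching the three-camera example above.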
With UAV network 100 having been generally described, attention is now directed to FIG. 2, which conceptually illustrates facilitating targeted delivery of information to a transportation apparatus using UAVs in accordance with the disclosure. It will be described with reference to FIG. 1. As shown, individual UAVs 102 in the UAV network 100 can be instructed to capture one or more images of the interior of a vehicle 106 as described above. In FIG. 2, it is shown that UAV 102a, on a request, can be positioned such that it captures the one or more images of the interior of vehicle 106a. In implementations, as described above, the UAV 102a may be configured to detect the vehicle 106a when vehicle 106a enters the area 104a covered by UAV 102a. In response to detecting that vehicle 106a has entered area 104a, UAV 102a can be configured to position itself such that UAV 102a has a clear line of sight with respect to vehicle 106a. In some implementations, the position of UAV 102a with respect to vehicle 106a can be adjusted based on the images of vehicle 106a as captured by UAV 102a. For instance, UAV 102a, controller 110a and/or processing center 202 can be configured to determine a quality of the images captured by UAV 102a. In that instance, when the image quality is determined not to show a good view of the interior of vehicle 106a, the UAV 102a can be instructed to reposition itself until acceptable images of the interior of vehicle 106a are received. This may involve instructing the UAV 102a to adjust its angle, distance, speed, and/or any other aspects with respect to vehicle 106a. In one implementation, such an instruction may be generated by the processing center 202 and transmitted to UAV 102a through the UAV network 100 via the controller 110a.
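The capture-score-reposition loop described above can be sketched generically. The `capture`, `score` and `reposition` hooks, the quality threshold and the attempt limit are all hypothetical stand-ins for the UAV/controller protocol, not the disclosed interface.

```python
def acquire_interior_images(uav, quality_threshold=0.7, max_attempts=5):
    """Iteratively capture and score images, instructing the UAV to
    reposition (adjust angle, distance, or speed) until an image of
    acceptable quality is obtained or the attempts run out.

    `uav` is any object exposing capture() -> image,
    score(image) -> float in [0, 1], and reposition() -- assumed hooks."""
    for _ in range(max_attempts):
        image = uav.capture()
        if uav.score(image) >= quality_threshold:
            return image
        uav.reposition()  # e.g., change angle or close distance to the vehicle
    return None  # no acceptable view of the interior was obtained
```

In the system described here, the scoring could run at UAV 102a, controller 110a, or processing center 202, with only the reposition instruction traveling back over the UAV network.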
UAV 102a can be configured to transmit the captured images of vehicle 106a to processing center 202 through UAV network 100. As shown in this example, in some embodiments, the images of the vehicle 106a may be first transmitted to controller 110a on the ground. The image transmission from UAV 102a to the controller 110a may vary. For example, the image data may be first transmitted from UAV 102a to another UAV in the UAV network 100. For instance, that UAV may have more computing power or capability than UAV 102a, which may be a lightweight UAV configured to follow moving vehicles and to capture images of the interiors of moving vehicles. In that example, the UAV with more computing power can be used as a relay station to relay image data from UAV 102a to controller 110a. In some embodiments, the image data may be transmitted to more than one UAV in the network 100 before it reaches the controller 110a.
The controller 110a may be configured to 1) communicate control instructions with processing center 202 and with the UAV 102a; 2) receive image data from UAV 102a; 3) transmit the image data from UAV 102a to the processing center 202; and/or perform any other operations. However, it should be understood that in some other embodiments, transmitting image data through controller 110a may not be necessary. In those embodiments, the image data can be transmitted from UAV 102a to the processing center 202 via the UAV network 100 without going through controller 110a.
The processing center 202 can be configured to process the vehicle images received from UAV 102a to obtain vehicle information related to the vehicle 106a. For example, in response to receiving the vehicle images captured by the UAV 102a, the processing center 202 can be configured to obtain information regarding the vehicle 106a as captured in the images. For example, the images can contain license plate information indicating a license number of vehicle 106a. Based on the license number of vehicle 106a, the processing center 202 can obtain certain information regarding the vehicle 106a, such as the make of vehicle 106a, one or more presentation capabilities of vehicle 106a (e.g., audio, video, or multimedia presentation capabilities: does the vehicle 106a have a display device, how many display devices does vehicle 106a have, what type of display devices does vehicle 106a have, and/or any other capability information), one or more communication channels with the vehicle 106a (e.g., an internet address of the one or more display devices equipped within vehicle 106a, or a telephone number of vehicle 106a), and/or any other information related to vehicle 106a.
The processing center 202 can be configured to analyze the images and to obtain passenger information and/or driver information related to one or more passengers and/or drivers in vehicle 106a. For example, in response to receiving the images, processing center 202 can be configured to analyze the images by employing image analysis algorithms. The image analysis performed by the processing center 202, in that example, can include analyzing the images to identify the one or more passengers and/or drivers. For example, facial-feature analysis may be employed by processing center 202 to extract one or more facial features for each passenger and/or driver in vehicle 106a. The extracted features can be used to match one or more passengers and/or drivers registered for vehicle 106a. Once a match is found, the identity of the passenger and/or driver can be determined, and other information such as gender, age, user interest, and user experience can be obtained for the identified driver and/or passenger.
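The feature-matching step can be illustrated with a minimal sketch. Real facial-recognition pipelines use learned embeddings; here a plain cosine-similarity comparison against profiles registered for the vehicle stands in for that machinery, and the similarity threshold and data layout are assumptions.

```python
import math


def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


def match_occupant(features, registered, threshold=0.9):
    """Match an extracted facial-feature vector against the profiles
    registered for the vehicle; return the best-matching name above the
    threshold, or None when no registered occupant matches."""
    best, best_sim = None, threshold
    for name, entry in registered.items():
        sim = cosine_similarity(features, entry["features"])
        if sim >= best_sim:
            best, best_sim = name, sim
    return best
```

When no registered profile clears the threshold, the occupant's identity is unresolved, and the fallback of paragraph below (estimating gender or age group from the same features) would apply.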
As another example, the facial features extracted for each passenger can be used to determine a gender of the passenger, an age group of the passenger, and/or any other characteristic information regarding the one or more passengers. For instance, in certain situations, the exact identity of a particular passenger in vehicle 106a may not be readily determinable based on the images received. In that situation, certain characteristic information can still be determined using the facial features, such as that the passenger is a male in his teens. In some implementations, the processing center 202 can be configured to determine positions of the passengers within vehicle 106a. For example, a position of each passenger with respect to the front rows or back rows in vehicle 106a can be determined by analyzing the images. In some embodiments, such image analysis may include obtaining information regarding vehicle 106a, such as the number of rows of seats vehicle 106a has, a size of the interior of vehicle 106a, and/or any other information regarding a specification of vehicle 106a. In that example, a particular passenger’s position, for example that passenger A is sitting in the left rear seat, can be determined.
The processing center 202 can be configured to determine one or more items for presentation to the one or more passengers and/or drivers based on the information related to the vehicle 106a, the passenger information and/or the driver information. For example, the passenger information may indicate that vehicle 106a has a particular passenger who is a male in his teens. The information related to vehicle 106a may indicate that vehicle 106a has entered area 104a and has travelled within area 104a for a certain time period. In that example, the processing center 202 can be configured to determine to push one or more local marketing items, such as available fast-food restaurants or local youth events (e.g., a county fair or an amusement park), to be presented to that passenger. Once such items are determined, the processing center 202 can be configured to transmit the items to vehicle 106a for presentation on a display device appropriate for that passenger. For example, the image analysis mentioned above may indicate that the passenger is sitting in the rear left seat, and the information related to vehicle 106a may indicate the rear left seat has a display device with a specific internet address. In that example, the processing center 202 can be configured to transmit the items to the display device through the specific internet address. In some implementations, the transmission of the items by the processing center 202 can be through the UAV network 100.
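The targeting decision above can be sketched as a simple rule table routing item categories to the occupant's display. The field names (`role`, `age_group`, `seat`, `displays`) and the category labels are hypothetical; a production system would draw on richer profiles and inventories.

```python
def select_items(occupant, vehicle_info):
    """Pick item categories for one occupant and route them to the display
    serving that occupant's seat.

    `occupant` carries seat, age group and role; `vehicle_info["displays"]`
    maps seat names to display addresses -- all assumed field names."""
    if occupant["role"] == "driver":
        # Front-row example from the summary: attractions, weather, driving.
        categories = ["local_attractions", "weather", "driving_conditions"]
    elif occupant["age_group"] == "teens":
        # Teen passenger example: fast food and local youth events.
        categories = ["fast_food", "youth_events"]
    else:
        categories = ["local_news", "local_marketing"]
    display = vehicle_info["displays"].get(occupant["seat"])
    return {"display": display, "categories": categories}
```

The returned record pairs the selected categories with the display address (e.g., an internet address of the rear-left screen) through which the processing center would transmit the items.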
Attention is now directed to FIG. 3, where an example of processing center 202 is shown. As shown, the processing center 202 may include one or more of a processor 302 configured to execute program components. The program components may include a transportation apparatus image component 304, a transportation apparatus information component 306, an image analysis component 308, a targeted information component 310, a transmission component 312 and/or any other components. The transportation apparatus image component 304 can be configured to receive one or more images of a transportation apparatus, such as vehicle 106a. The images received by the transportation apparatus image component 304 can include images of an interior of the vehicle 106a captured from different angles by a UAV, such as UAV 102a. The images received by transportation apparatus image component 304 can include information readily indicating an identity of the vehicle. For example, one or more of the images may indicate a license plate number of vehicle 106a. However, this is not necessarily always the case. In certain situations, the images received by transportation apparatus image component 304 may not contain such information. To address such a situation, the transportation apparatus image component 304 may be configured to generate a control instruction to instruct the UAV, e.g., UAV 102a, to recapture the images, and to transmit the control instruction to the UAV via the UAV network 100.
The transportation apparatus information component 306 can be configured to obtain information related to the transportation apparatus based on the images received by transportation apparatus image component 304. As mentioned above, the images received by transportation apparatus image component 304 may contain information indicating a license plate number of vehicle 106a. In some embodiments, the transportation apparatus information component 306 can be configured to obtain information regarding vehicle 106a based on such license information. For example, the transportation apparatus information component 306 can be configured to make an inquiry for vehicle 106a to a vehicle registration database using the license plate number of vehicle 106a. The information related to vehicle 106a as obtained by transportation apparatus information component 306 may include a make of vehicle 106a (e.g., Toyota Corolla 2014, Honda Accord 2016, etc.), one or more presentation capabilities of vehicle 106a (e.g., audio, video, and multimedia presentation capabilities: does vehicle 106a have a display device, how many display devices does vehicle 106a have, what type of display devices does vehicle 106a have, where is each display located within vehicle 106a if vehicle 106a has more than one display, and/or any other capability information), one or more communication channels supported by vehicle 106a, one or more multimedia formats supported by vehicle 106a, and/or any other information related to vehicle 106a. For example, the information related to vehicle 106a may include information indicating that vehicle 106a has three display devices capable of presenting audio, video and animation, with the first display device being located on a dashboard of vehicle 106a, the second display device being located on the back of the left front seat and the third display device being located on the back of the right front seat.
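The registration inquiry described above can be pictured as a simple keyed lookup. The following is a minimal, hypothetical sketch: the plate numbers, display addresses, and data layout are all illustrative assumptions, not part of the disclosed system.

```python
# Hypothetical sketch of the registry inquiry made by transportation
# apparatus information component 306: a lookup keyed by license plate
# number that returns the vehicle's make and presentation capabilities.
# All plate numbers, addresses, and fields below are illustrative only.
VEHICLE_REGISTRY = {
    "ABC-1234": {
        "make": "Toyota Corolla 2014",
        "displays": [
            {"location": "dashboard", "address": "10.0.0.11", "formats": ["audio", "video"]},
            {"location": "rear-left", "address": "10.0.0.12", "formats": ["audio", "video", "animation"]},
            {"location": "rear-right", "address": "10.0.0.13", "formats": ["audio", "video", "animation"]},
        ],
    },
}

def lookup_vehicle(plate):
    """Return registry information for a plate, or None if unregistered."""
    return VEHICLE_REGISTRY.get(plate)

info = lookup_vehicle("ABC-1234")
```

In practice the inquiry would go to an external vehicle registration database rather than an in-memory dictionary; the sketch only shows the shape of the returned capability information.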
In some implementations, the information related to vehicle 106a as obtained by transportation apparatus information component 306 can include information indicating various statistics about vehicle 106a. For example, the information may indicate an area in which vehicle 106a is traveling, such as area 104a, how long vehicle 106a has been traveling within the area (e.g., 5 minutes), on which road vehicle 106a is travelling, a speed of vehicle 106a, towards which area vehicle 106a is traveling, e.g., area 104b, a size of vehicle 106a, and/or any other statistical information about vehicle 106a.
The image analysis component 308 can be configured to analyze the images received by transportation apparatus image component 304 and to obtain passenger information and/or driver information related to one or more passengers and/or drivers in vehicle 106a. For example, in response to the images being received by transportation apparatus image component 304, image analysis component 308 can be configured to analyze the images by employing image analysis algorithms. The image analysis performed by image analysis component 308 can include analyzing the images to identify the one or more passengers and/or drivers in vehicle 106a. For example, facial feature analysis may be employed to extract one or more facial features for each passenger and/or driver in vehicle 106a. The extracted features can be used to match one or more passengers and/or drivers registered for vehicle 106a. Once a match is found, the identity of the passenger and/or driver can be determined by image analysis component 308, and other information such as gender, age, user interest and user experience can be obtained for the identified driver and/or passenger.
As another example, the facial features extracted by image analysis component 308 for each passenger can be used by image analysis component 308 to determine a gender of the passenger, an age group of the passenger, and/or any other characteristic information regarding the one or more passengers. For instance, in certain situations, the exact identity of a particular passenger in vehicle 106a may not be readily determinable by image analysis component 308. In that situation, certain characteristic information can still be determined by image analysis component 308 using the facial features, such as that the passenger is a male in the teen age group. In some implementations, the image analysis component 308 can be configured to determine positions of the passengers within vehicle 106a. For example, a position of each passenger with respect to front rows or back rows in vehicle 106a can be determined by image analysis component 308 by analyzing and correlating contents in the images. In some embodiments, such image analysis may include obtaining information regarding vehicle 106a, such as the number of rows of seats vehicle 106a has, a size of the interior of vehicle 106a, and/or any other information regarding a specification of vehicle 106a as obtained by transportation apparatus information component 306. In that example, a particular passenger's position, for example that passenger A is sitting in the left rear seat, may be determined.
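The two-stage behavior described above, first try to match extracted features against registered occupants, then fall back to coarse attribute estimation when no match is found, can be sketched as follows. The feature vectors, the distance threshold, the registry, and the fixed fallback output are all hypothetical stand-ins; a real implementation would use an actual face-recognition model.

```python
# Illustrative sketch of the passenger-analysis step performed by image
# analysis component 308. Facial features (represented here by a toy
# numeric vector) are matched against registered occupants; when no match
# is found, coarse attributes are still estimated. All values, thresholds,
# and the placeholder estimator below are hypothetical.
REGISTERED = {
    "driver_jane": [0.12, 0.80, 0.33],  # hypothetical reference features
}

def match_occupant(features, threshold=0.05):
    """Return the registered occupant whose features are within threshold, if any."""
    for name, ref in REGISTERED.items():
        dist = sum(abs(a - b) for a, b in zip(features, ref))
        if dist < threshold:
            return name
    return None

def estimate_attributes(features):
    """Fallback: coarse gender/age-group estimate when identity is unknown.
    A fixed placeholder stands in for a real classifier."""
    return {"gender": "male", "age_group": "teens"}

features = [0.50, 0.10, 0.90]  # hypothetical features of an unregistered passenger
identity = match_occupant(features)
profile = estimate_attributes(features) if identity is None else {"identity": identity}
```

The point of the sketch is only the control flow: identification when possible, characteristic estimation otherwise.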
The targeted information component 310 can be configured to determine one or more items for presentation to the one or more passengers and/or drivers based on the information related to the vehicle 106a, and/or the passenger information and/or driver information. For example, the passenger information may indicate that vehicle 106a has a particular passenger who is a male in his 20s. The information related to vehicle 106a may indicate that vehicle 106a has entered area 104a and has travelled within area 104a for a certain time period. In that example, the targeted information component 310 can be configured to push one or more local marketing items, such as available fast food restaurants or local youth events (e.g., a county fair or amusement park), to be presented to that passenger based on a general interest manifested by males in that age group. In that example, the targeted information component 310 can be configured to obtain general interests for various age groups and determine the one or more items for presentation to the passenger(s) and/or driver in vehicle 106a based on the obtained general interests for the age groups.
In some implementations, the targeted information component 310 can be configured to retrieve local information in response to the location of vehicle 106a being obtained. For example, the information related to vehicle 106a as obtained by transportation apparatus information component 306 can include information indicating vehicle 106a is in area 104a. In response to such information being received, the targeted information component 310 can obtain local information related to area 104a, such as local attraction information, local commercial information, local events information, local traffic condition information, and/or any other local information. Once the local information is obtained by targeted information component 310, the targeted information component 310 can be configured to select a subset of the local information for presentation to the passenger(s) and/or driver(s) in the vehicle 106a.
As an illustration, the driver information determined by image analysis component 308 may identify a specific driver registered with vehicle 106a. Based on that information, the specific driver's interest in certain activities can be obtained by targeted information component 310, for example from a database storing such information. Once the user interest information is obtained, targeted information component 310 can be configured to determine one or more items to be pushed to the driver of vehicle 106a for presentation. For example, the items may include one or more local events of interest to the driver (for example, if the driver has a known interest in music, a local music festival taking place in area 104a can be pushed to the driver). As another example, the items may include one or more local attractions of interest to the driver (for example, if the driver has a known interest in wine, one or more local wineries may be pushed to the driver). Still as another illustration, certain local Italian restaurants may be pushed to the driver if the driver has a known interest in Italian food. Other items that may be pushed to the driver and/or passenger(s) by targeted information component 310 may include local news stories, weather information, relevant traffic condition information, local commercial information (e.g., locations and hours of operation of retail stores and/or malls), local ongoing events and/or any other items.
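The interest-based selection just described amounts to filtering a pool of local items by area and by the occupant's known interests. A minimal sketch, with entirely illustrative item data and interest categories:

```python
# Sketch of the selection performed by targeted information component 310:
# local items for the vehicle's current area are filtered by the driver's
# or passenger's known interests. The item list and categories below are
# hypothetical examples, not data from the disclosed system.
LOCAL_ITEMS = [
    {"name": "music festival", "category": "music", "area": "104a"},
    {"name": "winery tour", "category": "wine", "area": "104a"},
    {"name": "Italian restaurant", "category": "italian food", "area": "104a"},
    {"name": "county fair", "category": "youth events", "area": "104b"},
]

def select_items(interests, area, items=LOCAL_ITEMS):
    """Return local items in the given area matching any stated interest."""
    return [item for item in items
            if item["area"] == area and item["category"] in interests]

picked = select_items({"music", "wine"}, "104a")
```

A deployed system would draw both the interest profile and the item pool from databases, as the paragraph above notes; the filter itself is the core of the step.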
The transmission component 312 can be configured to transmit the items to vehicle 106a for presentation on a display device appropriate for a passenger. In some implementations, the transmission component 312 can be configured to determine a format in which the items are to be presented on the display device. For example, the passenger information as determined by image analysis component 308 may indicate that the passenger is sitting in the rear left seat, and the information related to vehicle 106a may indicate that the rear left seat has a display device with a specific internet address that is capable of presenting video. In that example, the transmission component 312 can be configured to transmit the items to that display device through the specific internet address and to generate a control instruction instructing that display device to present the items as video. In some implementations, the transmission of the items by transmission component 312 can be through the UAV network 100.
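The delivery planning done by transmission component 312, pick the display at the passenger's seat and a format that display supports, can be sketched as below. The seat names, addresses, and format lists are assumptions for illustration; actual transmission over the UAV network is omitted.

```python
# Sketch of the delivery planning performed by transmission component 312:
# choose the display device at the target passenger's seat and a
# presentation format that device supports. Addresses and capabilities
# below are hypothetical placeholders.
DISPLAYS = {
    "rear-left": {"address": "10.0.0.12", "formats": ["audio", "video"]},
    "dashboard": {"address": "10.0.0.11", "formats": ["audio"]},
}

def plan_delivery(seat, preferred="video"):
    """Pick the display at the passenger's seat and a supported format,
    falling back to the display's first supported format if needed."""
    display = DISPLAYS[seat]
    fmt = preferred if preferred in display["formats"] else display["formats"][0]
    return display["address"], fmt

address, fmt = plan_delivery("rear-left")
```

The returned address and format would then parameterize the actual transmission and the control instruction sent to the display device.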
Attention is now directed to FIG. 4, where an exemplary method 400 for facilitating targeted delivery of information to a transportation apparatus is illustrated. The particular series of processing steps depicted in FIG. 4 is not intended to be limiting. It is appreciated that the processing steps may be performed in an order different from that depicted in FIG. 4 and that not all the steps depicted in FIG. 4 need be performed. In certain implementations, the method 400 may be implemented by a video processing center, such as the video processing center shown in FIG. 5.
In some embodiments, the method depicted in method 400 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit  designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information) . The one or more processing devices may include one or more devices executing some or all of the operations of method 400 in response to instructions stored electronically on an electronic storage medium. The one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 400.
At 402, one or more images of an interior of a transportation apparatus can be received. The images received at 402 can include images of the interior of the transportation apparatus captured from different angles by a UAV, such as UAV 102a. The images received at 402 can include information readily indicating an identity of the transportation apparatus. In some implementations, operations involved in 402 can be implemented by a transportation apparatus image component the same as or substantially similar to transportation apparatus image component 304 illustrated and described herein.
At 404, information related to the transportation apparatus can be obtained based on the images received at 402. As mentioned above, the images received at 402 may contain information indicating a license plate number of the transportation apparatus. In some embodiments, based on such information, the information regarding the transportation apparatus can be obtained. The information obtained at 404 may include a make of the transportation apparatus (e.g., Toyota Corolla 2014, Honda Accord 2016, etc.), one or more presentation capabilities of the transportation apparatus (e.g., audio, video, and multimedia presentation capabilities: does the transportation apparatus have a display device, how many display devices does the transportation apparatus have, what type of display devices does the transportation apparatus have, where is each display located within the transportation apparatus if the transportation apparatus has more than one display, and/or any other capability information), one or more communication channels supported by the transportation apparatus, one or more multimedia formats supported by the transportation apparatus, and/or any other information related to the transportation apparatus. In some implementations, the information related to the transportation apparatus as obtained at 404 can include information indicating various statistics about the transportation apparatus. For example, the information may indicate an area in which the transportation apparatus is traveling, such as area 104a, how long the transportation apparatus has been traveling within the area (e.g., 5 minutes), on which road the transportation apparatus is travelling, a speed of the transportation apparatus, towards which area the transportation apparatus is traveling, e.g., area 104b, a size of the transportation apparatus, and/or any other statistical information about the transportation apparatus.
In some implementations, operations involved in 404 can be implemented by a transportation apparatus information component the same as or substantially similar to the transportation apparatus information component 306 illustrated and described herein.
At 406, the images received at 402 can be analyzed to obtain passenger information regarding one or more passengers and/or driver information regarding one or more drivers in the transportation apparatus. The image analysis performed at 406 can include analyzing the images to identify the one or more passengers and/or drivers in the transportation apparatus. For example, facial feature analysis may be employed at 406 to extract one or more facial features for each passenger and/or driver in the transportation apparatus. The extracted features can be used to match one or more passengers and/or drivers registered for the transportation apparatus. Once a match is found, the identity of the passenger and/or driver can be determined, and other information such as gender, age, user interest and user experience can be obtained for the identified driver and/or passenger. As another example, the facial features extracted at 406 for each passenger can be used to determine a gender of the passenger, an age group of the passenger, and/or any other characteristic information regarding the one or more passengers. In some implementations, operations involved in 406 can be implemented by an image analysis component the same as or substantially similar to the image analysis component 308 illustrated and described herein.
At 408, one or more items can be determined for presentation to the passenger(s) and/or driver in the transportation apparatus based on the information related to the transportation apparatus as obtained at 404, and the passenger and/or driver information obtained at 406. In some implementations, operations involved in 408 can be implemented by a targeted information component the same as or substantially similar to targeted information component 310 illustrated and described herein.
At 410, the one or more items determined at 408 can be transmitted to the transportation apparatus for presentation to the passenger(s) and/or driver in the transportation apparatus. In some implementations, operations involved in 410 can be implemented by a transmission component the same as or substantially similar to transmission component 312 illustrated and described herein.
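Steps 402 through 410 of method 400 can be sketched end to end as a simple pipeline, with each step reduced to a stub. Every function body below is an illustrative stand-in for the corresponding component described above, not an implementation of the disclosed system.

```python
# End-to-end sketch of method 400 (steps 402-410). Each stub stands in
# for a component described above; the returned data is hypothetical.
def receive_images():                    # step 402: images via the UAV network
    return ["frame-1", "frame-2"]

def obtain_vehicle_info(images):         # step 404: registry/statistics lookup
    return {"make": "Toyota Corolla 2014", "area": "104a"}

def analyze_passengers(images):          # step 406: identity/attribute analysis
    return [{"seat": "rear-left", "age_group": "teens"}]

def determine_items(info, passengers):   # step 408: targeted item selection
    return ["local fast food promotion"]

def transmit(items, info):               # step 410: delivery to the vehicle
    return {"delivered": items, "to": info["make"]}

images = receive_images()
info = obtain_vehicle_info(images)
passengers = analyze_passengers(images)
items = determine_items(info, passengers)
result = transmit(items, info)
```

The sketch only fixes the data flow between the five steps; as noted above, the steps need not execute in exactly this order in every implementation.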
FIG. 5 illustrates a simplified computer system that can be used to implement various embodiments described and illustrated herein. A computer system 500 as illustrated in FIG. 5 may be incorporated into devices such as a portable electronic device, mobile phone, or other device as described herein. FIG. 5 provides a schematic illustration of one embodiment of a computer system 500 that can perform some or all of the steps of the methods provided by various embodiments. It should be noted that FIG. 5 is meant only to provide a generalized illustration of various components, any or all of which may be utilized as appropriate. FIG. 5, therefore, broadly illustrates how individual system elements may be implemented in a relatively separated or relatively more integrated manner.
The computer system 500 is shown comprising hardware elements that can be electrically coupled via a bus 505, or may otherwise be in communication, as appropriate. The hardware elements may include one or more processors 510, including without limitation one or more general-purpose processors and/or one or more special-purpose processors such as digital signal processing chips, graphics acceleration processors, and/or the like; one or more input devices 515, which can include without limitation a mouse, a keyboard, a camera, and/or the like; and one or more output devices 520, which can include without limitation a display device, a printer, and/or the like.
The computer system 500 may further include and/or be in communication with one or more non-transitory storage devices 525, which can comprise, without limitation, local and/or network accessible storage, and/or can include, without limitation, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory ( “RAM” ) , and/or a read-only memory ( “ROM” ) , which can be programmable, flash-updateable, and/or the like. Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like.
The computer system 500 might also include a communications subsystem 530, which can include without limitation a modem, a network card (wireless or wired), an infrared communication device, a wireless communication device, and/or a chipset such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, cellular communication facilities, etc., and/or the like. The communications subsystem 530 may include one or more input and/or output communication interfaces to permit data to be exchanged with a network such as the network described below to name one example, other computer systems, television, and/or any other devices described herein. Depending on the desired functionality and/or other implementation concerns, a portable electronic device or similar device may communicate image and/or other information via the communications subsystem 530. In other embodiments, a portable electronic device, e.g. the first electronic device, may be incorporated into the computer system 500, e.g., an electronic device as an input device 515. In some embodiments, the computer system 500 will further comprise a working memory 535, which can include a RAM or ROM device, as described above.
The computer system 500 also can include software elements, shown as being currently located within the working memory 535, including an operating system 540, device drivers, executable libraries, and/or other code, such as one or more application programs 545, which may comprise computer programs provided by various embodiments, and/or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein. Merely by way of example, one or more procedures described with respect to the methods discussed above, such as those described in relation to FIG. 4, might be implemented as code and/or instructions executable by a computer and/or a processor within a computer; in an aspect, then, such code and/or instructions can be used to configure and/or adapt a general purpose computer or other device to perform one or more operations in accordance with the described methods.
A set of these instructions and/or code may be stored on a non-transitory computer-readable storage medium, such as the storage device (s) 525 described above. In some cases, the storage medium might be incorporated within a computer system, such as computer system 500. In other embodiments, the storage medium might be separate from a computer system e.g., a removable medium, such as a compact disc, and/or provided in an installation package, such that the storage medium can be used to program, configure, and/or adapt a general purpose computer with the instructions/code stored thereon. These instructions might take the form of executable  code, which is executable by the computer system 500 and/or might take the form of source and/or installable code, which, upon compilation and/or installation on the computer system 500 e.g., using any of a variety of generally available compilers, installation programs, compression/decompression utilities, etc., then takes the form of executable code.
It will be apparent to those skilled in the art that substantial variations may be made in accordance with specific requirements. For example, customized hardware might also be used, and/or particular elements might be implemented in hardware, software including portable software, such as applets, etc., or both. Further, connection to other computing devices such as network input/output devices may be employed.
As mentioned above, in one aspect, some embodiments may employ a computer system such as the computer system 500 to perform methods in accordance with various embodiments of the technology. According to a set of embodiments, some or all of the procedures of such methods are performed by the computer system 500 in response to processor 510 executing one or more sequences of one or more instructions, which might be incorporated into the operating system 540 and/or other code, such as an application program 545, contained in the working memory 535. Such instructions may be read into the working memory 535 from another computer-readable medium, such as one or more of the storage device (s) 525. Merely by way of example, execution of the sequences of instructions contained in the working memory 535 might cause the processor (s) 510 to perform one or more procedures of the methods described herein. Additionally or alternatively, portions of the methods described herein may be executed through specialized hardware.
The terms “machine-readable medium” and “computer-readable medium, ” as used herein, refer to any medium that participates in providing data that causes a machine to operate in a specific fashion. In an embodiment implemented using the computer system 500, various computer-readable media might be involved in providing instructions/code to processor (s) 510 for execution and/or might be used to store and/or carry such instructions/code. In many implementations, a computer-readable medium is a physical and/or tangible storage medium. Such a medium may take the form of non-volatile media or volatile media. Non-volatile media include, for example, optical and/or magnetic disks, such as the storage device (s) 525. Volatile media include, without limitation, dynamic memory, such as the working memory 535.
Common forms of physical and/or tangible computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read instructions and/or code.
Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to the processor (s) 510 for execution. Merely by way of example, the instructions may initially be carried on a magnetic disk and/or optical disc of a remote computer. A remote computer might load the instructions into its dynamic memory and send the instructions as signals over a transmission medium to be received and/or executed by the computer system 500.
The communications subsystem 530 and/or components thereof generally will receive signals, and the bus 505 then might carry the signals and/or the data, instructions, etc. carried by the signals to the working memory 535, from which the processor(s) 510 retrieves and executes the instructions. The instructions received by the working memory 535 may optionally be stored on a non-transitory storage device 525 either before or after execution by the processor(s) 510.
The methods, systems, and devices discussed above are examples. Various configurations may omit, substitute, or add various procedures or components as appropriate. For instance, in alternative configurations, the methods may be performed in an order different from that described, and/or various stages may be added, omitted, and/or combined. Also, features described with respect to certain configurations may be combined in various other configurations. Different aspects and elements of the configurations may be combined in a similar manner. Also, technology evolves and, thus, many of the elements are examples and do not limit the scope of the disclosure or claims.
Specific details are given in the description to provide a thorough understanding of exemplary configurations including implementations. However, configurations may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the configurations. This description provides example configurations only, and does  not limit the scope, applicability, or configurations of the claims. Rather, the preceding description of the configurations will provide those skilled in the art with an enabling description for implementing described techniques. Various changes may be made in the function and arrangement of elements without departing from the spirit or scope of the disclosure.
Also, configurations may be described as a process which is depicted as a schematic flowchart or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure. Furthermore, examples of the methods may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a non-transitory computer-readable medium such as a storage medium. Processors may perform the described tasks.
Having described several example configurations, various modifications, alternative constructions, and equivalents may be used without departing from the spirit of the disclosure. For example, the above elements may be components of a larger system, wherein other rules may take precedence over or otherwise modify the application of the technology. Also, a number of steps may be undertaken before, during, or after the above elements are considered. Accordingly, the above description does not bind the scope of the claims.
As used herein and in the appended claims, the singular forms “a” , “an” , and “the” include plural references unless the context clearly dictates otherwise. Thus, for example, reference to “a user” includes a plurality of such users, and reference to “the processor” includes reference to one or more processors and equivalents thereof known to those skilled in the art, and so forth.
Also, the words “comprise” , “comprising” , “contains” , “containing” , “include” , “including” , and “includes” , when used in this specification and in the following claims, are intended to specify the presence of stated features, integers, components, or steps, but they do not preclude the presence or addition of one or more other features, integers, components, steps, acts, or groups.

Claims (18)

  1. A method for facilitating targeted delivery of information to a transportation apparatus via an unmanned aerial vehicle (UAV) network, the method being implemented in one or more of a processor configured to execute programmed components, the method comprising:
    receiving, via the UAV network, one or more images of the transportation apparatus captured by a UAV;
    obtaining information related to the transportation apparatus in response to receiving the images of the transportation apparatus;
    analyzing the one or more images to obtain passenger information related to one or more passengers in the transportation apparatus;
    determining one or more items for presentation to the one or more passengers based on the information related to transportation apparatus and the passenger information; and
    transmitting the determined one or more items to the transportation apparatus for presentation.
  2. The method of claim 1, wherein the transportation apparatus includes a vehicle.
  3. The method of claim 1, wherein the passenger information includes a quantity of passengers in the transportation apparatus, gender of each passenger in the transportation apparatus, age group of each passenger in the transportation apparatus and identity of each passenger in the transportation apparatus.
  4. The method of claim 1, further comprising processing the passenger information and one or more images to obtain position information regarding positions of the one or more passengers within the transportation apparatus.
  5. The method of claim 1, wherein the information related to the transportation apparatus includes information indicating a make of the transportation apparatus, a geographical area the transportation apparatus is traveling in, a transmission channel employed  by the transportation apparatus for communication, and/or multimedia presentation capability of the transportation apparatus.
  6. The method of claim 1, wherein the one or more items determined to be presented to the one or more passengers include a marketing item related to a local business in a geographical area the transportation apparatus is traveling in.
  7. The method of claim 1, wherein the one or more items are presented to the one or more passengers through audio and/or video within the transportation apparatus.
  8. The method of claim 1, wherein determining one or more items for presentation to the one or more passengers based on the information related to transportation apparatus and the passenger information comprises:
    determining a specific display device within the transportation apparatus for presentation of the one or more items based on the information related to transportation apparatus and the passenger information.
  9. The method of claim 1, wherein determining one or more items for presentation to the one or more passengers based on the information related to transportation apparatus and the passenger information comprises:
    determining a manner in which the one or more items are to be presented within the transportation apparatus based on the information related to transportation apparatus and the passenger information.
  10. A system for facilitating targeted delivery of information to a transportation apparatus via an unmanned aerial vehicle (UAV) network, the system comprising a processor configured to execute machine-readable instructions such that, when the machine-readable instructions are executed, the processor is caused to perform:
    receiving, via the UAV network, one or more images of the transportation apparatus captured by a UAV;
    obtaining information related to the transportation apparatus in response to receiving the one or more images of the transportation apparatus;
    analyzing the one or more images to obtain passenger information related to one or more passengers in the transportation apparatus;
    determining one or more items for presentation to the one or more passengers based on the information related to the transportation apparatus and the passenger information; and
    transmitting the determined one or more items to the transportation apparatus for presentation.
  11. The system of claim 10, wherein the transportation apparatus includes a vehicle.
  12. The system of claim 10, wherein the passenger information includes a quantity of passengers in the transportation apparatus, a gender of each passenger in the transportation apparatus, an age group of each passenger in the transportation apparatus, and an identity of each passenger in the transportation apparatus.
  13. The system of claim 10, wherein the processor is further caused to perform processing the passenger information and the one or more images to obtain position information regarding positions of the one or more passengers within the transportation apparatus.
  14. The system of claim 10, wherein the information related to the transportation apparatus includes information indicating a make of the transportation apparatus, a geographical area the transportation apparatus is traveling in, a transmission channel employed by the transportation apparatus for communication, and/or multimedia presentation capability of the transportation apparatus.
  15. The system of claim 10, wherein the one or more items determined to be presented to the one or more passengers include a marketing item related to a local business in a geographical area the transportation apparatus is traveling in.
  16. The system of claim 10, wherein the one or more items are presented to the one or more passengers through audio and/or video within the transportation apparatus.
  17. The system of claim 10, wherein determining one or more items for presentation to the one or more passengers based on the information related to the transportation apparatus and the passenger information comprises:
    determining a specific display device within the transportation apparatus for presentation of the one or more items based on the information related to the transportation apparatus and the passenger information.
  18. The system of claim 10, wherein determining one or more items for presentation to the one or more passengers based on the information related to the transportation apparatus and the passenger information comprises:
    determining a manner in which the one or more items are to be presented within the transportation apparatus based on the information related to the transportation apparatus and the passenger information.
PCT/CN2016/113726 2015-12-31 2016-12-30 Facilitating targeted information delivery through a uav network WO2017114505A1 (en)

Applications Claiming Priority (14)

Application Number Priority Date Filing Date Title
US201562274112P 2015-12-31 2015-12-31
US62/274,112 2015-12-31
US15/341,809 US9800321B2 (en) 2015-12-31 2016-11-02 Facilitating communication with a vehicle via a UAV
US15/341,813 2016-11-02
US15/341,818 2016-11-02
US15/341,813 US9955115B2 (en) 2015-12-31 2016-11-02 Facilitating wide view video conferencing through a drone network
US15/341,797 2016-11-02
US15/341,824 US9826256B2 (en) 2015-12-31 2016-11-02 Facilitating multimedia information delivery through a UAV network
US15/341,831 US9786165B2 (en) 2015-12-31 2016-11-02 Facilitating location positioning service through a UAV network
US15/341,831 2016-11-02
US15/341,809 2016-11-02
US15/341,797 US10454576B2 (en) 2015-12-31 2016-11-02 UAV network
US15/341,818 US20170193556A1 (en) 2015-12-31 2016-11-02 Facilitating targeted information delivery through a uav network
US15/341,824 2016-11-02

Publications (1)

Publication Number Publication Date
WO2017114505A1 true WO2017114505A1 (en) 2017-07-06

Family

ID=59165112

Family Applications (6)

Application Number Title Priority Date Filing Date
PCT/CN2016/113592 WO2017114496A1 (en) 2015-12-31 2016-12-30 Facilitating location positioning service through a uav network
PCT/CN2016/113718 WO2017114501A1 (en) 2015-12-31 2016-12-30 Uav network
PCT/CN2016/113725 WO2017114504A1 (en) 2015-12-31 2016-12-30 Facilitating wide-view video conferencing through a uav network
PCT/CN2016/113726 WO2017114505A1 (en) 2015-12-31 2016-12-30 Facilitating targeted information delivery through a uav network
PCT/CN2016/113728 WO2017114506A1 (en) 2015-12-31 2016-12-30 Facilitating multimedia information delivery through uav network
PCT/CN2016/113724 WO2017114503A1 (en) 2015-12-31 2016-12-30 Facilitating communication with a vehicle via a uav

Family Applications Before (3)

Application Number Title Priority Date Filing Date
PCT/CN2016/113592 WO2017114496A1 (en) 2015-12-31 2016-12-30 Facilitating location positioning service through a uav network
PCT/CN2016/113718 WO2017114501A1 (en) 2015-12-31 2016-12-30 Uav network
PCT/CN2016/113725 WO2017114504A1 (en) 2015-12-31 2016-12-30 Facilitating wide-view video conferencing through a uav network

Family Applications After (2)

Application Number Title Priority Date Filing Date
PCT/CN2016/113728 WO2017114506A1 (en) 2015-12-31 2016-12-30 Facilitating multimedia information delivery through uav network
PCT/CN2016/113724 WO2017114503A1 (en) 2015-12-31 2016-12-30 Facilitating communication with a vehicle via a uav

Country Status (2)

Country Link
CN (9) CN106878672A (en)
WO (6) WO2017114496A1 (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017119686A1 (en) * 2017-08-28 2019-02-28 Andreas Rheinländer Surveillance, exploration and inspection system using drones
CN107809277B (en) * 2017-10-17 2020-02-14 安徽工业大学 Emergency rescue communication network networking method based on unmanned aerial vehicle and wireless equipment
EP3716688A4 (en) 2017-12-08 2020-12-09 Beijing Xiaomi Mobile Software Co., Ltd. Data transmission method and apparatus, and unmanned aerial vehicle
US20190197890A1 (en) * 2017-12-27 2019-06-27 GM Global Technology Operations LLC Methods, systems, and drones for assisting communication between a road vehicle and other road users
CN108471327A (en) * 2018-03-26 2018-08-31 广东工业大学 A kind of UAV Communication system
EP3849901A4 (en) * 2018-09-13 2022-05-18 CommScope Technologies LLC Location of assets deployed in ceiling or floor spaces or other inconvenient spaces or equipment using an unmanned vehicle
US20210373552A1 (en) * 2018-11-06 2021-12-02 Battelle Energy Alliance, Llc Systems, devices, and methods for millimeter wave communication for unmanned aerial vehicles
CN109582036B (en) * 2018-12-03 2021-04-27 南京航空航天大学 Consistency formation control method for quad-rotor unmanned aerial vehicle
CN110048762A (en) * 2019-04-23 2019-07-23 南京工业职业技术学院 A kind of implementation method of the air communication network based on solar energy unmanned plane
CN110321951B (en) * 2019-07-01 2021-03-16 青岛海科虚拟现实研究院 VR simulated aircraft training evaluation method
CN110944149A (en) * 2019-11-12 2020-03-31 上海博泰悦臻电子设备制造有限公司 Child care system and method for vehicle
CN111865395B (en) * 2020-06-12 2022-07-05 哈尔滨工业大学(深圳)(哈尔滨工业大学深圳科技创新研究院) Trajectory generation and tracking method and system for unmanned aerial vehicle formation communication
CN112906486B (en) * 2021-01-26 2023-09-12 吉利汽车研究院(宁波)有限公司 Passenger condition detection method, control method and system for unmanned taxi
CN114940180A (en) * 2021-02-10 2022-08-26 华为技术有限公司 Control method and device
CN112896193B (en) * 2021-03-16 2022-06-24 四川骏驰智行科技有限公司 Automobile remote auxiliary driving system and method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102654940A (en) * 2012-05-23 2012-09-05 上海交通大学 Traffic information acquisition system based on unmanned aerial vehicle and processing method of traffic information acquisition system
US9170117B1 (en) * 2014-08-21 2015-10-27 International Business Machines Corporation Unmanned aerial vehicle navigation assistance
US20150355309A1 (en) * 2014-06-05 2015-12-10 University Of Dayton Target tracking implementing concentric ringlets associated with target features

Family Cites Families (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060140644A1 (en) * 2004-12-23 2006-06-29 Paolella Arthur C High performance, high efficiency fiber optic link for analog and RF systems
US9167195B2 (en) * 2005-10-31 2015-10-20 Invention Science Fund I, Llc Preservation/degradation of video/audio aspects of a data stream
US20070131822A1 (en) * 2005-06-20 2007-06-14 Kevin Leigh Taylor Stallard Aerial and ground robotic system
CN2917134Y (en) * 2006-03-30 2007-06-27 哈尔滨工程大学 DSP-based embedded real-time panoramic image acquisition and processing device
US20080018730A1 (en) * 2006-07-20 2008-01-24 Marc Roth For-hire vehicle interactive communication systems and methods thereof
US7848278B2 (en) * 2006-10-23 2010-12-07 Telcordia Technologies, Inc. Roadside network unit and method of organizing, managing and maintaining local network using local peer groups as network groups
JP2010541504A (en) * 2007-10-05 2010-12-24 パナソニック・アビオニクス・コーポレイション System and method for outputting advertising content to a moving mobile platform
US20100162327A1 (en) * 2008-12-18 2010-06-24 Airvod Limited In-Flight Entertainment System
US8515609B2 (en) * 2009-07-06 2013-08-20 Honeywell International Inc. Flight technical control management for an unmanned aerial vehicle
CN101651992B (en) * 2009-09-18 2011-01-05 北京航空航天大学 Data chain networking method used for autonomous formation of unmanned aerial vehicle
CN101790248B (en) * 2009-09-28 2012-06-20 长春理工大学 Auto-management data link of micro unmanned aerial vehicles
US9041765B2 (en) * 2010-05-12 2015-05-26 Blue Jeans Network Systems and methods for security and privacy controls for videoconferencing
CN102006574B (en) * 2011-01-05 2013-04-24 中国人民解放军理工大学 Wireless self-organized network-based integrated heterogeneous emergency communication network
JP2012212337A (en) * 2011-03-31 2012-11-01 Daihatsu Motor Co Ltd Inter-vehicle communication device and inter-vehicle communication system
EP2511656A1 (en) * 2011-04-14 2012-10-17 Hexagon Technology Center GmbH Measuring system for determining the 3D coordinates of an object surface
US20130002484A1 (en) * 2011-07-03 2013-01-03 Daniel A. Katz Indoor navigation with gnss receivers
DE102011113202A1 (en) * 2011-09-10 2013-03-14 Volkswagen Ag Method for operating a data receiver and data receiver, in particular in a vehicle
CN102436738B (en) * 2011-09-26 2014-03-05 同济大学 Traffic monitoring device based on unmanned aerial vehicle (UAV)
CN102355574B (en) * 2011-10-17 2013-12-25 上海大学 Image stabilizing method of airborne tripod head moving target autonomous tracking system
WO2014172369A2 (en) * 2013-04-15 2014-10-23 Flextronics Ap, Llc Intelligent vehicle for assisting vehicle occupants and incorporating vehicle crate for blade processors
WO2013142470A1 (en) * 2012-03-19 2013-09-26 Sony Mobile Communications Ab Video conferencing using wireless peripheral video conferencing device
US11328325B2 (en) * 2012-03-23 2022-05-10 Secureads, Inc. Method and/or system for user authentication with targeted electronic advertising content through personal communication devices
KR101393539B1 (en) * 2012-09-17 2014-05-09 기아자동차 주식회사 Integrated network system for vehicle
US8971274B1 (en) * 2012-11-09 2015-03-03 Google Inc. Valuation of and marketplace for inter-network links between balloon network and terrestrial network
CN103116994B (en) * 2012-12-28 2015-01-07 方科峰 Transportation system of optical communication and transportation system management method
WO2014179235A1 (en) * 2013-04-29 2014-11-06 Oceus Networks Inc. Mobile cellular network backhaul
US9070289B2 (en) * 2013-05-10 2015-06-30 Palo Alto Research Incorporated System and method for detecting, tracking and estimating the speed of vehicles from a mobile platform
CN103280108B (en) * 2013-05-20 2015-04-22 中国人民解放军国防科学技术大学 Passenger car safety pre-warning system based on visual perception and car networking
US9085370B2 (en) * 2013-06-03 2015-07-21 General Electric Company Systems and methods for wireless data transfer during in-flight refueling of an aircraft
US9503694B2 (en) * 2013-08-13 2016-11-22 GM Global Technology Operations LLC Methods and apparatus for utilizing vehicle system integrated remote wireless image capture
CN103413444B (en) * 2013-08-26 2015-08-19 深圳市川大智胜科技发展有限公司 A kind of traffic flow based on unmanned plane HD video is investigated method
CN203596823U (en) * 2013-09-24 2014-05-14 中国航天空气动力技术研究院 Unmanned plane high-altitude base station communication system
US9324189B2 (en) * 2013-09-27 2016-04-26 Intel Corporation Ambulatory system to communicate visual projections
US20150134143A1 (en) * 2013-10-04 2015-05-14 Jim Willenborg Novel tracking system using unmanned aerial vehicles
US20150127460A1 (en) * 2013-11-04 2015-05-07 Vixs Systems Inc. Targeted advertising based on physical traits and anticipated trajectory
CN103780313A (en) * 2014-01-21 2014-05-07 桂林航天光比特科技股份公司 Laser energy supply communication system for air vehicle
US20150271452A1 (en) * 2014-03-21 2015-09-24 Ford Global Technologies, Llc Vehicle-based media content capture and remote service integration
CN103914076B (en) * 2014-03-28 2017-02-15 浙江吉利控股集团有限公司 Cargo transferring system and method based on unmanned aerial vehicle
CN103985230B (en) * 2014-05-14 2016-06-01 深圳市大疆创新科技有限公司 A kind of Notification Method based on image, device and notice system
US9334052B2 (en) * 2014-05-20 2016-05-10 Verizon Patent And Licensing Inc. Unmanned aerial vehicle flight path determination, optimization, and management
CN107703963B (en) * 2014-07-30 2020-12-01 深圳市大疆创新科技有限公司 Target tracking system and method
CN104168455B (en) * 2014-08-08 2018-03-09 北京航天控制仪器研究所 A kind of space base large scene camera system and method
CN104394472B (en) * 2014-11-21 2018-08-03 成都亿盟恒信科技有限公司 A kind of 3G onboard wireless video-on-demand system and method
CN104699102B (en) * 2015-02-06 2017-07-18 东北大学 A kind of unmanned plane and intelligent vehicle collaborative navigation and investigation monitoring system and method
CN104796611A (en) * 2015-04-20 2015-07-22 零度智控(北京)智能科技有限公司 Method and system for remotely controlling unmanned aerial vehicle to implement intelligent flight shooting through mobile terminal
CN104766481A (en) * 2015-04-29 2015-07-08 深圳市保千里电子有限公司 Method and system for unmanned plane to conduct vehicle tracking
CN104881650A (en) * 2015-05-29 2015-09-02 成都通甲优博科技有限责任公司 Vehicle tracking method based on unmanned aerial vehicle (UAV) dynamic platform
CN105139606B (en) * 2015-07-29 2019-04-02 重庆赛乐威航空科技有限公司 A kind of low flyer information interaction system
CN105100728A (en) * 2015-08-18 2015-11-25 零度智控(北京)智能科技有限公司 Unmanned aerial vehicle video tracking shooting system and method
CN105119650B (en) * 2015-08-24 2018-03-02 杨珊珊 Signal relay system and its signal trunking method based on unmanned vehicle
CN204887278U (en) * 2015-09-15 2015-12-16 成都时代星光科技有限公司 Unmanned aerial vehicle is in air from network deployment image transmission system

Also Published As

Publication number Publication date
WO2017114504A1 (en) 2017-07-06
WO2017114503A1 (en) 2017-07-06
CN107046710A (en) 2017-08-15
CN107040754A (en) 2017-08-11
CN107070531A (en) 2017-08-18
CN106878672A (en) 2017-06-20
WO2017114506A1 (en) 2017-07-06
CN206517444U (en) 2017-09-22
CN206481394U (en) 2017-09-08
CN208401845U (en) 2019-01-18
WO2017114496A1 (en) 2017-07-06
WO2017114501A1 (en) 2017-07-06
CN106982345A (en) 2017-07-25
CN107071794A (en) 2017-08-18

Similar Documents

Publication Publication Date Title
WO2017114505A1 (en) Facilitating targeted information delivery through a uav network
US10097862B2 (en) Facilitating multimedia information delivery through a UAV network
US20170193556A1 (en) Facilitating targeted information delivery through a uav network
US10354521B2 (en) Facilitating location positioning service through a UAV network
US10440323B2 (en) Facilitating wide view video conferencing through a drone network
US10454564B2 (en) Facilitating communication with a vehicle via a UAV
US11243530B2 (en) Detection and communication of safety events
CN106092197A (en) Environment detection method and system based on unmanned plane
US20190096215A1 (en) Amber alert monitoring and support
JP2019526846A (en) Passive optical detection method and system for vehicle
US20200126413A1 (en) Uav network assisted situational self-driving
EP3251107A1 (en) Remote accident monitoring and vehcile diagnostic distributed database
US11787346B2 (en) Systems and methods for a housing equipment for a security vehicle
KR101883292B1 (en) Disaster rescue and response system and operating method threrof
US20190014456A1 (en) Systems and methods for collaborative vehicle mission operations
JP6943025B2 (en) Parking support system and parking support method
CN112002145B (en) Unmanned aerial vehicle illegal flight reporting method and system
CN114640794A (en) Camera, camera processing method, server processing method, and information processing apparatus
US20220222473A1 (en) Vehicle communication systems and methods for detecting and capturing relevant object data

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16881294

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 04/10/18)

122 Ep: pct application non-entry in european phase

Ref document number: 16881294

Country of ref document: EP

Kind code of ref document: A1