US20210116918A1 - Control aerial movement of drone based on line-of-sight of humans using devices - Google Patents
- Publication number
- US20210116918A1 (application US 16/985,576)
- Authority
- US
- United States
- Prior art keywords
- drones
- location
- determining
- drone
- sight
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0088—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B64C39/024—Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0094—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
-
- B64C2201/127—
-
- B64C2201/128—
-
- B64C2201/14—
-
- B64C2201/145—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/10—Rotorcrafts
- B64U10/13—Flying platforms
- B64U10/14—Flying platforms with four distinct rotor axes, e.g. quadcopters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/60—UAVs specially adapted for particular uses or applications for transporting passengers; for transporting goods other than weapons
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/10—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
- B64U2201/104—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS] using satellite radio beacon positioning systems, e.g. GPS
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/50—Receiving or transmitting feedback, e.g. replies, status updates, acknowledgements, from the controlled devices
- G08C2201/51—Remote controlling of devices based on replies, status thereof
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/90—Additional features
- G08C2201/93—Remote control using other portable devices, e.g. mobile phone, PDA, laptop
Definitions
- Drones are unmanned aerial vehicles. Some drones can be controlled autonomously by onboard computers while other drones can be controlled via remote control or other means. Drones can be used for a wide array of functionality, from recreational use to commercial use to military use.
- FIG. 1 is a system including a drone capable of tracking humans based on devices and controlling the drone to be within line-of-sight of at least one of the devices, according to an example;
- FIG. 2 is a system including drones capable of tracking humans based on devices and controlling the drones to be within line-of-sight of at least one of the devices, according to an example;
- FIG. 3 is a flowchart of a method for controlling a drone to be within a line-of-sight of a human based on a wearable device on the human, where the wearable device is tracked, according to an example;
- FIG. 4 is a block diagram of a drone capable of tracking humans based on wearable devices and controlling aerial movement of the drone to stay within a line-of-sight of at least one of the humans, according to an example;
- FIG. 5 is a flowchart of a method for controlling a drone to be within a line-of-sight of a human based on a wearable device, where an alert is provided if a rule is triggered based on location information, according to an example.
- Drones can be used for various functions, such as helping optimize crop yield on a farm, monitoring children at a school, monitoring prison inmates, recreation, etc.
- Tracking individual humans with drones can be useful for various reasons. For example, parents may prefer to know the whereabouts of their children. Individuals may be concerned about their elderly parents or loved ones and may wish to track them, with their consent. Further, people may be more comfortable using a car service if a drone tracking system was able to confirm their whereabouts.
- Governments may decide to provide safety rules for usage of drones. For example, governments may require line-of-sight from a human being to the drone, require that an operator be able to control the drone, limit the speed of the drone, provide rules for the altitude of the drone, restrict particular airspace, etc.
- Drones can include multiple sensors. Further, people can have devices (e.g., wearable devices) within a threshold proximity that can help the drone track and monitor the people. Moreover, the devices can also be used to confirm line-of-sight between a human being and a drone. In some examples, information from drones and the devices can be sent to a computing system (e.g., a cloud computing system). The computing system can be used to provide tracking alerts or other services from information provided by the drones and/or wearable devices.
- an entity that controls the drone can use a controlling device, such as a handheld controller with communication to the drone, a mobile phone, another wearable device, etc. to communicate with the drone to control the drone. Further, in some examples, the communication can be routed (e.g., via the Internet) to the drone. Moreover, in some examples, the computing system can be used to control the drone via a communication infrastructure (e.g., using cellular communications).
- sensory information can be collected from the devices associated with humans and the drones to allow the computing system to provide services.
- a cloud-based application can provide an information hub for subscribers. People can register their devices to the service. When certain conditions are met (e.g., the device is on and at a particular location), the device can provide information to the computing system. The computing system can also coordinate with the drones. Thus, the computing system can act as an information hub allowing for processing of the data to provide services, such as notifications, data retention (e.g., via video from the drone, location of the drone, location information about the devices from the drone and/or the devices themselves, etc.). Moreover, alerts can be integrated into existing systems via an Application Programming Interface (API).
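The register-then-report flow described above can be sketched as follows. This is an illustrative sketch only; `InformationHub`, `DeviceReport`, and the acceptance conditions (device registered and powered on) are hypothetical names standing in for the cloud-based application, not anything defined by the patent:

```python
from dataclasses import dataclass, field


@dataclass
class DeviceReport:
    """Location/status information a wearable device might send to the hub."""
    device_id: str
    lat: float
    lon: float
    powered_on: bool


@dataclass
class InformationHub:
    """Hypothetical cloud hub: stores device reports, ignoring the rest
    (alerting, drone coordination) for brevity."""
    registered: set = field(default_factory=set)
    reports: dict = field(default_factory=dict)

    def register(self, device_id: str) -> None:
        self.registered.add(device_id)

    def ingest(self, report: DeviceReport) -> bool:
        # Only accept reports when the example conditions are met:
        # the device is registered with the service and powered on.
        if report.device_id not in self.registered or not report.powered_on:
            return False
        self.reports[report.device_id] = report
        return True


hub = InformationHub()
hub.register("watch-001")
accepted = hub.ingest(DeviceReport("watch-001", 40.0, -75.0, True))
rejected = hub.ingest(DeviceReport("watch-999", 40.0, -75.0, True))
```

A real hub would expose this over an API and fan alerts out to subscribers; the sketch only shows the conditional ingestion step.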
- Example systems can include emergency management systems (EMS) such as Amber Alert or other emergency notification systems.
- the drone can follow the child and an alert can go out (e.g., to the parent, the school, an entity controlling the drone, combinations thereof, etc.).
- Similar integrations can occur for proprietary systems (e.g., an alert to a prison guard for a prison security context).
- FIG. 1 is a system including a drone capable of tracking humans based on devices and controlling the drone to be within line-of-sight of at least one of the devices, according to an example.
- the system 100 can include a drone 110 that communicates with devices 150 a - 150 n . Communications can occur via a communication network (e.g., via network connections, via the Internet, using cell technologies, etc.), transmissions from the devices to and from the drone (e.g., using radio frequency transmissions), etc.
- the drone 110 includes components to perform aerial movement, sensors, and computing components.
- a navigation engine 120 can be used to control movement of the drone 110 .
- the line-of-sight engine 122 can use sensor information to determine whether each of the devices 150 a - 150 n associated with respective humans 152 a - 152 n and/or the humans 152 a - 152 n are within line-of-sight.
- sensors can be used for various activities by the drone 110 .
- FIG. 2 is a system 200 including drones capable of tracking humans based on devices and controlling the drones to be within line-of-sight of at least one of the devices, according to an example.
- line-of-sight represents the visibility of a human to the drone.
- Line-of-sight can be defined by criteria used by the drone.
- the system 200 can include drone 110 as well as drones 210 a - 210 m , the devices 150 associated with respective humans 152 a - 152 n , a drone control device 160 to control one or more of the drones 110 , 210 , and a platform 170 that can provide services based on information provided by the drones 110 , 210 and/or devices 150 a - 150 n .
- the drone 110 can also include rules 124 , a location engine 126 , sensors 128 , a tracking engine 130 , an alert engine 132 , and aerial components 134 .
- the drone 110 may include at least one processor 230 , memory 232 , and input/output interfaces 234 .
- the navigation engine 120 can control aerial movement of the drone by using aerial components.
- aerial components 134 can include one or more motors to control moving parts of the drone.
- the navigation engine 120 can control the motors to turn propellers to control drone movement.
- Various technologies can be used to implement drone motion (e.g., by creating aerodynamic forces to create lift of the drone). Examples of types of drones can include helicopter drones, quadcopter drones, plane drones, etc.
- Drones can also come in various sizes. For example, a drone may be a large size (e.g., more than 55 pounds), a smaller size (e.g., under 55 pounds) that may be regulated under different rules, an even smaller size (e.g., less than 4.4 pounds), etc. Size examples can vary and can be determined based on one or more regulations from a government entity (e.g., the Federal Aviation Administration).
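The weight buckets above can be expressed as a simple classifier. This is a sketch using only the example thresholds from the text (55 and 4.4 pounds); the bucket names are invented here, and real regulatory categories vary by jurisdiction:

```python
def drone_size_category(weight_lb: float) -> str:
    """Map a drone weight to an illustrative size bucket.

    Thresholds (55 lb, 4.4 lb) are the examples given in the description;
    the category names ("large"/"small"/"micro") are hypothetical labels.
    """
    if weight_lb > 55.0:
        return "large"
    if weight_lb < 4.4:
        return "micro"
    return "small"
```

A rules engine could key different speed, altitude, or line-of-sight rules off the returned category.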
- the line-of-sight engine 122 can be used to determine whether the drone is within line-of-sight with at least one of the devices 150 a - 150 n within a physical proximity to a human 152 a - 152 n .
- physical proximity to a human means that the device is located on the human (e.g., a cell phone in a pocket, a smart device in a hat, a smart watch, a bracelet, smart glasses, etc.).
- the devices are wearable devices. Wearable devices are clothing and accessories incorporating computer or advanced electronic technologies.
- the devices 150 a - 150 n are located external to the human.
- a hat may include technology to allow the drone 110 to determine that there is line-of-sight with the human and/or the device 150 .
- the device 150 can include global positioning system (GPS) technology.
- the GPS technology can be used to determine a position of the device.
- the position can be provided to the drone 110 either directly via input/output interfaces 234 or via a platform 170 .
- the device 150 may include a locator (e.g., a beacon such as a radio beacon, an infrared beacon, Wi-Fi beacon, etc.).
- Drones 110 , 210 can use the beacon to determine the position of the device 150 and/or human (e.g., via triangulation).
- sensors from multiple drones can be used to triangulate the position of a human.
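Triangulating a position from beacon ranges measured by multiple drones can be sketched as follows. This is an illustrative 2-D trilateration (three range circles linearized into a 2x2 system) with no noise handling; the patent does not specify a particular algorithm:

```python
def trilaterate(p1, r1, p2, r2, p3, r3):
    """Estimate a 2-D position from three (position, range) pairs, e.g.,
    beacon ranges measured by three drones at known positions.

    Subtracting the first circle equation from the other two yields a
    linear 2x2 system in (x, y), solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1 ** 2 - r2 ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1 ** 2 - r3 ** 2 + x3 ** 2 - x1 ** 2 + y3 ** 2 - y1 ** 2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-9:
        raise ValueError("drone positions are collinear; cannot trilaterate")
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

With noisy real-world range measurements, a least-squares fit over more than three drones would be the more robust choice.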
- a combination of technologies can be used to determine whether the drone 110 has a line-of-sight with the respective devices 150 and/or the associated human 152 .
- the drone 110 may use the GPS or beacon technology to determine the general location of the device and/or human and then use a sensor 128 (e.g., an image sensor, an infrared sensor, etc.) to confirm that there is a line-of-sight between the drone 110 and the respective device and/or human.
- an infrared beacon can be used to confirm line-of-sight.
- the location of the device can be used to determine a place for the drone 110 to look for a human.
- the drone 110 can use a sensor 128 , such as an optical sensor, to look for a human in that area. Recognition technology can be used to determine whether there is a human in that area. If so, then there is line-of-sight to that human.
- the device itself can be used for line-of-sight determination in one step.
- the device can send out a beacon that can be received by the drone 110 if there is line-of-sight (e.g., an infrared beacon). If the drone 110 receives the signal, then there is line-of-sight.
- these beacons can be configured to provide a signal that carries over a particular range. As such, distance can be taken into account.
- the line-of-sight determination can be based on line-of-sight based communications.
- the tracking of the devices/humans can be used for dual purposes.
- the first purpose is to track the respective humans (e.g., to ensure a location of the humans 152 are within particular parameters).
- the second purpose is to ensure that drones 110 are within line-of-sight of at least one human.
- the drone 110 can be controlled via the navigation engine 120 to become within line-of-sight of at least one human.
- Rules 124 can be used to specify when the drone 110 is not within line-of-sight.
- the rules 124 can include a rule that specifies distance criteria such as a threshold distance between the drone 110 and the devices.
- the location of the devices 150 /humans 152 can be determined. Also, the location of the drone 110 can be determined.
- the location of the drone 110 can be maintained using sensors 128 (e.g., accelerometer, gyroscope, compass, GPS, cell tower tracking, other positioning systems, altimeter, thermal sensors, combinations thereof, etc.).
- the location engine 126 can determine the location of the drone and a location of the respective devices 150 .
- the distance criterion can be customizable, indicating a distance that can be deemed to be associated with a lack of line-of-sight.
- the criteria can take into account a size of the drone 110 and a sighting capability of a human being (e.g., at a particular visual acuity).
- the criteria may also take into account dynamic visibility conditions, such as weather.
- if the criteria are met, the particular human and associated device can be considered to possibly lack a line-of-sight with the drone 110 .
- a line-of-sight determination can be based on the criteria.
- the criteria can be used to determine a potential lack of line-of-sight and another sensor 128 can be used to confirm a lack of line-of-sight.
- the distance criteria can be used to determine whether there is a potential lack of line-of-sight and an image sensor, infrared sensor, etc. can be used to confirm a lack of line-of-sight or confirm that there is a line-of-sight.
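The distance criterion above can be sketched as a great-circle distance check against a threshold scaled for drone size and visibility. The scaling factors and threshold here are illustrative assumptions; the description only says the criterion takes size and conditions like weather into account:

```python
import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def potential_los_lost(drone, device, base_threshold_m,
                       drone_size_factor=1.0, weather_factor=1.0):
    """Flag a *potential* lack of line-of-sight (sketch).

    `drone` and `device` are (lat, lon) pairs. The base threshold is scaled
    by illustrative size and weather factors; a larger drone or clear
    weather would raise the factor, haze would lower it. A True result
    would then be confirmed by another sensor (image, infrared, etc.).
    """
    threshold = base_threshold_m * drone_size_factor * weather_factor
    return haversine_m(*drone, *device) > threshold
```

This implements only the first stage; confirmation by an image or infrared sensor, as the description suggests, would follow a True result.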
- three-dimensional maps of the terrain can be used to determine whether the location a human is at has line-of-sight with the drone 110 based on obstructions. This can be based on a land location of the user and any obstructions.
- sensor data (e.g., image data, sonar data, etc.) can be used to generate or update the three-dimensional maps.
- the three-dimensional map processing can be accomplished at the drones 110 .
- the three-dimensional map processing can be off-loaded to a platform 170 , such as a cloud system or computing system.
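The terrain-obstruction check described above can be sketched as sampling points along the sight line and comparing them against a terrain elevation model. The sampling approach and parameters are illustrative assumptions, not a method specified by the patent:

```python
def terrain_blocks_sight(elevation, drone, human, samples=50):
    """Check whether terrain obstructs the drone-to-human sight line (sketch).

    `elevation` is a callable (x, y) -> ground elevation, standing in for a
    three-dimensional terrain map. `drone` and `human` are (x, y, z) points
    in a local frame. Points are sampled along the straight line between
    them; if the terrain rises above the line anywhere, sight is blocked.
    """
    x0, y0, z0 = drone
    x1, y1, z1 = human
    for i in range(1, samples):
        t = i / samples
        x = x0 + t * (x1 - x0)
        y = y0 + t * (y1 - y0)
        z = z0 + t * (z1 - z0)
        if elevation(x, y) > z:
            return True
    return False
```

This is the kind of computation that, per the description, could run on the drone or be off-loaded to the platform 170 when the map is large.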
- rules 124 can include an action to take by the drone 110 based on what criteria has been fulfilled.
- the drone 110 can be caused to return to a line-of-sight of at least one of the devices 150 based on a determination of a potential lack of line-of-sight or a confirmed lack of line-of-sight.
- the rules 124 may further specify where the drone 110 is to go. For example, the drone 110 can be instructed to move to within a certain distance of one of the devices 150 (e.g., a closest one of the devices 150 ).
- the drone 110 can have a pre-determined path to take, a dynamic path to take, a static area to patrol, a dynamic area to patrol (e.g., based on locations of the devices 150 ), etc.
- the drone 110 may be instructed to move to within a certain location or distance from one of the devices 150 within its future path.
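Picking which device to return toward, such as the closest one, can be sketched as a minimum-distance selection. The planar coordinate frame and dictionary layout here are simplifying assumptions:

```python
import math


def nearest_device(drone_xy, devices):
    """Return the id of the device closest to the drone (sketch).

    `drone_xy` is an (x, y) position and `devices` maps device id to
    (x, y), both in a local planar frame for simplicity. A rule could
    direct the drone toward this device when line-of-sight is lost.
    """
    return min(devices, key=lambda d: math.hypot(devices[d][0] - drone_xy[0],
                                                 devices[d][1] - drone_xy[1]))
```

On geographic coordinates, the great-circle distance would replace `math.hypot`.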
- the lack of line-of-sight is based on a determination that one of the devices is within the criteria and the other devices are already determined to lack the line-of-sight.
- one of the rules 124 can specify boundary criteria for the humans 152 and/or devices 150 .
- the tracking engine 130 can determine whether the location of a device meets the boundary criteria.
- the boundary criteria can include a set of location coordinates that can be mapped.
- the alert engine 132 can determine an alert based on criteria, such as the boundary criteria. If a respective device 150 is outside of the boundary or within a boundary threshold, an alert can be triggered. The alert can cause the drone 110 to go to the respective device 150 .
- boundary criteria may be used to monitor children at a school or playground. If a child moves past the boundary, particular actions can be taken by the drone 110 , such as the navigation engine 120 moving the drone 110 towards the child, sending an alert to a user (e.g., a registered parent), causing video to start recording and target the child, etc.
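Evaluating boundary criteria given "a set of location coordinates that can be mapped" can be sketched as a standard ray-casting point-in-polygon test. The algorithm choice is an assumption; the patent does not name one:

```python
def inside_boundary(point, polygon):
    """Ray-casting point-in-polygon test (sketch).

    `polygon` is a list of (x, y) boundary coordinates; `point` is the
    tracked device's location in the same frame. Casts a ray from the
    point and counts edge crossings: an odd count means inside. An alert
    rule would fire when this returns False for a monitored device.
    """
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge spans the ray's y-level
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

A "boundary threshold" warning zone, as mentioned above, could be implemented by testing against a slightly shrunken copy of the polygon.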
- the devices 150 and drones 110 , 210 can be used to track inmates.
- the drones 110 , 210 can be used for agriculture.
- the drones 110 , 210 can be used to track worker movement while also performing other activities.
- the drones can be used to spray pesticides, irrigate, etc. over a portion of a field.
- the devices 150 can be used to ensure that humans 152 are not in the field during spray.
- the line-of-sight engine 122 can be used to ensure that proper supervision of the drone occurs.
- one or more of the drones 110 , 210 can be controlled by a drone control device 160 (e.g., a remote control, a mobile device using an app, etc.).
- the rules 124 can be used to implement other functionality.
- a rule 124 can specify conditions that show that a respective device 150 is not within proximity of an associated human 152 .
- the device 150 may include accelerometer information that can be sent to the drone 110 and/or platform 170 .
- the accelerometer information can be compared to a profile or other function to determine whether anomalous behavior is present.
- anomalous behavior includes no motion from the device 150 .
- a device 150 located on a human 152 would show some motion (e.g., from breathing). Therefore, the lack of any movement could show that the human 152 is no longer associated with the device 150 .
- the rule 124 can further specify that in response to the condition occurring, the navigation engine 120 controls aerial movement of the drone 110 towards a location of the device 150 and/or human 152 based on the recognition that the device is not within proximity of the human 152 .
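The "no motion" anomaly check above can be sketched as a variance test over recent accelerometer samples: a worn device shows small motions (e.g., from breathing), so near-zero variance suggests the device is no longer on the human. The variance threshold is an illustrative assumption:

```python
def device_separated(accel_samples, min_variance=1e-4):
    """Detect anomalous lack of motion (sketch).

    `accel_samples` is a window of recent accelerometer magnitude readings.
    Returns True when the variance is below `min_variance` (an illustrative
    threshold), i.e., the device is suspiciously still and may no longer be
    within proximity of its human.
    """
    n = len(accel_samples)
    if n < 2:
        return False  # not enough data to decide
    mean = sum(accel_samples) / n
    variance = sum((s - mean) ** 2 for s in accel_samples) / n
    return variance < min_variance
```

Per the rule described above, a True result could trigger the navigation engine to fly the drone toward the device's last known location.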
- the drone 110 , 210 can include an altimeter and the drone 110 , 210 can have an altitude range to fly within.
- the drone 110 , 210 can include rules 124 to keep the drones 110 , 210 from flying directly overhead of a human. Rules 124 can be used in conjunction with sensors 128 to provide various navigation adjustments.
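The altitude-range and no-direct-overhead rules above can be sketched as a position adjustment. The altitude limits and standoff distance are illustrative assumptions, not values from the patent:

```python
import math


def adjust_position(drone, humans, alt_range=(10.0, 120.0), standoff_m=5.0):
    """Apply two navigation rules to a proposed drone position (sketch).

    `drone` is (x, y, z) in a local frame; `humans` is a list of (x, y)
    positions. Clamps altitude into `alt_range` and nudges the drone
    laterally so it is never within `standoff_m` of directly overhead of
    a human. All limits are illustrative.
    """
    x, y, z = drone
    z = max(alt_range[0], min(alt_range[1], z))
    for hx, hy in humans:
        dx, dy = x - hx, y - hy
        d = math.hypot(dx, dy)
        if d < standoff_m:
            if d == 0:
                dx, dy, d = 1.0, 0.0, 1.0  # arbitrary direction if exactly overhead
            scale = standoff_m / d
            x, y = hx + dx * scale, hy + dy * scale
    return (x, y, z)
```

In practice such adjustments would feed the navigation engine each control cycle alongside sensor updates.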
- the platform 170 can import government regulations and base the rules on the government regulations.
- the platform 170 can be a computing system, such as a cloud computing system that can be used to communicate (e.g., with the drone 110 , with devices 150 , with other devices, etc.) via a communication engine 174 .
- Various functionality described herein as being performed by the drone 110 can be offloaded to the platform 170 .
- the location of the devices 150 a - 150 n can be determined by the platform 170 .
- information about the drones 110 , 210 can be sent to the platform 170 via a communication engine 174 .
- the information can include locations, navigation programming, etc.
- the information from the devices 150 can be sent to the communication engine 174 .
- the information can include location of the devices, other sensor information (e.g., accelerometer information), etc.
- the platform 170 can perform data analytics on the information to determine whether one or more rules or alerts are triggered. If certain rules are triggered, the control engine 176 can send the drone 110 instructions to move accordingly (e.g., to a device location, within a boundary, etc.).
- an alert can be sent to the user.
- the subscription engine 172 can be used to register users to a database.
- the database can include devices 150 and/or humans 152 that the registered user is interested in. If an alert occurs, the user can be sent an alert.
- the user can be an administrator at the school, a parent, etc. In the prison example, the user can be a warden, a prison guard, etc. Similar examples can be used for tracking others such as elderly people, disabled people, etc.
- Users can register devices 150 with humans 152 and an alert location (e.g., an email address, a phone number, etc.). When criteria associated with an alert are met, the subscription engine 172 can cause sending of an alert to the alert location. Further, in some scenarios, other information can be provided, such as a video feed of the human 152 , control over a drone looking for the human, etc.
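The registration-and-alert flow above can be sketched as a small registry that fans a triggered alert out to every registered alert location. The class name, the example addresses, and the record-instead-of-send behavior are all hypothetical:

```python
from dataclasses import dataclass, field


@dataclass
class SubscriptionEngine:
    """Hypothetical subscription registry (sketch).

    Maps a tracked device id to the alert locations (emails, phone numbers)
    that should be notified, and records dispatched alerts in `sent`
    instead of actually delivering them.
    """
    registry: dict = field(default_factory=dict)
    sent: list = field(default_factory=list)

    def register(self, device_id, alert_location):
        self.registry.setdefault(device_id, []).append(alert_location)

    def trigger(self, device_id, message):
        # Fan the alert out to every location registered for this device;
        # return how many notifications were dispatched.
        locations = self.registry.get(device_id, [])
        for location in locations:
            self.sent.append((location, message))
        return len(locations)


engine = SubscriptionEngine()
engine.register("watch-001", "parent@example.com")   # hypothetical addresses
engine.register("watch-001", "+1-555-0100")
notified = engine.trigger("watch-001", "device left the school boundary")
```

A production version would deliver via email/SMS gateways and, per the description, could also call out to external systems through APIs.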
- the alerts can be sent to an emergency alert system.
- the communication engine 174 can use APIs to communicate the alerts to systems associated with the triggered rule. For example, in a rule context of a missing child, an API associated with an Amber Alert system can be used.
- multiple drones 110 , 210 a - 210 m can be used to monitor multiple humans 152 a - 152 n .
- the drones 110 , 210 can be coordinated via navigation engines and/or a platform 170 that can centralize control.
- Multiple drones 110 , 210 can be used to track humans 152 associated with devices 150 (e.g., wearable devices) as well as to keep within line-of-sight of at least one of the humans 152 . This can ensure that the drone is supervised while also ensuring that the humans are tracked.
- a communication network can be used to connect one or more of the drones 110 , 210 , devices 150 , platform 170 , other devices, etc.
- the communication network can use wired communications, wireless communications, or combinations thereof.
- the communication network can include multiple sub communication networks such as data networks, wireless networks, telephony networks, etc.
- Such networks can include, for example, a public data network such as the Internet, local area networks (LANs), wide area networks (WANs), metropolitan area networks (MANs), cable networks, fiber optic networks, combinations thereof, or the like.
- wireless networks may include cellular networks, satellite communications, wireless LANs, etc.
- the communication network can be in the form of a direct network link between devices.
- Various communications structures and infrastructure can be utilized to implement the communication network(s).
- devices 150 and drones 110 , 210 can have multiple means of communication.
- devices can communicate with each other and other components with access to the communication network via a communication protocol or multiple protocols.
- a protocol can be a set of rules that defines how nodes of the communication network interact with other nodes.
- communications between network nodes can be implemented by exchanging discrete packets of data or sending messages. Packets can include header information associated with a protocol (e.g., information on the location of the network node(s) to contact) as well as payload information.
- the engines 120 , 122 , 126 , 130 , 132 , 172 , 174 , 176 include hardware and/or combinations of hardware and programming to perform functions provided herein.
- the modules can include programming functions and/or combinations of programming functions to be executed by hardware as provided herein.
- functionality attributed to an engine can also be attributed to the corresponding module and vice versa.
- functionality attributed to a particular module and/or engine may also be implemented using another module and/or engine.
- a processor 230 such as a central processing unit (CPU) or a microprocessor suitable for retrieval and execution of instructions and/or electronic circuits can be configured to perform the functionality of any of the engines described herein.
- instructions and/or other information such as location information, registration information, etc., can be included in memory 232 or other memory.
- Input/output interfaces 234 may additionally be provided by the drone 110 .
- some components can be utilized to implement functionality of other components described herein.
- Input/output devices such as communication devices like network communication devices or wireless devices can also be considered devices capable of using the input/output interfaces 234 .
- Each of the modules may include, for example, hardware devices including electronic circuitry for implementing the functionality described herein.
- each module may be implemented as a series of instructions encoded on a machine-readable storage medium of a computing device and executable by a processor. It should be noted that, in some embodiments, some modules are implemented as hardware devices, while other modules are implemented as executable instructions.
- FIG. 3 is a flowchart of a method for controlling a drone to be within a line-of-sight of a human based on a wearable device on the human, where the wearable device is tracked, according to an example.
- FIG. 4 is a block diagram of a drone capable of tracking humans based on wearable devices and controlling aerial movement of the drone to stay within a line-of-sight of at least one of the humans, according to an example.
- method 300 may be implemented in the form of executable instructions stored on a machine-readable storage medium, such as storage medium 420 , and/or in the form of electronic circuitry.
- The drone 400 includes, for example, a processor 410 and a machine-readable storage medium 420 including instructions 422, 424, 426 for controlling the drone 400 according to rules and information about wearable devices located on humans.
- Processor 410 may be at least one central processing unit (CPU), at least one semiconductor-based microprocessor, at least one graphics processing unit (GPU), other hardware devices suitable for retrieval and execution of instructions stored in machine-readable storage medium 420, or combinations thereof.
- The processor 410 may include multiple cores on a chip, multiple cores across multiple chips, multiple cores across multiple devices (e.g., between the drone and a cloud system), or combinations thereof.
- Processor 410 may fetch, decode, and execute instructions 422, 424, 426 to implement tracking of users with wearable devices and changing aerial movement based on a line-of-sight with one or more of the users/wearable devices.
- Processor 410 may include at least one integrated circuit (IC), other control logic, other electronic circuits, or combinations thereof that include a number of electronic components for performing the functionality of instructions 422, 424, 426.
- Machine-readable storage medium 420 may be any electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions.
- The machine-readable storage medium may be, for example, Random Access Memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage drive, a Compact Disc Read Only Memory (CD-ROM), and the like.
- The machine-readable storage medium can be non-transitory.
- Machine-readable storage medium 420 may be encoded with a series of executable instructions for controlling aerial movement of a drone based on a location of a wearable device.
- Aerial movement instructions 424 can be executed by processor 410 to control the drone 400 (e.g., by controlling aerial components of the drone).
- The drone 400 can be controlled using programmed instructions executed by the processor 410.
- The drone 400 can be set to patrol an area, can be set to follow a pattern, can be set to dynamically alter the patrol or pattern based on conditions (e.g., movement of tracked devices, weather, etc.), or the like.
- The drone 400 can receive other control instructions from a control unit (e.g., a remote control, a remote application on a smart device, etc.).
- The drone 400 can be used to track humans using wearable devices.
- Line-of-sight instructions 422 can be executed by processor 410 to determine whether the drone 400 is within a line-of-sight of the wearable device and/or the respective humans (304).
- The drone 400 can use this information to determine whether it is within line-of-sight of at least one of the humans. Further, the drone 400 can track the respective humans using the wearable devices.
- The drone 400 can determine that it is within a buffer distance from one of the wearable devices, indicative of a possible lack of line-of-sight to the wearable device, and that there is a lack of line-of-sight from the other wearable devices.
- As used herein, the term "possible lack of line-of-sight" means either that the drone does not yet lack the line-of-sight but is within the buffer distance, or that the drone does in fact lack the line-of-sight.
- The determination can be according to a rule and sensor information.
- The lack of the line-of-sight of the other wearable devices can be based on location information received from the wearable devices, sensor data captured at the drone, combinations thereof, etc.
- A certain distance can be indicative of a lack of line-of-sight.
- The distance can be augmented by weather conditions (e.g., fog, cloudiness, etc.).
- The lack of line-of-sight can be determined based on a visual or infrared sensor on the drone 400, laser communication between the drone 400 and wearable devices, etc.
- The wearable device may be a head device, such as a cap or helmet.
- The head device can include a beacon that can be used to facilitate the line-of-sight determination.
- The buffer distance is a threshold distance that is smaller in value than a distance indicative of a lack of line-of-sight.
- The buffer distance can be used to trigger adjustment of the drone's path before the drone 400 loses the line-of-sight from the one wearable device.
- The aerial movement instructions 424 can then be executed to move the drone 400 towards the wearable device. This ensures that at least one of the many wearable devices is within a line-of-sight of the drone.
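- The buffer-distance logic above can be illustrated with a short sketch. The threshold values, the two-dimensional distance model, and the function names below are illustrative assumptions made for this sketch only, not part of the disclosure:

```python
import math

# Illustrative thresholds (assumed values, not from the disclosure): beyond
# LOS_LIMIT the drone is deemed to lack line-of-sight; the smaller BUFFER
# distance triggers corrective movement before that happens.
LOS_LIMIT = 500.0  # meters
BUFFER = 400.0     # meters

def distance(a, b):
    """Euclidean distance between two (x, y) positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def los_status(drone_pos, device_pos):
    """Classify one wearable device relative to the drone."""
    d = distance(drone_pos, device_pos)
    if d >= LOS_LIMIT:
        return "no-los"         # deemed lack of line-of-sight
    if d >= BUFFER:
        return "possible-lack"  # within the buffer distance: act early
    return "los"

def pick_target(drone_pos, devices):
    """Return the closest device to move towards when no device is safely
    within line-of-sight, or None when no correction is needed."""
    if any(los_status(drone_pos, p) == "los" for p in devices.values()):
        return None
    return min(devices, key=lambda d: distance(drone_pos, devices[d]))
```

For example, with the drone at the origin and devices at 450 m and 600 m, the 450 m device is only in the buffer zone, so it is selected as the movement target.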
- Government regulations may specify that a line-of-sight between a human and a drone be maintained.
- One of the wearable devices can be selected based on a triggered rule by executing selection instructions 426.
- Triggers can include the wearer of the wearable device moving outside of a boundary, indications that the wearable device is no longer associated with the wearer, a stoppage of communication from the wearable device, etc.
- The processor 410 can determine a location of the wearable device (e.g., based on GPS coordinates, other location information, etc.).
- Aerial movement instructions 424 can be executed to cause the drone 400 to move towards the location.
- The location can be updated and the drone can follow the wearable device.
- The wearable device can be followed until a manual control signal is received from a control device (e.g., a remote control).
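- The follow behavior can be illustrated with a minimal pursuit loop. The step size, standoff distance, and update model are assumptions made for this sketch, not part of the disclosure:

```python
import math

def follow(drone_pos, device_track, step=5.0, standoff=10.0):
    """Move a drone towards successive reported device locations.

    drone_pos: (x, y) starting position of the drone.
    device_track: iterable of (x, y) updated device locations.
    step: maximum distance moved per update (assumed control constant).
    standoff: hold distance so the drone does not close in completely.
    Returns the drone's final position.
    """
    x, y = drone_pos
    for tx, ty in device_track:
        dx, dy = tx - x, ty - y
        d = math.hypot(dx, dy)
        if d <= standoff:
            continue  # close enough: hold position for this update
        move = min(step, d - standoff)
        x += dx / d * move
        y += dy / d * move
    return (x, y)
```

Each location update pulls the drone at most one step towards the wearable device, which approximates following without modeling real flight dynamics.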
- Alerts can be associated with rules.
- If a rule is triggered, an alert can be sent.
- The alert can be sent to registered user devices.
- FIG. 5 is a flowchart of a method for controlling a drone to be within a line-of-sight of a human based on a wearable device, where an alert is provided if a rule is triggered based on location information, according to an example.
- Although execution of method 500 is described below with reference to a computing system, other suitable components for execution of method 500 can be utilized (e.g., platform 170, other computing devices, cloud computing systems, etc.). Additionally, the components for executing the method 500 may be spread among multiple devices.
- Method 500 may be implemented in the form of executable instructions stored on a machine-readable storage medium, such as storage medium, and/or in the form of electronic circuitry.
- A drone can be configured to move throughout a static or dynamic area.
- The computing system can control the drone's motions (e.g., by sending programming for the drone to execute).
- The computing system receives location information about multiple wearable devices that are monitored by a drone via aerial monitoring.
- The wearable devices can be worn by the respective humans and thus be within a physical proximity to the respective humans.
- The wearable devices can be registered to alerts.
- The alerts may include an identifier of the wearable device and/or a human associated with the wearable device, a rule associated with when the alert should occur, and contact information (e.g., an email address, a phone number, Internet address, etc.) to send the alert.
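- Such a registration might be represented as a simple record; the field names and values below are illustrative assumptions, not from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class AlertRegistration:
    """One registered alert: who is tracked, when to fire, where to notify.

    Field names are illustrative assumptions, not from the disclosure.
    """
    device_id: str  # identifier of the wearable device
    human_id: str   # human associated with the wearable device
    rule: str       # rule describing when the alert should occur
    contact: str    # e-mail address, phone number, or Internet address

registration = AlertRegistration(
    device_id="wearable-17",
    human_id="child-04",
    rule="outside-school-boundary",
    contact="parent@example.com",
)
```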
- The alerts can be sent in response to registration for the alerts.
- For example, a parent may be interested in tracking their child at school and can have an alert associated with when the child is within a particular distance to/from a school boundary, when the child is a certain distance from a teacher wearing another wearable device, etc.
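- A boundary check of this kind can be sketched as a point-in-rectangle test with a warning margin; the rectangular boundary model and the margin value are assumptions for illustration:

```python
def boundary_status(location, boundary, margin=20.0):
    """Check a device location against a rectangular boundary.

    location: (x, y) of the wearable device.
    boundary: (xmin, ymin, xmax, ymax) of the permitted area.
    margin: distance inside the edge that yields a 'near-boundary'
            warning (assumed value for this sketch).
    Returns 'outside', 'near-boundary', or 'inside'.
    """
    x, y = location
    xmin, ymin, xmax, ymax = boundary
    if not (xmin <= x <= xmax and ymin <= y <= ymax):
        return "outside"
    # Distance from the location to the nearest boundary edge.
    edge = min(x - xmin, xmax - x, y - ymin, ymax - y)
    return "near-boundary" if edge <= margin else "inside"
```

The 'near-boundary' result corresponds to the "within a particular distance to/from a school boundary" condition in the example above.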
- The drone can be controlled to stay within a line-of-sight of at least one of the wearable devices from multiple wearable devices (e.g., wearable devices respectively associated with particular children).
- The computing system determines that one of the wearable devices is acting according to a triggered rule based on the received location information.
- Various rules can be used.
- Alerts can be associated with the rules.
- The computing system can provide an associated alert for the rule (e.g., based on registration for the alert).
- The alert can be based on a determination that the wearable device is outside of a boundary associated with the triggered rule based on the location information of the wearable device (e.g., notify a registered user that a child is outside of a school boundary).
- The triggered rule can alert a user that the drone is outside of a line-of-sight of at least one of the wearable devices or is within a buffer distance from at least one of the wearable devices that indicates a possible lack of line-of-sight from the set of wearable devices.
- The rule can indicate that if each of the wearable devices is out of the line-of-sight of the drone and/or within a threshold distance away (e.g., at a buffer range), the rule is triggered.
- The drone can then be controlled to move towards one of the wearable devices.
- The wearable device to move towards can be selected based on criteria (e.g., the closest wearable device to the drone, a close wearable device within a path the drone is on, etc.).
- The computing system can send a control command to cause the drone to move towards the selected wearable device.
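- The determine-select-command flow described above can be summarized in a sketch. The rule predicate (a simple distance threshold), the selection criterion (closest device), and the command format are illustrative assumptions, not the disclosed method:

```python
import math

def monitor(drone_pos, device_locations, los_limit=500.0):
    """Return a move command when the rule is triggered, otherwise None.

    The rule here: every device is at or beyond los_limit, an assumed
    distance deemed to indicate a lack of line-of-sight.

    drone_pos: (x, y) of the drone.
    device_locations: {device_id: (x, y)} received location information.
    """
    def dist(p):
        return math.hypot(p[0] - drone_pos[0], p[1] - drone_pos[1])

    if any(dist(p) < los_limit for p in device_locations.values()):
        return None  # at least one device still in sight: no action
    # Rule triggered: select the closest device and command a move to it.
    target = min(device_locations, key=lambda d: dist(device_locations[d]))
    return {"command": "move-to", "target": target,
            "location": device_locations[target]}
```

A real system would send the returned command to the drone over the communication network; here it is simply returned for inspection.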
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Business, Economics & Management (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Game Theory and Decision Science (AREA)
- Medical Informatics (AREA)
- Traffic Control Systems (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Examples disclosed herein relate to control of a drone. In one example, aerial movement of the drone is controlled. In the example, it is determined, based on a plurality of devices, whether the drone is within a line-of-sight with at least a respective one of a plurality of humans within a physical proximity to a respective one of the devices. In the example, the devices are used by the drone to track the humans. In the example, when the drone is determined to lack the line-of-sight, aerial movement of the drone is controlled to move the drone to become within the line-of-sight.
Description
- This application is a continuation of U.S. application Ser. No. 15/739,589, filed on Dec. 22, 2017 and titled "CONTROL AERIAL MOVEMENT OF DRONE BASED ON LINE-OF-SIGHT OF HUMANS USING DEVICES," which is a national stage application pursuant to 35 U.S.C. § 371 of International Application No. PCT/US2015/037428, filed Jun. 24, 2015, all of which are incorporated herein by reference.
- Drones are unmanned aerial vehicles. Some drones can be controlled autonomously by onboard computers while other drones can be controlled via remote control or other means. Drones can be used for a wide array of functionality, from recreational use to commercial use to military use.
- The following detailed description references the drawings, wherein:
- FIG. 1 is a system including a drone capable of tracking humans based on devices and controlling the drone to be within line-of-sight of at least one of the devices, according to an example;
- FIG. 2 is a system including drones capable of tracking humans based on devices and controlling the drone to be within line-of-sight of at least one of the devices, according to an example;
- FIG. 3 is a flowchart of a method for controlling a drone to be within a line-of-sight of a human based on a wearable device on the human, where the wearable device is tracked, according to an example;
- FIG. 4 is a block diagram of a drone capable of tracking humans based on wearable devices and controlling aerial movement of the drone to stay within a line-of-sight of at least one of the humans, according to an example; and
- FIG. 5 is a flowchart of a method for controlling a drone to be within a line-of-sight of a human based on a wearable device, where an alert is provided if a rule is triggered based on location information, according to an example.
- Recent developments in drone technology unlock a number of opportunities to improve people's lives. Drones can be used for various functions, such as helping optimize crop yield on a farm, monitoring children at a school, monitoring prison inmates, recreation, etc.
- Tracking individual humans with drones can be useful for various reasons. For example, parents may prefer to know the whereabouts of their children. Individuals may be concerned about their elderly parents or loved ones and may wish to track them, with their consent. Further, people may be more comfortable using a car service if a drone tracking system were able to confirm their whereabouts.
- Governments may decide to provide safety rules for usage of drones. For example, governments may require line-of-sight from a human being to the drone, require that an operator be able to control the drone, limit the speed of the drone, provide rules for the altitude of the drone, limit particular airspace, etc.
- Drones can include multiple sensors. Further, people can have devices (e.g., wearable devices) within a threshold proximity that can help the drone track and monitor the people. Moreover, the devices can also be used to confirm line-of-sight between a human being and a drone. In some examples, information from drones and the devices can be sent to a computing system (e.g., a cloud computing system). The computing system can be used to provide tracking alerts or other services from information provided by the drones and/or wearable devices.
- In some examples, an entity that controls the drone can use a controlling device, such as a handheld controller with communication to the drone, a mobile phone, another wearable device, etc. to communicate with the drone to control the drone. Further, in some examples, the communication can be routed (e.g., via the Internet) to the drone. Moreover, in some examples, the computing system can be used to control the drone via a communication infrastructure (e.g., using cellular communications).
- In some examples, sensory information can be collected from the devices associated with humans and the drones to allow the computing system to provide services. For example, a cloud-based application can provide an information hub for subscribers. People can register their devices to the service. When certain conditions are met (e.g., the device is on and at a particular location), the device can provide information to the computing system. The computing system can also coordinate with the drones. Thus, the computing system can act as an information hub allowing for processing of the data to provide services, such as notifications, data retention (e.g., via video from the drone, location of the drone, location information about the devices from the drone and/or the devices themselves, etc.). Moreover, alerts can be integrated into existing systems via an Application Programming Interface (API). Example systems can include emergency management systems (EMS) such as Amber Alert or other emergency notification systems. As such, if the drone is tracking a child at a school and the child is moved outside of a boundary associated with the school, the drone can follow the child and an alert can go out (e.g., to the parent, the school, an entity controlling the drone, combinations thereof, etc.). Similar integrations can occur for proprietary systems (e.g., an alert to a prison guard for a prison security context).
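- The hub-and-notify flow can be sketched as a small dispatch function; the handler registry and alert format are illustrative assumptions, and no real emergency-system API is implied:

```python
def dispatch_alert(alert, handlers):
    """Fan an alert out to every handler registered for its rule.

    alert: dict with at least 'rule' and 'message' keys.
    handlers: {rule_name: [callables]} registry standing in for API
              integrations (e-mail, emergency systems, proprietary systems).
    Returns the number of handlers notified.
    """
    notified = 0
    for handler in handlers.get(alert["rule"], []):
        handler(alert)  # e.g., a wrapper that posts to an external API
        notified += 1
    return notified

# Usage: collect alerts in a list instead of calling a real API.
outbox = []
count = dispatch_alert(
    {"rule": "outside-boundary", "message": "device left the school area"},
    {"outside-boundary": [outbox.append]},
)
```

Registering a parent, a school administrator, and an emergency system for the same rule would simply mean adding more callables to the handler list for that rule.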
- FIG. 1 is a system including a drone capable of tracking humans based on devices and controlling the drone to be within line-of-sight of at least one of the devices, according to an example. The system 100 can include a drone 110 that communicates with devices 150 a-150 n. Communications can occur via a communication network (e.g., via network connections, via the Internet, using cell technologies, etc.), transmissions from the devices to and from the drone (e.g., using radio frequency transmissions), etc. In certain examples, the drone 110 includes components to perform aerial movement, sensors, and computing components. For example, a navigation engine 120 can be used to control movement of the drone 110. Further, the line-of-sight engine 122 can use sensor information to determine whether each of the devices 150 a-150 n associated with respective humans 152 a-152 n and/or the humans 152 a-152 n are within line-of-sight. Sensors can be used for various activities by the drone 110. -
FIG. 2 is a system 200 including drones capable of tracking humans based on devices and controlling the drone to be within line-of-sight of at least one of the devices, according to an example. In various examples, line-of-sight represents the visibility of a human to the drone. Line-of-sight can be defined by criteria used by the drone. The system 200 can include drone 110 as well as drones 210 a-210 m, the devices 150 associated with respective humans 152 a-152 n, a drone control device 160 to control one or more of the drones 110, 210, and a platform 170 that can provide services based on information provided by the drones 110, 210 and/or devices 150 a-150 n. In some examples, the drone 110 can also include rules 124, a location engine 126, sensors 128, a tracking engine 130, an alert engine 132, and aerial components 134. Moreover, the drone 110 may include at least one processor 230, memory 232, and input/output interfaces 234. - The
navigation engine 120 can control aerial movement of the drone by using aerial components. In some examples, aerial components 134 can include one or more motors to control moving parts of the drone. For example, the navigation engine 120 can control the motors to turn propellers to control drone movement. Various technologies can be used to implement drone motion (e.g., by creating aerodynamic forces to create lift of the drone). Examples of types of drones can include helicopter drones, quadcopter drones, plane drones, etc. Drones can also come in various sizes. For example, a drone may be a large size (e.g., more than 55 pounds), a smaller size (e.g., under 55 pounds) that may be regulated under different rules, an even smaller size (e.g., less than 4.4 pounds), etc. Size examples can vary and can be determined based on one or more regulations from a government entity (e.g., the Federal Aviation Administration). - The line-of-
sight engine 122 can be used to determine whether the drone is within line-of-sight with at least one of the devices 150 a-150 n within a physical proximity to a human 152 a-152 n. As used herein, physical proximity to a human means that the device is located on the human (e.g., a cell phone in a pocket, a smart device in a hat, a smart watch, a bracelet, smart glasses, etc.). In some examples, the devices are wearable devices. Wearable devices are clothing and accessories incorporating computer or advanced electronic technologies. In some examples, the devices 150 a-150 n are located external to the human. For example, a hat may include technology to allow the drone 110 to determine that there is line-of-sight with the human and/or the device 150. - In one example, the device 150 can include global positioning system (GPS) technology. The GPS technology can be used to determine a position of the device. The position can be provided to the
drone 110 either directly via input/output interfaces 234 or via a platform 170. In another example, the device 150 may include a locator (e.g., a beacon such as a radio beacon, an infrared beacon, Wi-Fi beacon, etc.). Drones 110, 210 can use the beacon to determine the position of the device 150 and/or human (e.g., via triangulation). In some examples, sensors from multiple drones can be used to triangulate the position of a human. Moreover, a combination of technologies can be used to determine whether the drone 110 has a line-of-sight with the respective devices 150 and/or the associated human 152. For example, the drone 110 may use the GPS or beacon technology to determine the general location of the device and/or human and then use a sensor 128 (e.g., an image sensor, an infrared sensor, etc.) to confirm that there is a line-of-sight between the drone 110 and the respective device and/or human. In some examples, an infrared beacon can be used to confirm line-of-sight. - In one example, the location of the device (e.g., based on GPS coordinates) can be used to determine a place for the
drone 110 to look for a human. In this example, the drone 110 can use a sensor 128, such as an optical sensor, to look for a human in that area. Recognition technology can be used to determine whether there is a human in that area. If so, then there is line-of-sight to that human. - In some examples, the device itself can be used for line-of-sight determination in one step. For example, the device can send out a beacon that can be received by the
drone 110 if there is line-of-sight (e.g., an infrared beacon). If the drone 110 receives the signal, then there is line-of-sight. These beacons can be configured to provide a signal that covers a particular range. As such, distance can be taken into account. In some examples, the line-of-sight determination can be based on line-of-sight based communications. - The tracking of the devices/humans can be used for dual purposes. The first purpose is to track the respective humans (e.g., to ensure the locations of the humans 152 are within particular parameters). The second purpose is to ensure that
drones 110 are within line-of-sight of at least one human. - In some examples, if the line-of-
sight engine 122 determines that there is no line-of-sight, the drone 110 can be controlled via the navigation engine 120 to become within line-of-sight of at least one human. Rules 124 can be used to specify when the drone 110 is not within line-of-sight. For example, the rules 124 can include a rule that specifies distance criteria such as a threshold distance between the drone 110 and the devices. As noted, the location of the devices 150/humans 152 can be determined. Also, the location of the drone 110 can be determined. In some examples, the location of the drone 110 can be maintained using sensors 128 (e.g., accelerometer, gyroscope, compass, GPS, cell tower tracking, other positioning systems, altimeter, thermal sensors, combinations thereof, etc.). The location engine 126 can determine the location of the drone and a location of the respective devices 150. The distance criteria can be customizable, indicating a distance that can be deemed to be associated with a lack of line-of-sight. In one example, the criteria can take into account a size of the drone 110 and a sighting capability of a human being (e.g., at a particular visual acuity). In another example, the criteria may also take into account dynamic visual conditions, such as weather. - If the criteria is satisfied, the particular human and associated device can be considered to possibly lack a line-of-sight with the
drone 110. Thus, a line-of-sight determination can be based on the criteria. Further, in some examples, the criteria can be used to determine a potential lack of line-of-sight and anothersensor 128 can be used to confirm a lack of line-of-sight. For example, the distance criteria can be used to determine whether there is a potential lack of line-of-sight and an image sensor, infrared sensor, etc. can be used to confirm a lack of line-of-sight or confirm that there is a line-of-sight. - In other examples, three-dimensional maps of the terrain can be used to determine whether the location a human is at has line-of-sight with the
drone 110 based on obstructions. This can be based on a land location of the user and any obstructions. In some examples, sensor data (e.g., image data, sonar data, etc.) taken by the drones 110, 210 can be used to determine the three-dimensional maps. In some examples, the three-dimensional map processing can be accomplished at the drones 110. In other examples, the three-dimensional map processing can be off-loaded to a platform 170, such as a cloud system or computing system. - In some examples,
rules 124 can include an action for the drone 110 to take based on what criteria have been fulfilled. In one example, the drone 110 can be caused to return to a line-of-sight of at least one of the devices 150 based on a determination of a potential lack of line-of-sight or a confirmed lack of line-of-sight. The rules 124 may further specify where the drone 110 is to go. For example, the drone 110 can be instructed to move to within a certain distance of one of the devices 150 (e.g., a closest one of the devices 150). In other examples, the drone 110 can have a pre-determined path to take, a dynamic path to take, a static area to patrol, a dynamic area to patrol (e.g., based on locations of the devices 150), etc. In one example, the drone 110 may be instructed to move to within a certain location or distance from one of the devices 150 within its future path. In some examples, the lack of line-of-sight is based on a determination that one of the devices is within the criteria and the other devices are already determined to lack the line-of-sight. - In one example, one of the
rules 124 can specify boundary criteria for the humans 152 and/or devices 150. The tracking engine 130 can determine whether the location of a device meets the boundary criteria. In some examples, the boundary criteria can include a set of location coordinates that can be mapped. The alert engine 132 can determine an alert based on criteria, such as the boundary criteria. If a respective device 150 is outside of the boundary or within a boundary threshold, an alert can be set. The alert can cause the drone 110 to go to the respective device 150. - An example of use of boundary criteria may be to monitor children at a school or playground. If the child moves past the boundary, particular actions can be taken by the
drone 110, such as thenavigation engine 120 moving thedrone 110 towards the child, sending an alert to a user (e.g., registered parent), sending causing video to start recording and target the child, etc. In a similar case, the devices 150 anddrones 110, 210 can be used to track inmates. - In another example, the
drones 110, 210 can be used for agriculture. The drones 110, 210 can be used to track worker movement while also performing other activities. For example, the drones can be used to spray pesticides, irrigate, etc. over a portion of a field. The devices 150 can be used to ensure that humans 152 are not in the field during spraying. Further, the line-of-sight engine 122 can be used to ensure that proper supervision of the drone occurs. In some examples, one or more of the drones 110, 210 can be controlled by a drone control device 160 (e.g., a remote control, a mobile device using an app, etc.). - The
rules 124 can be used to implement other functionality. For example, a rule 124 can specify conditions that show that a respective device 150 is not within proximity of an associated human 152. For example, the device 150 may include accelerometer information that can be sent to the drone 110 and/or platform 170. The accelerometer information can be compared to a profile or other function to determine whether anomalous behavior is present. One example of anomalous behavior includes no motion from the device 150. A device 150 located on a human 152 would show some motion (e.g., from breathing). Therefore, the lack of any movement could show that the human 152 is no longer associated with the device 150. The rule 124 can further specify that, in response to the condition occurring, the navigation engine 120 controls aerial movement of the drone 110 towards a location of the device 150 and/or human 152 based on the recognition that the device is not within proximity of the human 152. - Other rules can be implemented, for example, to ensure that the
drone 110, 210 meets government regulations. In one example, thedrone 110, 210 can include an altimeter and thedrone 110, 210 can have an altitude range to fly within. In another example, thedrone 110, 210 can includerules 124 to keep thedrones 110, 210 from flying directly overhead of a human.Rules 124 can be used in conjunction withsensors 128 to provide various navigation adjustments. In one example, theplatform 170 can import government regulations and base the rules based on the government regulations. - In some examples, the
platform 170 can be a computing system, such as a cloud computing system, that can be used to communicate (e.g., with the drone 110, with devices 150, with other devices, etc.) via a communication engine 174. Various functionality described herein as being performed by the drone 110 can be offloaded to the platform 170. For example, the location of the devices 150 a-150 n can be determined by the platform 170. Further, information about the drones 110, 210 can be sent to the platform 170 via a communication engine 174. The information can include locations, navigation programming, etc. Moreover, the information from the devices 150 can be sent to the communication engine 174. The information can include location of the devices, other sensor information (e.g., accelerometer information), etc. The platform 170 can perform data analytics on the information to determine whether one or more rules or alerts are triggered. If certain rules are triggered, the control engine 176 can send the drone 110 instructions to move accordingly (e.g., to a device location, within a boundary, etc.). - In other examples, if a rule is triggered to alert a user, an alert can be sent to the user. In one example, the subscription engine 172 can be used to register users to a database. The database can include devices 150 and/or humans 152 that the registered user is interested in. If an alert occurs, the user can be sent an alert. In the example of a school setting, the user can be an administrator at the school, a parent, etc. In the prison example, the user can be a warden, a prison guard, etc. Similar examples can be used for tracking others such as elderly people, disabled people, etc. Users can register devices 150 with humans 152 and an alert location (e.g., an email address, a phone number, etc.). When criteria associated with an alert are met, the subscription engine 172 can cause sending of an alert to the alert location.
Further, in some scenarios, other information can be provided such as a video feed of the human 152, control over a drone looking for the human, etc.
- As noted above, the alerts can be sent to an emergency alert system. As such, the communication engine 174 can use APIs to communicate the alerts to systems associated with the triggered rule. For example, in a rule context of a missing child, an API associated with an Amber Alert system can be used.
- With the approaches used herein,
multiple drones 110, 210 a-210 m can be used to monitor multiple humans 152 a-152 n. The drones 110, 210 can be coordinated via navigation engines and/or a platform 170 that can centralize control. Multiple drones 110, 210 can be used to track humans 152 associated with devices 150 (e.g., wearable devices) as well as to keep within line-of-sight of at least one of the humans 152. This can ensure that the drone is supervised while also ensuring that the humans are tracked. - A communication network can be used to connect one or more of the
drones 110, 210, devices 150,platform 170, other devices, etc. The communication network can use wired communications, wireless communications, or combinations thereof. Further, the communication network can include multiple sub communication networks such as data networks, wireless networks, telephony networks, etc. Such networks can include, for example, a public data network such as the Internet, local area networks (LANs), wide area networks (WANs), metropolitan area networks (MANs), cable networks, fiber optic networks, combinations thereof, or the like. In certain examples, wireless networks may include cellular networks, satellite communications, wireless LANs, etc. Further, the communication network can be in the form of a direct network link between devices. Various communications structures and infrastructure can be utilized to implement the communication network(s). Moreover, devices 150 anddrones 110, 210 can have multiple means of communication. - By way of example, devices can communicate with each other and other components with access to the communication network via a communication protocol or multiple protocols. A protocol can be a set of rules that defines how nodes of the communication network interact with other nodes. Further, communications between network nodes can be implemented by exchanging discrete packets of data or sending messages. Packets can include header information associated with a protocol (e.g., information on the location of the network node(s) to contact) as well as payload information.
- The engines described herein can include hardware and/or combinations of hardware and programming to perform the functionality provided. - A
processor 230, such as a central processing unit (CPU) or a microprocessor suitable for retrieval and execution of instructions and/or electronic circuits can be configured to perform the functionality of any of the engines described herein. In certain scenarios, instructions and/or other information, such as location information, registration information, etc., can be included in memory 232 or other memory. Input/output interfaces 234 may additionally be provided by the drone 110. Moreover, in certain embodiments, some components can be utilized to implement functionality of other components described herein. Input/output devices such as communication devices like network communication devices or wireless devices can also be considered devices capable of using the input/output interfaces 234. - Each of the modules may include, for example, hardware devices including electronic circuitry for implementing the functionality described herein. In addition or as an alternative, each module may be implemented as a series of instructions encoded on a machine-readable storage medium of a computing device and executable by a processor. It should be noted that, in some embodiments, some modules are implemented as hardware devices, while other modules are implemented as executable instructions.
-
FIG. 3 is a flowchart of a method for controlling a drone to be within a line-of-sight of a human based on a wearable device on the human, where the wearable device is tracked, according to an example. FIG. 4 is a block diagram of a drone capable of tracking humans based on wearable devices and controlling aerial movement of the drone to stay within a line-of-sight of at least one of the humans, according to an example. - Although execution of
method 300 is described below with reference to drone 400, other suitable components for execution of method 300 can be utilized (e.g., drones 110, 210). Additionally, the components for executing the method 300 may be spread among multiple devices (e.g., part of the functionality may be accomplished on the drone and part of the functionality may be offloaded to a cloud system). Method 300 may be implemented in the form of executable instructions stored on a machine-readable storage medium, such as storage medium 420, and/or in the form of electronic circuitry. - The
drone 400 includes, for example, a processor 410, and a machine-readable storage medium 420 including instructions 422, 424, 426 for controlling aerial movement of the drone 400 according to rules and information about wearable devices located on humans. -
Processor 410 may be at least one central processing unit (CPU), at least one semiconductor-based microprocessor, at least one graphics processing unit (GPU), other hardware devices suitable for retrieval and execution of instructions stored in machine-readable storage medium 420, or combinations thereof. For example, the processor 410 may include multiple cores on a chip, multiple cores across multiple chips, multiple cores across multiple devices (e.g., between the drone and a cloud system), or combinations thereof. Processor 410 may fetch, decode, and execute instructions 422, 424, 426. As an alternative or in addition to retrieving and executing instructions, processor 410 may include at least one integrated circuit (IC), other control logic, other electronic circuits, or combinations thereof that include a number of electronic components for performing the functionality of instructions 422, 424, 426. - Machine-
readable storage medium 420 may be any electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions. Thus, machine-readable storage medium may be, for example, Random Access Memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage drive, a Compact Disc Read Only Memory (CD-ROM), and the like. As such, the machine-readable storage medium can be non-transitory. As described in detail herein, machine-readable storage medium 420 may be encoded with a series of executable instructions for controlling aerial movement of a drone based on a location of a wearable device. - At 302,
aerial movement instructions 424 can be executed by processor 410 to control the drone 400 (e.g., by controlling aerial components of the drone). The drone 400 can be controlled using programmed instructions executed by the processor 410. For example, the drone 400 can be set to patrol an area, can be set to follow a pattern, can be set to dynamically alter the patrol or pattern based on conditions (e.g., movement of tracked devices, weather, etc.), or the like. Further, in some examples, the drone 400 can receive other control instructions from a control unit (e.g., a remote control, a remote application on a smart device, etc.). - The
drone 400 can be used to track humans using wearable devices. For example, line-of-sight instructions 422 can be executed by processor 410 to determine whether the drone 400 is within a line-of-sight of the wearable device and/or the respective humans (304). The drone 400 can use this information to determine whether the drone 400 is within line-of-sight of at least one of the humans. Further, the drone 400 can track the respective humans using the wearable devices. - At 306, the
drone 400 can determine that it is within a buffer distance from one of the wearable devices indicative of a possible lack of line-of-sight to the wearable device and that there is a lack of line-of-sight from the other wearable devices. As used herein, the term "possible lack of line-of-sight" means that the drone either is within the buffer distance of the device (and thus may soon lose line-of-sight) or already lacks line-of-sight to it. The determination can be according to a rule and sensor information. In one example, the lack of the line-of-sight of the other wearable devices can be based on location information received from the wearable devices, sensor data captured at the drone, combinations thereof, etc. In one example, a certain distance can be indicative of a lack of line-of-sight. In another example, the distance can be augmented by weather conditions (e.g., fog, cloudiness, etc.). In a further example, the lack of line-of-sight can be determined based on a visual or infrared sensor on the drone 400, laser communication between the drone 400 and wearable devices, etc. In one example, the wearable device is a head device, such as a cap or helmet. The head device can include a beacon that can be used to facilitate the line-of-sight determination. - The buffer distance is a threshold distance that is smaller in value than a distance indicative of a lack of line-of-sight. The buffer distance can be used to cause augmentation of the drone's path before the
drone 400 lacks line-of-sight to the one wearable device. As such, at 308, the aerial movement instructions 424 can be executed to move the drone 400 towards the wearable device. This ensures that at least one of the many wearable devices is within a line-of-sight of the drone. In some examples, government regulations may specify that a line-of-sight between a human and a drone be maintained. - In one example, one of the wearable devices can be selected based on a triggered rule by executing
selection instructions 426. As noted above, such triggers can include the wearer of the wearable device moving outside of a boundary, indications that the wearable device is no longer associated with the wearer, a stoppage of communication from the wearable device, etc. The processor 410 can determine a location of the wearable device (e.g., based on GPS coordinates, other location information, etc.). Aerial movement instructions 424 can be executed to cause the drone 400 to move towards the location. In some examples, the location can be updated and the drone can follow the wearable device. In one example, the wearable device is followed until a manual control signal is received from a control device (e.g., a remote control). - As noted above, alerts can be associated with rules. As such, in one example, when a rule is triggered, an alert can be sent. The alert can be sent to registered user devices.
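The buffer-distance logic described at 306 and 308 can be sketched as follows. This is a simplified illustration under assumed names (`possible_lack_of_line_of_sight`, `visibility_factor`); the patent leaves the concrete rule and distance values open:

```python
def los_threshold(base_threshold_m, visibility_factor):
    """Effective line-of-sight distance, reduced in poor weather
    (visibility_factor in (0, 1], e.g. 0.5 for fog)."""
    return base_threshold_m * visibility_factor

def possible_lack_of_line_of_sight(distance_m, base_threshold_m, buffer_m,
                                   visibility_factor=1.0):
    """True if the drone already lacks line-of-sight to the device,
    or is within the buffer distance of losing it."""
    threshold = los_threshold(base_threshold_m, visibility_factor)
    if distance_m >= threshold:
        return True  # actual lack of line-of-sight
    # within the buffer: line-of-sight still exists but may soon be lost
    return distance_m >= threshold - buffer_m
```

With a 100 m threshold and a 20 m buffer, a drone 90 m from the device would report a possible lack of line-of-sight and could begin moving towards it before line-of-sight is actually lost.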
-
FIG. 5 is a flowchart of a method for controlling a drone to be within a line-of-sight of a human based on a wearable device, where an alert is provided if a rule is triggered based on location information, according to an example. Although execution of method 500 is described below with reference to a computing system, other suitable components for execution of method 500 can be utilized (e.g., platform 170, other computing devices, cloud computing systems, etc.). Additionally, the components for executing the method 500 may be spread among multiple devices. Method 500 may be implemented in the form of executable instructions stored on a machine-readable storage medium, such as storage medium, and/or in the form of electronic circuitry. - As noted above, a drone can be configured to move throughout a static or dynamic area. The computing system can control the drone's motions (e.g., by sending programming for the drone to execute). At 502, the computing system receives location information about multiple wearable devices that are monitored by a drone via aerial monitoring. The wearable devices can be worn by the respective humans and thus be in physical proximity to the humans. The wearable devices can be registered to alerts. The alerts may include an identifier of the wearable device and/or a human associated with the wearable device, a rule associated with when the alert should occur, and contact information (e.g., an email address, a phone number, Internet address, etc.) to send the alert. As noted above, the alerts can be sent in response to registration for the alerts.
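The alert registration described above (a device identifier, a rule to evaluate, and contact information to notify) can be sketched as a simple registry. All names here are illustrative assumptions, not from the patent:

```python
def register_alert(registry, device_id, rule, contact):
    """Register an alert for a device: a rule callable plus contact info.
    The rule maps a location to (triggered?, message)."""
    registry.setdefault(device_id, []).append({"rule": rule, "contact": contact})

def evaluate_alerts(registry, device_id, location):
    """Return (contact, message) pairs for every rule triggered by the
    received location information."""
    notifications = []
    for entry in registry.get(device_id, []):
        triggered, message = entry["rule"](location)
        if triggered:
            notifications.append((entry["contact"], message))
    return notifications
```

For instance, a rule such as `lambda loc: (loc[0] > 100, "outside boundary")` registered with a contact address would produce a notification whenever the device's reported location crosses the boundary.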
- For example, a parent may be interested in tracking their child at school and can have an alert associated with when the child is within a particular distance to/from a school boundary, when the child is a certain distance from a teacher wearing another wearable device, etc. The drone can be controlled to stay within a line-of-sight of at least one of the wearable devices from multiple wearable devices (e.g., wearable devices respectively associated with particular children).
- At 504, the computing system determines that one of the wearable devices is acting according to a triggered rule based on the received location information. As noted above, various rules can be used. Further, alerts can be associated with the rules. As such, at 506, when a rule is triggered, the computing system can provide an associated alert for the rule (e.g., based on registration for the alert). For example, the alert can be based on a determination that the wearable device is outside of a boundary associated with the triggered rule based on the location information of the wearable device (e.g., notify a registered user that a child is outside of a school boundary).
- In another example, the triggered rule can alert a user that the drone is outside of a line-of-sight of at least one of the wearable devices or is within a buffer distance from at least one of the wearable devices that indicates a possible lack of line-of-sight from the set of wearable devices. In one example, the rule can indicate that if each of the wearable devices is out of the line-of-sight of the drone and/or within a threshold distance away (e.g., at a buffer range), the rule is triggered. The drone can then be controlled to move towards one of the wearable devices. The wearable device to move towards can be selected based on criteria (e.g., the closest wearable device to the drone, a close wearable device within a path the drone is on, etc.). The computing system can send a control command to cause the drone to move towards the selected wearable device.
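The closest-device selection criterion and the resulting movement command mentioned above can be sketched as follows, under assumed names and a planar coordinate model; the patent does not prescribe a particular distance metric or control interface:

```python
import math

def select_target_device(drone_pos, devices):
    """Pick the wearable device closest to the drone as the one to move
    towards when line-of-sight to every device is lost or at risk."""
    return min(devices, key=lambda d: math.hypot(d["pos"][0] - drone_pos[0],
                                                 d["pos"][1] - drone_pos[1]))

def step_toward(drone_pos, target_pos, max_step):
    """One bounded movement step towards the selected device's location."""
    dx, dy = target_pos[0] - drone_pos[0], target_pos[1] - drone_pos[1]
    dist = math.hypot(dx, dy)
    if dist <= max_step:
        return target_pos
    return (drone_pos[0] + dx * max_step / dist,
            drone_pos[1] + dy * max_step / dist)
```

Calling `step_toward` repeatedly with updated device locations yields the following behavior: the drone converges on, and then follows, the selected wearable device.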
Claims (20)
1. A computer-implemented method comprising:
receiving, by a computing system, subscription information associated with a device, wherein the device is registrable on a database based on the subscription information;
determining, by the computing system, that one or more drones are to be dispatched to the device;
determining, by the computing system, a location of the device; and
dispatching, by the computing system, the one or more drones to the location.
2. The computer-implemented method of claim 1, further comprising:
registering, by the computing system, a reference associated with the device in the database;
determining, by the computing system, that a criterion associated with an alert and the device is satisfied; and
causing, by the computing system, the alert to be sent to the reference.
3. The computer-implemented method of claim 2, wherein the criterion is a boundary criterion including a set of location coordinates, wherein the determining that the one or more drones are to be dispatched to the device comprises determining that the device is outside of a boundary associated with the set of location coordinates.
4. The computer-implemented method of claim 1, wherein the device comprises a global positioning system (GPS) sensor capable of providing a coordinate associated with the device, wherein the determining the location of the device comprises:
determining the location of the device based on the coordinate.
5. The computer-implemented method of claim 4, wherein the dispatching the one or more drones to the location comprises:
dispatching the one or more drones to the coordinate associated with the device; and
establishing a line-of-sight between the one or more drones and the device based on an infrared beacon associated with the device.
6. The computer-implemented method of claim 4, wherein the dispatching the one or more drones to the location comprises:
dispatching the one or more drones to the coordinate associated with the device;
searching for a human associated with the device at the location based on image data from at least one optical sensor of the one or more drones; and
establishing a line-of-sight to the human.
7. The computer-implemented method of claim 6, wherein the establishing a line-of-sight to the human comprises:
determining a three-dimensional map of the location, wherein the three-dimensional map is determined based on sensor data captured by the one or more drones; and
using the three-dimensional map to determine whether the one or more drones have the line-of-sight to the human.
8. The computer-implemented method of claim 1, wherein the device comprises a beacon, wherein the determining the location of the device comprises:
determining the location of the device based on location information about the beacon from one or more sensors of the one or more drones.
9. The computer-implemented method of claim 8, wherein the beacon is at least one of a radio beacon, an infrared beacon, or a Wi-Fi beacon, wherein the determining the location of the device further comprises:
using triangulation based on the location information about the beacon.
10. The computer-implemented method of claim 1, wherein the device is a wearable device worn by a wearer, wherein the determining that one or more drones are to be dispatched to the device comprises:
receiving an indication of anomalous behavior associated with the wearer, wherein the anomalous behavior is determined based on a comparison of sensor data from the wearable device to a profile or a function.
11. A system comprising:
at least one processor; and
a memory storing instructions that, when executed by the at least one processor, cause the system to perform a method comprising:
receiving subscription information associated with a device, wherein the device is registrable on a database based on the subscription information;
determining that one or more drones are to be dispatched to the device;
determining a location of the device; and
dispatching the one or more drones to the location.
12. The system of claim 11, wherein the instructions cause the system to perform the method further comprising:
registering a reference associated with the device in the database;
determining that a criterion associated with an alert and the device is satisfied; and
causing the alert to be sent to the reference.
13. The system of claim 12, wherein the criterion is a boundary criterion including a set of location coordinates, wherein the determining that the one or more drones are to be dispatched to the device comprises determining that the device is outside of a boundary associated with the set of location coordinates.
14. The system of claim 11, wherein the device comprises a global positioning system (GPS) sensor capable of providing a coordinate associated with the device, wherein the determining the location of the device comprises:
determining the location of the device based on the coordinate.
15. The system of claim 14, wherein the dispatching the one or more drones to the location comprises:
dispatching the one or more drones to the coordinate associated with the device; and
establishing a line-of-sight between the one or more drones and the device based on an infrared beacon associated with the device.
16. A non-transitory computer-readable storage medium including instructions that, when executed by at least one processor of a computing system, cause the computing system to perform a method comprising:
receiving subscription information associated with a device, wherein the device is registrable on a database based on the subscription information;
determining that one or more drones are to be dispatched to the device;
determining a location of the device; and
dispatching the one or more drones to the location.
17. The non-transitory computer-readable storage medium of claim 16, wherein the instructions cause the computing system to perform the method further comprising:
registering a reference associated with the device in the database;
determining that a criterion associated with an alert and the device is satisfied; and
causing the alert to be sent to the reference.
18. The non-transitory computer-readable storage medium of claim 17, wherein the criterion is a boundary criterion including a set of location coordinates, wherein the determining that the one or more drones are to be dispatched to the device comprises determining that the device is outside of a boundary associated with the set of location coordinates.
19. The non-transitory computer-readable storage medium of claim 16, wherein the device comprises a global positioning system (GPS) sensor capable of providing a coordinate associated with the device, wherein the determining the location of the device comprises:
determining the location of the device based on the coordinate.
20. The non-transitory computer-readable storage medium of claim 19, wherein the dispatching the one or more drones to the location comprises:
dispatching the one or more drones to the coordinate associated with the device; and
establishing a line-of-sight between the one or more drones and the device based on an infrared beacon associated with the device.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/985,576 US20210116918A1 (en) | 2015-06-24 | 2020-08-05 | Control aerial movement of drone based on line-of-sight of humans using devices |
US18/217,443 US20230350411A1 (en) | 2015-06-24 | 2023-06-30 | Control aerial movement of drone based on line-of-sight of humans using devices |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2015/037428 WO2016209225A1 (en) | 2015-06-24 | 2015-06-24 | Control aerial movement of drone based on line-of-sight of humans using devices |
US201715739589A | 2017-12-22 | 2017-12-22 | |
US16/985,576 US20210116918A1 (en) | 2015-06-24 | 2020-08-05 | Control aerial movement of drone based on line-of-sight of humans using devices |
Related Parent Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2015/037428 Continuation WO2016209225A1 (en) | 2015-06-24 | 2015-06-24 | Control aerial movement of drone based on line-of-sight of humans using devices |
US15/739,589 Continuation US10816976B2 (en) | 2015-06-24 | 2015-06-24 | Control aerial movement of drone based on line-of-sight of humans using devices |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/217,443 Continuation US20230350411A1 (en) | 2015-06-24 | 2023-06-30 | Control aerial movement of drone based on line-of-sight of humans using devices |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210116918A1 true US20210116918A1 (en) | 2021-04-22 |
Family
ID=57586158
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/739,589 Active 2036-06-08 US10816976B2 (en) | 2015-06-24 | 2015-06-24 | Control aerial movement of drone based on line-of-sight of humans using devices |
US16/985,576 Abandoned US20210116918A1 (en) | 2015-06-24 | 2020-08-05 | Control aerial movement of drone based on line-of-sight of humans using devices |
US18/217,443 Pending US20230350411A1 (en) | 2015-06-24 | 2023-06-30 | Control aerial movement of drone based on line-of-sight of humans using devices |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/739,589 Active 2036-06-08 US10816976B2 (en) | 2015-06-24 | 2015-06-24 | Control aerial movement of drone based on line-of-sight of humans using devices |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/217,443 Pending US20230350411A1 (en) | 2015-06-24 | 2023-06-30 | Control aerial movement of drone based on line-of-sight of humans using devices |
Country Status (3)
Country | Link |
---|---|
US (3) | US10816976B2 (en) |
EP (1) | EP3313731B1 (en) |
WO (1) | WO2016209225A1 (en) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180025649A1 (en) * | 2016-02-08 | 2018-01-25 | Unmanned Innovation Inc. | Unmanned aerial vehicle privacy controls |
US11772270B2 (en) | 2016-02-09 | 2023-10-03 | Cobalt Robotics Inc. | Inventory management by mobile robot |
US10265859B2 (en) | 2016-02-09 | 2019-04-23 | Cobalt Robotics Inc. | Mobile robot with removable fabric panels |
US10404369B2 (en) * | 2016-06-07 | 2019-09-03 | Siklu Communication ltd. | Systems and methods for using drones for determining line-of-sight conditions in wireless networks |
KR20180028701A (en) * | 2016-09-09 | 2018-03-19 | 엘지전자 주식회사 | Portable camera and method for controlling the same |
US11724399B2 (en) | 2017-02-06 | 2023-08-15 | Cobalt Robotics Inc. | Mobile robot with arm for elevator interactions |
WO2019126785A1 (en) * | 2017-12-21 | 2019-06-27 | Correnti Matthew Daniel | Monitoring system for securing networks from hacker drones |
JP2021144260A (en) * | 2018-06-15 | 2021-09-24 | ソニーグループ株式会社 | Information processing device, information processing method, program, and information processing system |
US11082667B2 (en) * | 2018-08-09 | 2021-08-03 | Cobalt Robotics Inc. | Contextual automated surveillance by a mobile robot |
US11460849B2 (en) | 2018-08-09 | 2022-10-04 | Cobalt Robotics Inc. | Automated route selection by a mobile robot |
US10796548B2 (en) * | 2018-12-28 | 2020-10-06 | Intel Corporation | Management of guardianship of an entity including via elastic boundaries |
US10980218B2 (en) | 2019-07-19 | 2021-04-20 | Sports Data Labs, Inc. | Unmanned aerial vehicle (UAV)-based system for collecting and distributing animal data for monitoring |
DE102019130804B4 (en) | 2019-11-14 | 2021-12-09 | Universität Stuttgart | Drone, method for operating a drone and electronic control and regulating device for controlling and regulating the operation of a drone |
CN112419666B (en) * | 2020-10-19 | 2022-06-14 | 钱继华 | Intelligent living service method and system based on lifting control |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100096490A1 (en) * | 2008-10-18 | 2010-04-22 | Kevin Patrick Gordon | Remote engine/electric helicopter industrial plat form |
US20100201829A1 (en) * | 2009-02-09 | 2010-08-12 | Andrzej Skoskiewicz | Camera aiming using an electronic positioning system for the target |
US20140120977A1 (en) * | 2012-10-25 | 2014-05-01 | David Amis | Methods and systems for providing multiple coordinated safety responses |
US9170117B1 (en) * | 2014-08-21 | 2015-10-27 | International Business Machines Corporation | Unmanned aerial vehicle navigation assistance |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6952646B2 (en) | 2003-05-14 | 2005-10-04 | Chang Industry, Inc. | Tracking device and associated system and method |
US20150054639A1 (en) * | 2006-08-11 | 2015-02-26 | Michael Rosen | Method and apparatus for detecting mobile phone usage |
US7970507B2 (en) | 2008-01-23 | 2011-06-28 | Honeywell International Inc. | Method and system for autonomous tracking of a mobile target by an unmanned aerial vehicle |
IL201681A (en) | 2009-10-22 | 2014-06-30 | Abraham Abershitz | Uav system and method |
US8857716B1 (en) * | 2011-02-21 | 2014-10-14 | Proxense, Llc | Implementation of a proximity-based system for object tracking and automatic application initialization |
US9841761B2 (en) | 2012-05-04 | 2017-12-12 | Aeryon Labs Inc. | System and method for controlling unmanned aerial vehicles |
US20150350614A1 (en) * | 2012-08-31 | 2015-12-03 | Brain Corporation | Apparatus and methods for tracking using aerial video |
US9529447B2 (en) * | 2013-01-18 | 2016-12-27 | Microsoft Technology Licensing, Llc | Removable input module |
US8989922B2 (en) * | 2013-03-15 | 2015-03-24 | Azure Sky Group, LLC. | Modular drone and methods for use |
US20150134143A1 (en) | 2013-10-04 | 2015-05-14 | Jim Willenborg | Novel tracking system using unmanned aerial vehicles |
US9754425B1 (en) * | 2014-02-21 | 2017-09-05 | Allstate Insurance Company | Vehicle telematics and account management |
US9269252B2 (en) * | 2014-04-17 | 2016-02-23 | Honeywell International Inc. | Man down detector |
US9335764B2 (en) * | 2014-05-27 | 2016-05-10 | Recreational Drone Event Systems, Llc | Virtual and augmented reality cockpit and operational control systems |
US9611038B2 (en) * | 2014-06-03 | 2017-04-04 | Working Drones, Inc. | Mobile computing device-based guidance navigation and control for unmanned aerial vehicles and robotic systems |
-
2015
- 2015-06-24 US US15/739,589 patent/US10816976B2/en active Active
- 2015-06-24 WO PCT/US2015/037428 patent/WO2016209225A1/en active Application Filing
- 2015-06-24 EP EP15896516.0A patent/EP3313731B1/en active Active
-
2020
- 2020-08-05 US US16/985,576 patent/US20210116918A1/en not_active Abandoned
-
2023
- 2023-06-30 US US18/217,443 patent/US20230350411A1/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100096490A1 (en) * | 2008-10-18 | 2010-04-22 | Kevin Patrick Gordon | Remote engine/electric helicopter industrial plat form |
US20100201829A1 (en) * | 2009-02-09 | 2010-08-12 | Andrzej Skoskiewicz | Camera aiming using an electronic positioning system for the target |
US20140120977A1 (en) * | 2012-10-25 | 2014-05-01 | David Amis | Methods and systems for providing multiple coordinated safety responses |
US9170117B1 (en) * | 2014-08-21 | 2015-10-27 | International Business Machines Corporation | Unmanned aerial vehicle navigation assistance |
Also Published As
Publication number | Publication date |
---|---|
EP3313731B1 (en) | 2021-05-05 |
WO2016209225A1 (en) | 2016-12-29 |
US20230350411A1 (en) | 2023-11-02 |
EP3313731A4 (en) | 2019-01-16 |
US10816976B2 (en) | 2020-10-27 |
EP3313731A1 (en) | 2018-05-02 |
US20180314251A1 (en) | 2018-11-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230350411A1 (en) | Control aerial movement of drone based on line-of-sight of humans using devices | |
US11810465B2 (en) | Flight control for flight-restricted regions | |
US20190317530A1 (en) | Systems and methods of unmanned aerial vehicle flight restriction for stationary and moving objects | |
US11294398B2 (en) | Personal security robotic vehicle | |
US20210287559A1 (en) | Device, system, and method for controlling unmanned aerial vehicle | |
JP6538852B2 (en) | Aircraft height limitation and control | |
KR101690502B1 (en) | System for tracking using drone | |
US10838415B2 (en) | Systems and methods for automatically customizing operation of a robotic vehicle | |
JP2018503194A (en) | Method and system for scheduling unmanned aircraft, unmanned aircraft | |
KR101917860B1 (en) | Method for optimal path search of drone, optimal path searching server and system | |
US20190147747A1 (en) | Remote Control of an Unmanned Aerial Vehicle | |
US20220027772A1 (en) | Systems and methods for determining predicted risk for a flight path of an unmanned aerial vehicle | |
US10937324B2 (en) | Orchestration in heterogeneous drone swarms | |
US20190310629A1 (en) | Control of robotic vehicles based on attention level of operator | |
US20190268720A1 (en) | System and method for indicating drones flying overhead | |
KR101840820B1 (en) | Method for generating 3-dementional augmented reality flight video | |
US20190310630A1 (en) | Control of robotic vehicles based on attention level of operator | |
JP6912518B2 (en) | Aircraft altitude limit and control | |
EP4327317A1 (en) | System infrastructure for manned vertical take-off and landing aerial vehicles | |
JP7351613B2 (en) | computer system | |
US20240019863A1 (en) | System for shared remote drone usage | |
Sharma et al. | An insight on UAV/drone autonomous navigation methods and applications: a review | |
KR20210156506A (en) | Multiple UAV control systems and control methods for ground control |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |