US10210905B2 - Remote controlled object macro and autopilot system - Google Patents
- Publication number
- US10210905B2 (application US15/394,391)
- Authority
- US
- United States
- Prior art keywords
- sequence
- uav
- controller inputs
- playback device
- controller
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/102—Programmed access in sequence to addressed parts of tracks of operating record carriers
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B64C39/024—Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0022—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the communication link
- B64C2201/127—
- B64C2201/141—
- B64C2201/146—
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U10/00—Type of UAV
- B64U10/10—Rotorcrafts
- B64U10/13—Flying platforms
- B64U10/14—Flying platforms with four distinct rotor axes, e.g. quadcopters
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/10—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/20—Remote controls
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U50/00—Propulsion; Power supply
- B64U50/30—Supply or distribution of electrical power
- B64U50/39—Battery swapping
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
Abstract
A flight path management system manages flight paths for an unmanned aerial vehicle (UAV). The flight path management system receives a sequence of controller inputs for the UAV and stores the sequence of controller inputs in a memory. The flight path management system accesses the memory and selects a section of the sequence of controller inputs corresponding to a time period. The flight path management system outputs the selected section to a playback device in real time over the length of the time period.
Description
This application claims the benefit of U.S. Provisional Patent Application Ser. No. 62/402,737 entitled “REMOTE CONTROLLED OBJECT MACRO AND AUTOPILOT SYSTEM”, filed Sep. 30, 2016. The aforementioned application is herein incorporated by reference in its entirety.
This application is related to U.S. Provisional Patent Application Ser. No. 62/402,752 entitled “STEERING ASSIST”, filed Sep. 30, 2016, and U.S. Provisional Patent Application Ser. No. 62/402,747 entitled “COLLISION DETECTION AND AVOIDANCE”, filed Sep. 30, 2016, both of which are incorporated herein by reference.
This application relates to remote controlled piloting, and more particularly to a system and method for remote controlled piloting aids for unmanned aerial vehicles.
An unmanned aerial vehicle (UAV), commonly known as a drone or quadcopter, among other names, is an aircraft without a human pilot aboard. UAVs may operate with various degrees of autonomy: under remote control by a human operator, or fully or intermittently autonomously under the control of onboard computers.
Drones may be used for racing games or competitions that consist of following a path, defined like a slalom by single or double gates or posts, to a finish line. In order to win the race, it is essential to go fast, and in order to save time, it is necessary to turn around the posts as closely as possible while conserving a maximum amount of kinetic energy, i.e., while traveling relatively fast. Outside of racing, drones can also perform advanced aerial maneuvers such as loops, split-S turns, Immelmann turns, barrel rolls, figure eights, etc.
With present drones, these maneuvers require a skilled and experienced user, because the mode of piloting requires several different controls to be used in combination.
Offering recording and playback functionality may be particularly helpful to drone pilots and spectators for learning and for entertainment. As drones have become capable of executing more maneuvers, controllers have incorporated more complex control options (e.g., multiple joysticks, Z-triggers, shoulder-triggers, motion tracking along the X-, Y-, and Z-axes, and the like). The use of the various control options alone, in combination, and sometimes in particular sequences has increased the complexity of executing certain maneuvers. It may be helpful for some users to have increased knowledge of how a particular task or maneuver was executed with respect to the various control options on a controller device.
Additionally, users who are not controlling a drone may desire to watch a drone being controlled by others as a spectator, for example, as entertainment, for learning, or for social reasons.
The following presents a simplified summary of one or more embodiments in order to provide a basic understanding of present technology. This summary is not an extensive overview of all contemplated embodiments of the present technology, and is intended to neither identify key or critical elements of all examples nor delineate the scope of any or all aspects of the present technology. Its sole purpose is to present some concepts of one or more examples in a simplified form as a prelude to the more detailed description that is presented later.
In accordance with one or more aspects of the examples described herein, systems and methods are provided for managing flight paths for an unmanned aerial vehicle (UAV).
In an aspect, a flight path management system manages flight paths for an unmanned aerial vehicle (UAV). The flight path management system receives a sequence of controller inputs for the UAV and stores the sequence of controller inputs in a memory. The flight path management system accesses the memory and selects a section of the sequence of controller inputs corresponding to a time period. The flight path management system outputs the selected section to a playback device in real time over the length of the time period.
In a second aspect, a system for managing flight paths by an unmanned aerial vehicle (UAV) includes a recording device configured to receive a sequence of controller inputs for the UAV. The system includes a memory coupled to the recording device and configured to store the sequence of controller inputs. The system includes a playback device configured to access the memory, select a section of the sequence of controller inputs corresponding to a time period, and output the selected section in real time over the length of the time period.
These and other sample aspects of the present technology will be described in the detailed description and the appended claims that follow, and in the accompanying drawings, wherein:
The subject disclosure provides techniques for managing flight paths for an unmanned aerial vehicle, in accordance with the subject technology. Various aspects of the present technology are described with reference to the drawings. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of one or more aspects. It will be evident, however, that the present technology can be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing these aspects. The word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any embodiment described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments.
The flight path management system 100 receives a sequence of controller inputs for a UAV 170. The controller inputs represent what a user flying the UAV enters into a controller device to fly the UAV along a particular flight path and perform one or more aerial maneuvers. The sequence of controller inputs can, for example, be a sequence of controller actions along with timestamps. The sequence of controller inputs can be recorded at a set time interval, such as every 0.1, 0.01, or 0.001 seconds. A shorter time interval gives a more accurate record of the controller inputs, but requires more storage and/or processing power.
In some implementations, the controller inputs are received from a controller device 120 that is used to control the UAV 170. The controller can include one or more analog or digital input options. Analog input options include flight stick or joystick controls, triggers, sliders, throttles, dials, etc. Digital input options include buttons, switches, etc. The sequence of controller inputs can, for example, include the angle and position of each joystick lever, each slider position, and whether or not each button on the controller device is depressed, for each increment of time recorded.
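As a concrete picture of such a record, the following minimal Python sketch samples a controller at a fixed interval and stores one timestamped snapshot per tick. The names (`ControllerSample`, `read_controller`) are illustrative assumptions; the patent itself specifies no code.

```python
import time
from dataclasses import dataclass, field

@dataclass
class ControllerSample:
    """One timestamped snapshot of every control on the controller device."""
    timestamp: float    # seconds since recording started
    left_stick: tuple   # (x, y) deflection, each in [-1.0, 1.0]
    right_stick: tuple  # (x, y) deflection, each in [-1.0, 1.0]
    slider: float       # slider position in [0.0, 1.0]
    buttons: dict = field(default_factory=dict)  # button name -> pressed (bool)

def record_inputs(read_controller, duration_s, interval_s=0.01):
    """Poll the controller every interval_s seconds and return the sequence.

    read_controller is an assumed callback returning the current control
    state as a dict of ControllerSample fields (minus the timestamp).
    """
    sequence = []
    start = time.monotonic()
    while (now := time.monotonic() - start) < duration_s:
        sequence.append(ControllerSample(timestamp=now, **read_controller()))
        time.sleep(interval_s)  # shorter interval: finer record, more storage
    return sequence
```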
In some implementations, the controller inputs are downloaded from an online database 140. The online database 140 can be a central repository where UAV pilots upload their flight paths and maneuvers, represented as sequences of controller inputs. The sequences of controller inputs can be organized into various categories for ease of retrieval. For example, various sequences of controller inputs used to fly a barrel roll can be placed in a barrel roll category. In another example, various sequences of controller inputs used for a particular racing course can be placed in a category for that particular racing course.
In some implementations, the flight path management system 100 receives sensor data for the UAV from one or more movement sensors 130 as the UAV flies along a flight path and performs maneuvers. The movement sensors can include at least one of a camera, distance sensors, a Light Detection and Ranging (LiDAR) sensor, or other such sensors. The computing device 110 determines controller inputs from the movement sensor data using, for example, a reverse engineering process. That is, the movement sensors 130 allow the computing device 110 to calculate the controller inputs required to move the UAV along the flight path and to perform the maneuvers based on the UAV's position, speed, and/or other sensed data.
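One way to picture that reverse engineering step: given the sensed state at each tick, estimate the normalized stick commands that would have produced the observed motion. The sketch below assumes a simple rate-command model with known command limits; the model, field names, and limit values are illustrative assumptions, not details from the patent.

```python
def infer_inputs(states, interval_s=0.01, max_yaw_rate=90.0, max_climb_accel=10.0):
    """Estimate normalized stick commands from successive sensed UAV states.

    states is a list of dicts with 'yaw' (degrees) and 'vz' (m/s), one per
    interval; the max_* arguments are the UAV's assumed command limits.
    """
    clamp = lambda v: max(-1.0, min(1.0, v))
    inferred = []
    for prev, cur in zip(states, states[1:]):
        yaw_rate = (cur["yaw"] - prev["yaw"]) / interval_s   # deg/s achieved
        climb_accel = (cur["vz"] - prev["vz"]) / interval_s  # m/s^2 achieved
        inferred.append({
            "yaw_stick": clamp(yaw_rate / max_yaw_rate),
            "throttle_stick": clamp(climb_accel / max_climb_accel),
        })
    return inferred
```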
The computing device 110 can be a personal computer, a laptop, a smartphone, a tablet, a server, a mobile device, a game console system, or other such device. The computing device includes a processor 102, a graphics processing unit (GPU) 104, and a memory 106. It is understood that one or more of the processor 102, the GPU 104, or the memory 106 can be integrated into a single unit or implemented as separate connected units.
The sequence of controller inputs is stored in the memory 106 of the computing device 110 and is then accessed from the memory 106 by the processor 102. A user can select a section of the sequence of controller inputs that is of interest. For example, a user may only be interested in the flight path through a particular corner of a drone racing course and correspondingly select the section for that particular corner. In another example, the user is interested in the entire flight path of a race, and thus the selected section is the entirety of the received sequence of controller inputs. The processor 102 accesses the memory 106 to retrieve the selected section of the sequence of controller inputs.
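Selecting a section that corresponds to a time period can be as simple as filtering the recorded samples by timestamp, as in this minimal sketch (reusing the hypothetical `ControllerSample` record from the earlier sketch):

```python
def select_section(sequence, t_start, t_end):
    """Return the samples whose timestamps fall within [t_start, t_end] seconds."""
    return [s for s in sequence if t_start <= s.timestamp <= t_end]

# e.g., given a recorded `sequence`, pull out only the inputs flown
# through one corner of a racing course, 12.5 s to 17.0 s into the run
corner_section = select_section(sequence, t_start=12.5, t_end=17.0)
```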
The GPU 104 can output a graphical representation of the controller inputs. For example, a playback device can display a two-dimensional or three-dimensional virtual model of a controller device that mimics the look and control movements of the physical controller device 120. The playback device can show the joystick levers, buttons, sliders, etc. of the virtual model moving according to the controller inputs, just as the physical controller device 120 moved. This allows a user viewing the playback device to learn the proper controller inputs to make the UAV 170 follow the flight path and perform the same maneuvers.
The GPU 104 can also or alternatively output a graphical representation of the UAV flight path corresponding to the selected section. For example, a playback device can display a trace of the flight path in a virtual space.
The GPU 104 can also or alternatively output a video playback of the UAV for the selected section in a first-person view, a third-person view, or a static camera view. The user can watch and rewatch a graphical representation of the UAV flying the flight path corresponding to the selected section.
In some implementations, the playback device outputs the selected section of the sequence of controller inputs in real time, while the flight path management system 100 is concurrently receiving the sequence of controller inputs. For example, the sequence of controller inputs can be livestreamed to spectators connected to the flight path management system 100 over a wide area network (WAN) such as the Internet or over a local area network (LAN).
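Outputting the section “in real time over a length of the time period” implies pacing the replay by the recorded timestamps rather than emitting samples as fast as possible. A minimal sketch of such a paced loop follows; the `emit` callback is an assumption standing in for whatever the playback device consumes (a render call for a virtual controller, or a command sender for a self-piloting UAV).

```python
import time

def play_back(section, emit):
    """Replay a recorded section so it spans its original length of time."""
    if not section:
        return
    start = time.monotonic()
    t0 = section[0].timestamp
    for sample in section:
        # Sleep until this sample's offset from the section start has elapsed.
        delay = (sample.timestamp - t0) - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        emit(sample)
```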
In some implementations, the playback device is a head-mounted display (HMD), such as a virtual reality (VR) HMD or an augmented reality (AR) HMD. The HMD allows a user to adjust the camera viewing angle or position in a virtual space by moving or rotating his or her head in physical space.
The flight path management system 100 can include a graphical user interface (GUI) for use with the playback device to allow a user to adjust camera angles and positioning during playback of the UAV flying the received flight path. The GUI can include a virtual camera with pan (also referred to as yaw), tilt (also referred to as pitch), and zoom controls to select or change a camera view. The view selection interface allows the user to select the camera view for the playback, defined by view parameters, shown by the virtual camera. In some implementations, the user can adjust a camera view shown by a preview of a still image at the selected camera view. In some implementations, the user can drag a cursor (e.g., with a mouse, joystick, or touchscreen) across a camera viewing plane to adjust or rotate (e.g., pan, tilt, or roll) the camera orientation of a panoramic video displayed by the playback device.
For example, selecting and dragging the cursor along the X and Y axes of a two-dimensional screen can rotate the camera orientation around the X and Y rotational axes in three-dimensional space. The graphical user interface can calculate the amount of rotation of the camera orientation based on the distance the cursor travels on the two-dimensional screen. In some implementations, the user can zoom in or out of the camera view using at least one of a keyboard, mouse, touchscreen, trackball, joystick, or other input device. In some implementations, the graphical user interface can include text boxes for a user to manually enter the orientation by typing in the desired Euler angles.
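A minimal sketch of that drag-to-rotate mapping, assuming a simple linear gain between cursor travel and rotation (the gain value is an illustrative assumption):

```python
def drag_to_rotation(dx_px, dy_px, degrees_per_px=0.2):
    """Map a cursor drag on the 2D screen to a camera pan/tilt rotation.

    Dragging along screen X pans (yaws) the camera; dragging along screen Y
    tilts (pitches) it, proportionally to the drag distance.
    """
    pan_deg = dx_px * degrees_per_px
    tilt_deg = -dy_px * degrees_per_px  # screen Y grows downward
    return pan_deg, tilt_deg

# dragging 100 px right and 50 px up pans +20 degrees and tilts +10 degrees
print(drag_to_rotation(100, -50))  # (20.0, 10.0)
```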
The graphical user interface creates a three-dimensional mesh representing a virtual world and places the virtual camera at the center of the three-dimensional mesh. The selected camera view corresponds to a portion of the three-dimensional mesh space. For example, the three-dimensional mesh can be a sphere or a variety of other shapes, depending on how the camera frames were stitched together. The shape is used by an algorithm to un-distort the stitched camera frame.
A mesh is a collection of vertices, edges, and faces that defines the shape of a polyhedral object for use in three-dimensional modeling. The faces usually include triangles, quadrilaterals, or other simple convex polygons, but can also include more general concave polygons or polygons with holes. A vertex is a position (usually in 3D space) along with other information such as color, normal vector, and texture coordinates. An edge is a connection between two vertices. A face is a closed set of edges (e.g., a triangle face has three edges and a quad face has four edges).
Polygon meshes may be represented in a variety of ways, using different methods to store the vertex, edge, and face data. Examples of polygon mesh representations include face-vertex meshes, winged-edge meshes, half-edge meshes, quad-edge meshes, corner-table meshes, and vertex-vertex meshes.
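As an illustration of a face-vertex representation in this setting, the sketch below builds a sphere as a vertex list plus quad faces indexing into it; this is the kind of mesh a stitched panoramic frame could be texture-mapped onto, with the virtual camera at the origin. All parameters are illustrative.

```python
import math

def uv_sphere(n_lat=16, n_lon=32, radius=1.0):
    """Build a face-vertex mesh of a sphere: vertices plus quad faces."""
    vertices = []
    for i in range(n_lat + 1):
        theta = math.pi * i / n_lat          # latitude angle, 0..pi
        for j in range(n_lon):
            phi = 2.0 * math.pi * j / n_lon  # longitude angle, 0..2*pi
            vertices.append((radius * math.sin(theta) * math.cos(phi),
                             radius * math.cos(theta),
                             radius * math.sin(theta) * math.sin(phi)))
    faces = []  # each face: four vertex indices (a quad face has four edges)
    for i in range(n_lat):
        for j in range(n_lon):
            a = i * n_lon + j
            b = i * n_lon + (j + 1) % n_lon
            faces.append((a, b, b + n_lon, a + n_lon))
    return vertices, faces  # quads at the poles degenerate; fine for a sketch
```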
The graphical user interface uses a graphics pipeline or rendering pipeline to create a two-dimensional representation of a three-dimensional scene using the three-dimensional mesh.
In some embodiments, UAV 200 has main body 210 with one or more arms 240. The proximal end of arm 240 can attach to main body 210 while the distal end of arm 240 can secure motor 250. Arms 240 can be secured to main body 210 in an “X” configuration, an “H” configuration, a “T” configuration, or any other configuration as appropriate. The number of motors 250 can vary; for example, there can be three motors 250 (e.g., a “tricopter”), four motors 250 (e.g., a “quadcopter”), eight motors 250 (e.g., an “octocopter”), etc.
In some embodiments, each motor 250 rotates (i.e., the drive shaft of motor 250 spins) about parallel axes. For example, the thrust provided by all propellers 255 can be in the Z direction. Alternatively, a motor 250 can rotate about an axis that is perpendicular (or at any angle that is not parallel) to the axis of rotation of another motor 250. For example, two motors 250 can be oriented to provide thrust in the Z direction (e.g., to be used in takeoff and landing) while two motors 250 can be oriented to provide thrust in the X direction (e.g., for normal flight). In some embodiments, UAV 200 can dynamically adjust the orientation of one or more of its motors 250 for vectored thrust.
In some embodiments, the rotation of motors 250 can be configured to create or minimize gyroscopic forces. For example, if there are an even number of motors 250, then half of the motors can be configured to rotate counter-clockwise while the other half can be configured to rotate clockwise. Alternating the placement of clockwise and counter-clockwise motors can increase stability and enable UAV 200 to rotate about the z-axis by providing more power to one set of motors 250 (e.g., those that rotate clockwise) while providing less power to the remaining motors (e.g., those that rotate counter-clockwise).
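A common way to express that power biasing is a mixer that combines throttle, roll, pitch, and yaw commands into per-motor outputs. Below is a minimal sketch for an “X” quadcopter; the sign conventions and clamping range are illustrative assumptions, not values from the patent.

```python
def quad_mix(throttle, roll, pitch, yaw):
    """Mix normalized commands into four motor outputs for an 'X' quadcopter.

    Motors: front-left and rear-right spin clockwise, front-right and
    rear-left counter-clockwise, so biasing power toward one spin
    direction produces a net yaw torque.
    """
    m_fl = throttle + roll + pitch - yaw  # front-left  (CW)
    m_fr = throttle - roll + pitch + yaw  # front-right (CCW)
    m_rr = throttle - roll - pitch - yaw  # rear-right  (CW)
    m_rl = throttle + roll - pitch + yaw  # rear-left   (CCW)
    # clamp each output to the valid command range
    return [max(0.0, min(1.0, m)) for m in (m_fl, m_fr, m_rr, m_rl)]
```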
In some embodiments, motor 250 is a brushless motor and can be connected to electronic speed controller 245. Electronic speed controller 245 can determine the orientation of magnets attached to a drive shaft within motor 250 and, based on the orientation, power electromagnets within motor 250. For example, electronic speed controller 245 can have three wires connected to motor 250, and electronic speed controller 245 can provide three phases of power to the electromagnets to spin the drive shaft in motor 250. Electronic speed controller 245 can determine the orientation of the drive shaft based on back-EMF on the wires or by directly sensing the position of the drive shaft.
In some embodiments, transceiver 265 can also transmit data to a control unit. Transceiver 265 can communicate with the control unit using lasers, light, ultrasonic, infrared, Bluetooth, 802.11x, or similar communication methods, including a combination of methods. Transceiver 265 can communicate with multiple control units at a time.
In some embodiments, an environmental awareness system can take inputs from position sensors 235, environmental awareness sensors, and databases (e.g., a predefined mapping of a region) to determine the location of UAV 200, obstacles, and pathways. In some embodiments, this environmental awareness system is located entirely on UAV 200; alternatively, some data processing can be performed external to UAV 200.
A gimbal and dampeners can help stabilize camera 205 and remove erratic rotations and translations of UAV 200. For example, a three-axis gimbal can have three stepper motors that are positioned based on a gyroscope reading in order to prevent erratic spinning and/or keep camera 205 level with the ground.
In some embodiments, battery 270 is a multi-cell (e.g., 2S, 3S, 4S, etc.) lithium polymer battery. Battery 270 can also be a lithium-ion, lead-acid, nickel-cadmium, or alkaline battery. Other battery types and variants can be used as known in the art. Battery 270 can include a temperature sensor for overload prevention. For example, when charging, the rate of charge can be thermally limited: the rate decreases if the temperature exceeds a certain threshold. Similarly, the power delivery at electronic speed controllers 245 can be thermally limited, providing less power when the temperature exceeds a certain threshold. Battery 270 can also include a charging and voltage protection circuit to safely charge battery 270 and prevent its voltage from going above or below a certain range. In addition or as an alternative to battery 270, other energy sources can be used. For example, UAV 200 can use solar panels, wireless power transfer, a tethered power cable (e.g., from a ground station or another UAV 200), etc. In some embodiments, the other energy source can be utilized to charge battery 270 while in flight or on the ground.
After calculating current flight characteristics, target flight characteristics, and response characteristics, flight controller 230 can calculate optimized control signals to achieve the target flight characteristics. Various control systems can be implemented during these calculations. For example, a proportional-integral-derivative (PID) controller can be used. In some embodiments, an open-loop control system (i.e., one that ignores current flight characteristics) can be used. In some embodiments, some of the functions of flight controller 230 are performed by a system external to UAV 200. For example, current flight characteristics can be sent to a server that returns the optimized control signals. Flight controller 230 can send the optimized control signals to electronic speed controllers 245 to control UAV 200.
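A minimal sketch of the PID option, driving one flight characteristic toward its target value; the gains and loop rate shown are illustrative, not values from the patent.

```python
class PID:
    """Minimal proportional-integral-derivative controller."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, target, current, dt):
        error = target - current
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# e.g., correcting altitude once per 10 ms control tick
altitude_pid = PID(kp=1.2, ki=0.1, kd=0.4)
control_signal = altitude_pid.update(target=10.0, current=9.4, dt=0.01)
```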
In some embodiments, UAV 200 has various outputs that are not part of the flight control system. For example, UAV 200 can have a loudspeaker for communicating with people or other UAVs 200. Similarly, UAV 200 can have a flashlight or laser. The laser can be used to “tag” another UAV 200.
In some implementations, the flight path management system senses movement of the UAV, by at least one sensor, and determines controller inputs that correspond to the movement of the UAV.
At step 520, the flight path management system stores the sequence of controller inputs in a memory.
At step 530, the flight path management system accesses the memory and selects a selected section of the sequence of controller inputs corresponding to a time period.
At step 540, the flight path management system outputs the selected section to a playback device in real time over a length of the time period. In some implementations, the playback device is a self-piloting UAV and outputting comprises controlling the UAV to fly according to the selected section of the sequence of controller inputs. In some implementations, outputting comprises displaying a graphical representation of controller inputs, corresponding to the selected section, on a virtual model of a controller device. In some implementations, outputting comprises displaying a first-person view playback of the UAV, corresponding to the selected section, on the playback device. In some implementations, outputting comprises displaying a third-person view playback of the UAV, corresponding to the selected section, on the playback device. In some implementations, outputting comprises displaying a trace of a flight path of the UAV, corresponding to the selected section, on the playback device. In some implementations, the playback device comprises a head-mounted display (HMD). In some implementations, outputting the selected section occurs concurrently with receiving the sequence of controller inputs.
In some implementations, the flight path management system moves one or more levers of a flight stick, wherein the controller inputs corresponding to the movement match the selected section of the sequence of controller inputs, and wherein the playback device is the flight stick.
Stages of the graphics pipeline include creating a scene out of geometric primitives (i.e., simple geometric objects such as points or straight line segments). Traditionally this is done using triangles, which are particularly well suited to the task because they always exist on a single plane. The graphics pipeline transforms from a local coordinate system to a three-dimensional world coordinate system: a model of an object in the abstract is placed in the coordinate system of the three-dimensional world. The graphics pipeline then transforms the three-dimensional world coordinate system into a three-dimensional camera coordinate system, with the camera as the origin.
The graphics pipeline then applies lighting, illuminating the scene according to lighting and reflectance. The graphics pipeline then performs a projection transformation to transform the three-dimensional world coordinates into the two-dimensional view of a two-dimensional camera. In the case of a perspective projection, objects that are distant from the camera are made smaller. Geometric primitives that now fall completely outside of the viewing frustum will not be visible and are discarded.
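The shrinking of distant objects falls directly out of the projection math: camera-space coordinates are divided by depth. A minimal pinhole-projection sketch (the field of view and conventions are illustrative):

```python
import math

def perspective_project(x_cam, y_cam, z_cam, fov_deg=90.0):
    """Project a 3D point in camera coordinates onto the 2D image plane."""
    f = 1.0 / math.tan(math.radians(fov_deg) / 2.0)  # focal scale from the field of view
    if z_cam <= 0:
        return None  # behind the camera: outside the frustum, discarded
    return (f * x_cam / z_cam, f * y_cam / z_cam)

# the same lateral offset appears half as large at twice the distance
print(perspective_project(1.0, 0.0, 2.0))  # (0.5, 0.0) at fov 90 degrees
print(perspective_project(1.0, 0.0, 4.0))  # (0.25, 0.0)
```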
Next the graphics pipeline performs rasterization. Rasterization is the process by which the two-dimensional image-space representation of the scene is converted into raster format and the correct resulting pixel values are determined. From this point on, operations are carried out on each single pixel. This stage is rather complex, involving multiple steps often referred to as a group under the name of the pixel pipeline.
In some implementations, the graphical user interface performs projecting and rasterizing based on the virtual camera. For example, each pixel in the image can be determined during the rasterizing to create the two-dimensional image. This two-dimensional image can then be displayed.
Lastly, the graphics pipeline assigns individual fragments (or pre-pixels) a color based on values interpolated from the vertices during rasterization, from a texture in memory, or from a shader program. A shader program calculates appropriate levels of color within an image, produces special effects, and performs video post-processing. Shader programs calculate rendering effects on graphics hardware with a high degree of flexibility. Most shader programs execute on a graphics processing unit (GPU).
When a stitched-together camera frame is mapped over the three-dimensional mesh, the graphics pipeline can interpolate between vertices of the mesh. The number of vertices of the mesh is a factor in how well an image can be rendered: a higher number of vertices can provide a better image rendering, but is more time consuming for computer hardware to render. Each vertex can be represented as a three-dimensional coordinate with X, Y, and Z parameters.
Interpolation is the filling in of frames between the key frames. It typically calculates the in-between frames through the use of (usually piecewise polynomial) interpolation to draw images semi-automatically. Interpolation gives the appearance that a first frame evolves smoothly into a second frame.
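A minimal sketch of the idea using linear blending between two key frames; real systems typically use the higher-order piecewise polynomials mentioned above for smoother motion.

```python
def interpolate_frames(key_a, key_b, n_between):
    """Generate n_between in-between frames from two key frames.

    key_a and key_b are equal-length lists of per-vertex (or per-pixel)
    values; each in-between frame is a linear blend of the two.
    """
    frames = []
    for k in range(1, n_between + 1):
        t = k / (n_between + 1)  # blend factor strictly between 0 and 1
        frames.append([(1.0 - t) * a + t * b for a, b in zip(key_a, key_b)])
    return frames

# three in-between frames between values 0.0 and 1.0: 0.25, 0.5, 0.75
print(interpolate_frames([0.0], [1.0], 3))
```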
In the example graphics pipeline shown in FIG. 6, at step 610 untransformed model vertices are stored in vertex memory buffers. At step 620, geometric primitives, including points, lines, triangles, and polygons, are referenced in the vertex data with index buffers. At step 630, tessellation converts higher-order primitives, displacement maps, and mesh patches to vertex locations and stores those locations in vertex buffers. At step 640, transformations are applied to vertices stored in the vertex buffer. At step 650, clipping, back-face culling, attribute evaluation, and rasterization are applied to the transformed vertices. At step 660, pixel shader operations use geometry data to modify input vertex and texture data, yielding output pixel color values. At step 670, texture coordinates are supplied. At step 680, texture level-of-detail filtering is applied to input texture values. At step 690, final rendering processes modify pixel color values with alpha, depth, or stencil testing, or by applying alpha blending or fog.
The processing system 700 can be, for example, a server (e.g., one of many rack servers in a data center) or a personal computer. The processor (e.g., central processing unit (CPU)) 740 can be a chip on a motherboard that can retrieve and execute programming instructions stored in the memory 720. The processor 740 can be a single CPU with a single processing core, a single CPU with multiple processing cores, or multiple CPUs. One or more buses (not shown) can transmit instructions and application data between various computer components such as the processor 740, memory 720, storage 730, and networking interface 750.
The memory 720 can include any physical device used to temporarily or permanently store data or programs, such as various forms of random-access memory (RAM). The storage 730 can include any physical device for non-volatile data storage, such as an HDD or a flash drive. The storage 730 can have a greater capacity than the memory 720 and can be more economical per unit of storage, but can also have slower transfer rates.
The BIOS 710 can include a Basic Input/Output System or its successors or equivalents, such as an Extensible Firmware Interface (EFI) or Unified Extensible Firmware Interface (UEFI). The BIOS 710 can include a BIOS chip located on a motherboard of the processing system 700 storing a BIOS software program. The BIOS 710 can store firmware executed when the computer system is first powered on along with a set of configurations specified for the BIOS 710. The BIOS firmware and BIOS configurations can be stored in a non-volatile memory (e.g., NVRAM) 712 or a ROM such as flash memory. Flash memory is a non-volatile computer storage medium that can be electronically erased and reprogrammed.
The BIOS 710 can be loaded and executed as a sequence program each time the processing system 700 is started. The BIOS 710 can recognize, initialize, and test hardware present in a given computing system based on the set of configurations. The BIOS 710 can perform a self-test, such as a Power-On Self-Test (POST), on the processing system 700. This self-test can test the functionality of various hardware components such as hard disk drives, optical reading devices, cooling devices, memory modules, expansion cards, and the like. The BIOS can address and allocate an area in the memory 720 in which to store an operating system (OS). The BIOS 710 can then give control of the computer system to the OS.
The BIOS 710 of the processing system 700 can include a BIOS configuration that defines how the BIOS 710 controls various hardware components in the processing system 700. The BIOS configuration can determine the order in which the various hardware components in the processing system 700 are started. The BIOS 710 can provide an interface (e.g., BIOS setup utility) that allows a variety of different parameters to be set, which can be different from parameters in a BIOS default configuration. For example, a user (e.g., an administrator) can use the BIOS 710 to specify clock and bus speeds, specify what peripherals are attached to the computer system, specify monitoring of health (e.g., fan speeds and CPU temperature limits), and specify a variety of other parameters that affect overall performance and power usage of the computer system.
The management controller 780 can be a specialized microcontroller embedded on the motherboard of the computer system. For example, the management controller 780 can be a baseboard management controller (BMC) or a rack management controller (RMC). The management controller 780 can manage the interface between system management software and platform hardware. Different types of sensors built into the computer system can report to the management controller 780 on parameters such as temperature, cooling fan speeds, power status, operating system status, etc. The management controller 780 can monitor the sensors and can send alerts to an administrator via the network interface 750 if any of the parameters do not stay within preset limits, indicating a potential failure of the system. The administrator can also remotely communicate with the management controller 780 to take corrective action, such as resetting or power cycling the system to restore functionality.
The northbridge 760 can be a chip on the motherboard that can be directly connected to the processor 740 or can be integrated into the processor 740. In some instances, the northbridge 760 and the southbridge 770 can be combined into a single die. The northbridge 760 and the southbridge 770, manage communications between the processor 740 and other parts of the motherboard. The northbridge 760 can manage tasks that require higher performance than the southbridge 770. The northbridge 760 can manage communications between the processor 740, the memory 720, and video controllers (not shown). In some instances, the northbridge 760 can include a video controller.
The southbridge 770 can be a chip on the motherboard connected to the northbridge 760, but unlike the northbridge 760, it is not directly connected to the processor 740. The southbridge 770 can manage input/output functions (e.g., audio functions, BIOS, Universal Serial Bus (USB), Serial Advanced Technology Attachment (SATA), Peripheral Component Interconnect (PCI) bus, PCI eXtended (PCI-X) bus, PCI Express bus, Industry Standard Architecture (ISA) bus, Serial Peripheral Interface (SPI) bus, Enhanced Serial Peripheral Interface (eSPI) bus, System Management Bus (SMBus), etc.) of the processing system 700. The southbridge 770 can be connected to or can include within it the management controller 780, Direct Memory Access (DMA) controllers, Programmable Interrupt Controllers (PICs), and a real-time clock.
The input device 702 can be at least one of a game controller, a joystick, a mouse, a keyboard, a touchscreen, a trackpad, or other similar control device. The input device 702 allows a user to provide input data to the processing system 700.
The display device 704 can be at least one of a monitor, a light-emitting diode (LED) screen, a liquid crystal display (LCD) screen, a head-mounted display (HMD), a virtual reality (VR) display, an augmented reality (AR) display, or other such output device. The display device 704 allows the processing system 700 to output visual information to a user.
The various illustrative logical blocks, modules, and circuits described in connection with the disclosure herein can be implemented or performed with a general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor can be a microprocessor, but in the alternative, the processor can be any conventional processor, controller, microcontroller, or state machine. A processor can also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The operations of a method or algorithm described in connection with the disclosure herein can be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor. The processor and the storage medium can reside in an ASIC. The ASIC can reside in a user terminal. In the alternative, the processor and the storage medium can reside as discrete components in a user terminal.
In one or more exemplary designs, the functions described can be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions can be stored on or transmitted over as one or more instructions or code on a non-transitory computer-readable medium. Non-transitory computer-readable media include both computer storage media and communication media, including any medium that facilitates transfer of a computer program from one place to another. A storage medium can be any available medium that can be accessed by a general-purpose or special-purpose computer. By way of example, and not limitation, such computer-readable media can include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer, or a general-purpose or special-purpose processor. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of non-transitory computer-readable media.
The previous description of the disclosure is provided to enable any person skilled in the art to make or use the disclosure. Various modifications to the disclosure will be readily apparent to those skilled in the art, and the generic principles defined herein can be applied to other variations without departing from the scope of the disclosure. Thus, the disclosure is not intended to be limited to the examples and designs described herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (20)
1. A method for managing flight paths for an unmanned aerial vehicle (UAV) by a flight path management system, the method comprising:
receiving a sequence of controller inputs at a recording device, the received sequence of controller inputs associated with controlling a flight path of the UAV;
storing the sequence of controller inputs in a memory associated with the recording device, the sequence of controller inputs stored in association with the flight path;
receiving a selection of a portion of the flight path at a user interface associated with a playback device, the portion corresponding to a time period within a duration of the flight path;
identifying a section of the sequence of controller inputs corresponding to the time period within the duration of the flight path based on execution of instructions by a processor associated with the playback device; and
outputting a graphical representation of the identified section of the sequence of controller inputs to the playback device in real time over a length of the time period.
2. The method of claim 1 , further comprising recording the controller inputs received from a controller device at the recording device.
3. The method of claim 1 , wherein receiving the sequence of controller inputs comprises retrieving a saved flight path from an online database.
4. The method of claim 1 , further comprising:
sensing movement of the UAV via at least one sensor; and
identifying the sequence of controller inputs that correspond to the movement of the UAV via the processor.
5. The method of claim 1 , wherein the playback device is a self-piloting UAV, and further outputting control instructions executable by the self-piloting UAV to fly according to the identified section of the sequence of controller inputs.
6. The method of claim 1 , wherein outputting the graphical representation further comprises displaying the graphical representation of the identified section of the sequence of controller inputs on a virtual model of a controller device.
7. The method of claim 1 , wherein the playback device includes a flight stick, and wherein the identified section of the sequence of controller inputs comprises one or more lever movements of the flight stick.
8. The method of claim 1 , wherein the graphical representation further comprises a first-person view playback of the UAV corresponding to the identified section of the sequence of controller inputs as displayed on the playback device.
9. The method of claim 1 , wherein the graphical representation further comprises a third-person view playback of the UAV corresponding to the identified section of the sequence of controller inputs as displayed on the playback device.
10. The method of claim 1 , wherein outputting the graphical representation further comprises a trace of the flight path of the UAV corresponding to the identified section of the sequence of controller inputs as displayed on the playback device.
11. The method of claim 1 , wherein the playback device comprises a head-mounted display (HMD).
12. The method of claim 1 , wherein outputting the graphical representation of the identified section occurs concurrently with receiving the sequence of controller inputs.
13. A system for managing flight paths by an unmanned aerial vehicle (UAV), the system comprising:
a recording device that receives a sequence of controller inputs associated with controlling a flight path of the UAV;
a memory coupled to the recording device that stores the sequence of controller inputs in association with the flight path; and
a playback device configured to:
access the memory and select an identified section of the sequence of controller inputs corresponding to a time period within a duration of the flight path based on a received selection of a portion of the flight path, the portion corresponding to the time period within a duration of the flight path; and
output a graphical representation of the identified section of the sequence of controller inputs to a playback device in real time over a length of the time period.
14. The system of claim 13 further comprising at least one sensor that senses movement of the UAV; and a processor that executes instructions stored in memory to identify the sequence of controller inputs that corresponds to the movement of the UAV.
15. The system of claim 13 , wherein the playback device is a self-piloting UAV that further outputs control instructions executable by the self-piloting UAV to fly according to the identified section of the sequence of controller inputs.
16. The system of claim 13 wherein the playback device outputs the graphical representation by displaying the graphical representation of the identified section of the sequence of controller inputs on a virtual model of a controller device.
17. The system of claim 13 , wherein the playback device includes a flight stick, and wherein the identified section of the sequence of controller inputs comprises one or more lever movements of the flight stick.
18. The system of claim 13 , wherein the playback device outputs the graphical representation by displaying a first-person view playback of the UAV corresponding to the identified section of the sequence of controller inputs as displayed on the playback device.
19. The system of claim 13 wherein the playback device outputs the graphical representation by displaying a third-person view playback of the UAV corresponding to the identified section of the sequence of controller inputs as displayed on the playback device.
20. The system of claim 13 wherein the playback device outputs the graphical representation by displaying a trace of the flight path of the UAV corresponding to the identified section of the sequence of controller inputs as displayed on the playback device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/394,391 US10210905B2 (en) | 2016-09-30 | 2016-12-29 | Remote controlled object macro and autopilot system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662402737P | 2016-09-30 | 2016-09-30 | |
US15/394,391 US10210905B2 (en) | 2016-09-30 | 2016-12-29 | Remote controlled object macro and autopilot system |
Publications (2)
Publication Number | Publication Date |
---|---|
US20180095463A1 US20180095463A1 (en) | 2018-04-05 |
US10210905B2 (en) | 2019-02-19 |
Family
ID=61758714
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/394,391 Active US10210905B2 (en) | 2016-09-30 | 2016-12-29 | Remote controlled object macro and autopilot system |
Country Status (1)
Country | Link |
---|---|
US (1) | US10210905B2 (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10317207B2 (en) * | 2017-03-09 | 2019-06-11 | Moxa Inc. | Three-dimensional trace verification apparatus and method thereof |
US10336469B2 (en) | 2016-09-30 | 2019-07-02 | Sony Interactive Entertainment Inc. | Unmanned aerial vehicle movement via environmental interactions |
US10357709B2 (en) | 2016-09-30 | 2019-07-23 | Sony Interactive Entertainment Inc. | Unmanned aerial vehicle movement via environmental airflow |
US10377484B2 (en) | 2016-09-30 | 2019-08-13 | Sony Interactive Entertainment Inc. | UAV positional anchors |
US10410320B2 (en) | 2016-09-30 | 2019-09-10 | Sony Interactive Entertainment Inc. | Course profiling and sharing |
US10416669B2 (en) | 2016-09-30 | 2019-09-17 | Sony Interactive Entertainment Inc. | Mechanical effects by way of software or real world engagement |
US10679511B2 (en) | 2016-09-30 | 2020-06-09 | Sony Interactive Entertainment Inc. | Collision detection and avoidance |
US10850838B2 (en) | 2016-09-30 | 2020-12-01 | Sony Interactive Entertainment Inc. | UAV battery form factor and insertion/ejection methodologies |
US10979672B1 (en) * | 2020-10-20 | 2021-04-13 | Katmai Tech Holdings LLC | Web-based videoconference virtual environment with navigable avatars, and applications thereof |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6408832B2 (en) * | 2014-08-27 | 2018-10-17 | ルネサスエレクトロニクス株式会社 | Control system, relay device, and control method |
US10067736B2 (en) | 2016-09-30 | 2018-09-04 | Sony Interactive Entertainment Inc. | Proximity based noise and chat |
US11125561B2 (en) | 2016-09-30 | 2021-09-21 | Sony Interactive Entertainment Inc. | Steering assist |
US10388056B2 (en) * | 2017-01-26 | 2019-08-20 | Advanced Micro Devices, Inc. | Split frame rendering |
CN110891862B (en) * | 2017-08-10 | 2023-07-11 | 深圳零零无限科技有限公司 | System and method for obstacle avoidance in a flight system |
JP7355738B2 (en) * | 2017-08-17 | 2023-10-03 | コロンビアド ローンチ サービシーズ インコーポレイテッド | System and method for distributing power to aircraft systems |
US10817983B1 (en) * | 2017-09-28 | 2020-10-27 | Apple Inc. | Method and device for combining real and virtual images |
WO2019165588A1 (en) * | 2018-02-28 | 2019-09-06 | 深圳市大疆创新科技有限公司 | Teaching method for unmanned aerial vehicle and remote controller for unmanned aerial vehicle |
WO2020226719A1 (en) * | 2019-02-18 | 2020-11-12 | Livieratos Evangelos | Breaching for submergible fixed wing aircraft |
TWI750932B (en) * | 2020-12-03 | 2021-12-21 | 國立陽明交通大學 | Autonomous local trajectory generation system and method for real time navigation without detail vector map |
Citations (75)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3279863A (en) | 1963-10-22 | 1966-10-18 | Spencer Melksham Ltd | Mobile air layer transporter |
US3367658A (en) | 1964-11-19 | 1968-02-06 | Edwin H. Bayha | Air jet toy |
US6021646A (en) | 1998-06-26 | 2000-02-08 | Burley's Rink Supply, Inc. | Floor system for a rink |
US6075924A (en) | 1995-01-13 | 2000-06-13 | University Of Southern California | Intelligent motion surface |
US20030102016A1 (en) | 2001-12-04 | 2003-06-05 | Gary Bouchard | Integrated circuit processing system |
US20030152892A1 (en) | 2002-02-11 | 2003-08-14 | United Defense, L.P. | Naval virtual target range system |
US20040115593A1 (en) | 2002-08-20 | 2004-06-17 | Hatlestad Kathryn W. | Free fall simulator |
US20050004723A1 (en) | 2003-06-20 | 2005-01-06 | Geneva Aerospace | Vehicle control system including related methods and components |
US20060095262A1 (en) | 2004-10-28 | 2006-05-04 | Microsoft Corporation | Automatic censorship of audio data for broadcast |
US20060169508A1 (en) | 2005-01-18 | 2006-08-03 | Trojahn Charles J | Air cushion vehicle and game |
US20070061116A1 (en) | 2001-11-27 | 2007-03-15 | Lockheed Martin Corporation | Robust uninhabited air vehicle active missions |
US20070102876A1 (en) | 2003-12-16 | 2007-05-10 | Dmi Sports, Inc. | Virtual goal for a game table |
US20080073839A1 (en) | 2006-09-21 | 2008-03-27 | Sportcraft, Ltd. | Game table with centrifugal blower assembly |
US20080093796A1 (en) | 2006-10-20 | 2008-04-24 | Narus Michael H | Banked air hockey table |
US20080144884A1 (en) | 2006-07-20 | 2008-06-19 | Babak Habibi | System and method of aerial surveillance |
US20080154447A1 (en) * | 2006-12-21 | 2008-06-26 | Spinelli Charles B | Determining suitable areas for off-airport landings |
US20080221745A1 (en) * | 2006-10-02 | 2008-09-11 | Rocket Racing, Inc. | Collection and distribution system |
US20090005167A1 (en) | 2004-11-29 | 2009-01-01 | Juha Arrasvuori | Mobile Gaming with External Devices in Single and Multiplayer Games |
US20090076665A1 (en) | 2007-09-14 | 2009-03-19 | Hoisington Zachary C | Method and System to Control Operation of a Device Using an Integrated Simulation with a Time Shift Option |
US20090087029A1 (en) | 2007-08-22 | 2009-04-02 | American Gnc Corporation | 4D GIS based virtual reality for moving target prediction |
US20090118896A1 (en) | 2007-10-15 | 2009-05-07 | Saab Ab | Method and apparatus for generating at least one voted flight trajectory of a vehicle |
US20090187389A1 (en) * | 2008-01-18 | 2009-07-23 | Lockheed Martin Corporation | Immersive Collaborative Environment Using Motion Capture, Head Mounted Display, and Cave |
US20100083038A1 (en) * | 2008-09-30 | 2010-04-01 | David Barnard Pierce | Method and systems for restarting a flight control system |
US20100096491A1 (en) * | 2006-10-02 | 2010-04-22 | Rocket Racing, Inc. | Rocket-powered entertainment vehicle |
US20100121574A1 (en) | 2006-09-05 | 2010-05-13 | Honeywell International Inc. | Method for collision avoidance of unmanned aerial vehicle with other aircraft |
US20100228468A1 (en) | 2009-03-03 | 2010-09-09 | D Angelo Giuseppe Maria | Method of collision prediction between an air vehicle and an airborne object |
US20100305724A1 (en) | 2007-12-19 | 2010-12-02 | Robert Eric Fry | Vehicle competition implementation system |
US20110106339A1 (en) | 2006-07-14 | 2011-05-05 | Emilie Phillips | Autonomous Behaviors for a Remote Vehicle |
US7988154B1 (en) | 2010-03-11 | 2011-08-02 | Regan Jr James I | Air actuated ball game |
US8025293B1 (en) | 2010-03-26 | 2011-09-27 | Crawford Timothy D | Air hockey table |
US20120035799A1 (en) | 2010-01-13 | 2012-02-09 | Meimadtek Ltd. | Method and system for operating a self-propelled vehicle according to scene images |
US20120188078A1 (en) | 2011-01-21 | 2012-07-26 | Soles Alexander M | Damage detection and remediation system and methods thereof |
US20120212399A1 (en) | 2010-02-28 | 2012-08-23 | Osterhout Group, Inc. | See-through near-eye display glasses wherein image light is transmitted to and reflected from an optically flat film |
US20130328927A1 (en) | 2011-11-03 | 2013-12-12 | Brian J. Mount | Augmented reality playspaces with adaptive game rules |
US8909391B1 (en) * | 2012-12-28 | 2014-12-09 | Google Inc. | Responsive navigation of an unmanned aerial vehicle to a remedial facility |
US20150063610A1 (en) | 2013-08-30 | 2015-03-05 | GN Store Nord A/S | Audio rendering system categorising geospatial objects |
US9061102B2 (en) | 2012-07-17 | 2015-06-23 | Elwha Llc | Unmanned device interaction methods and systems |
US20150209659A1 (en) | 2014-01-30 | 2015-07-30 | Airblade Technologies Llc | Surface with airflow |
US20150323931A1 (en) | 2014-05-12 | 2015-11-12 | Unmanned Innovation, Inc. | Unmanned aerial vehicle authorization and geofence envelope determination |
US20150346722A1 (en) * | 2014-05-27 | 2015-12-03 | Recreational Drone Event Systems, Llc | Virtual and Augmented Reality Cockpit and Operational Control Systems |
US20160035224A1 (en) | 2014-07-31 | 2016-02-04 | SZ DJI Technology Co., Ltd. | System and method for enabling virtual sightseeing using unmanned aerial vehicles |
US20160078759A1 (en) | 2012-08-06 | 2016-03-17 | Cloudparc, Inc. | Tracking a Vehicle Using an Unmanned Aerial Vehicle |
US20160091894A1 (en) | 2014-09-30 | 2016-03-31 | SZ DJI Technology Co., Ltd | Systems and methods for flight simulation |
US20160117931A1 (en) | 2014-09-30 | 2016-04-28 | Elwha Llc | System and method for management of airspace for unmanned aircraft |
US20160205654A1 (en) * | 2015-01-09 | 2016-07-14 | Fresh Digital, Inc. | Systems and methods for providing location specific content and notifications utilizing beacons and drones |
US20160217698A1 (en) | 2014-09-05 | 2016-07-28 | SZ DJI Technology Co., Ltd | Context-based flight mode selection |
US20160253908A1 (en) | 2015-01-22 | 2016-09-01 | Zipline International Inc. | Unmanned aerial vehicle management system |
US20160257001A1 (en) | 2015-03-03 | 2016-09-08 | Toyota Motor Engineering & Manufacturing North America, Inc. | Push constraint using robotic limbs |
US9442485B1 (en) | 2014-08-13 | 2016-09-13 | Trace Live Network Inc. | Pixel based image tracking system for unmanned aerial vehicle (UAV) action camera system |
US20160291593A1 (en) | 2015-03-03 | 2016-10-06 | PreNav, Inc. | Scanning environments and tracking unmanned aerial vehicles |
US20160299506A1 (en) | 2013-12-04 | 2016-10-13 | Spatial Information Systems Research Limited | Method and apparatus for developing a flight path |
US20170039859A1 (en) | 2015-08-03 | 2017-02-09 | Amber Garage, Inc. | Planning a flight path by identifying key frames |
US20170053169A1 (en) | 2015-08-20 | 2017-02-23 | Motionloft, Inc. | Object detection and analysis via unmanned aerial vehicle |
US20170061813A1 (en) * | 2014-09-30 | 2017-03-02 | SZ DJI Technology Co., Ltd. | System and method for supporting simulated movement |
US20170069214A1 (en) | 2015-07-29 | 2017-03-09 | Dennis J. Dupray | Unmanned aerial vehicles |
US9605926B1 (en) | 2016-01-07 | 2017-03-28 | DuckDrone, LLC | Drone-target hunting/shooting system |
US20170116723A1 (en) | 2015-10-23 | 2017-04-27 | The Boeing Company | Pattern-based camera pose estimation system |
US20170158353A1 (en) * | 2015-08-07 | 2017-06-08 | Mark Schmick | Remote Aerodrome for UAVs |
US20170251323A1 (en) | 2014-08-13 | 2017-08-31 | Samsung Electronics Co., Ltd. | Method and device for generating and playing back audio signal |
US20170295446A1 (en) | 2016-04-08 | 2017-10-12 | Qualcomm Incorporated | Spatialized audio output based on predicted position data |
US20170372617A1 (en) | 2015-07-15 | 2017-12-28 | Harris Corporation | Process and System to Register and Regulate Unmanned Aerial Vehicle Operations |
US20180039262A1 (en) * | 2016-08-04 | 2018-02-08 | International Business Machines Corporation | Lost person rescue drone |
US20180046187A1 (en) | 2016-08-12 | 2018-02-15 | Skydio, Inc. | Unmanned aerial image capture platform |
US20180046560A1 (en) | 2016-08-12 | 2018-02-15 | Dash Robotics, Inc. | Device-agnostic systems, methods, and media for connected hardware-based analytics |
US20180095461A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Uav positional anchors |
US20180093171A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Unmanned aerial vehicle movement via environmental airflow |
US20180093768A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Uav battery form factor and insertion/ejection methodologies |
US20180093781A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Unmanned aerial vehicle movement via environmental interactions |
WO2018063594A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Course profiling and sharing |
US20180094931A1 (en) | 2016-09-30 | 2018-04-05 | Michael Taylor | Steering assist |
US20180096611A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Collision detection and avoidance |
US20180095433A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Mechanical effects by way of software or real world engagement |
US20180098052A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Translation of physical object viewed by unmanned aerial vehicle into virtual world object |
US20180095714A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Proximity based noise and chat |
US10062292B2 (en) | 2016-03-08 | 2018-08-28 | International Business Machines Corporation | Programming language for execution by drone |
Patent Citations (80)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3279863A (en) | 1963-10-22 | 1966-10-18 | Spencer Melksham Ltd | Mobile air layer transporter |
US3367658A (en) | 1964-11-19 | 1968-02-06 | Edwin H. Bayha | Air jet toy |
US6075924A (en) | 1995-01-13 | 2000-06-13 | University Of Southern California | Intelligent motion surface |
US6021646A (en) | 1998-06-26 | 2000-02-08 | Burley's Rink Supply, Inc. | Floor system for a rink |
US20070061116A1 (en) | 2001-11-27 | 2007-03-15 | Lockheed Martin Corporation | Robust uninhabited air vehicle active missions |
US20030102016A1 (en) | 2001-12-04 | 2003-06-05 | Gary Bouchard | Integrated circuit processing system |
US20030152892A1 (en) | 2002-02-11 | 2003-08-14 | United Defense, L.P. | Naval virtual target range system |
US20040115593A1 (en) | 2002-08-20 | 2004-06-17 | Hatlestad Kathryn W. | Free fall simulator |
US20050004723A1 (en) | 2003-06-20 | 2005-01-06 | Geneva Aerospace | Vehicle control system including related methods and components |
US20090125163A1 (en) | 2003-06-20 | 2009-05-14 | Geneva Aerospace | Vehicle control system including related methods and components |
US20110184590A1 (en) | 2003-06-20 | 2011-07-28 | Geneva Aerospace | Unmanned aerial vehicle take-off and landing systems |
US20140324253A1 (en) | 2003-06-20 | 2014-10-30 | L-3 Unmanned Systems, Inc. | Autonomous control of unmanned aerial vehicles |
US20070102876A1 (en) | 2003-12-16 | 2007-05-10 | Dmi Sports, Inc. | Virtual goal for a game table |
US20060095262A1 (en) | 2004-10-28 | 2006-05-04 | Microsoft Corporation | Automatic censorship of audio data for broadcast |
US20090005167A1 (en) | 2004-11-29 | 2009-01-01 | Juha Arrasvuori | Mobile Gaming with External Devices in Single and Multiplayer Games |
US20060169508A1 (en) | 2005-01-18 | 2006-08-03 | Trojahn Charles J | Air cushion vehicle and game |
US20110106339A1 (en) | 2006-07-14 | 2011-05-05 | Emilie Phillips | Autonomous Behaviors for a Remote Vehicle |
US20080144884A1 (en) | 2006-07-20 | 2008-06-19 | Babak Habibi | System and method of aerial surveillance |
US20100121574A1 (en) | 2006-09-05 | 2010-05-13 | Honeywell International Inc. | Method for collision avoidance of unmanned aerial vehicle with other aircraft |
US20080073839A1 (en) | 2006-09-21 | 2008-03-27 | Sportcraft, Ltd. | Game table with centrifugal blower assembly |
US20080221745A1 (en) * | 2006-10-02 | 2008-09-11 | Rocket Racing, Inc. | Collection and distribution system |
US20100096491A1 (en) * | 2006-10-02 | 2010-04-22 | Rocket Racing, Inc. | Rocket-powered entertainment vehicle |
US20080093796A1 (en) | 2006-10-20 | 2008-04-24 | Narus Michael H | Banked air hockey table |
US20080154447A1 (en) * | 2006-12-21 | 2008-06-26 | Spinelli Charles B | Determining suitable areas for off-airport landings |
US20090087029A1 (en) | 2007-08-22 | 2009-04-02 | American Gnc Corporation | 4D GIS based virtual reality for moving target prediction |
US20090076665A1 (en) | 2007-09-14 | 2009-03-19 | Hoisington Zachary C | Method and System to Control Operation of a Device Using an Integrated Simulation with a Time Shift Option |
US20090118896A1 (en) | 2007-10-15 | 2009-05-07 | Saab Ab | Method and apparatus for generating at least one voted flight trajectory of a vehicle |
US20100305724A1 (en) | 2007-12-19 | 2010-12-02 | Robert Eric Fry | Vehicle competition implementation system |
US20090187389A1 (en) * | 2008-01-18 | 2009-07-23 | Lockheed Martin Corporation | Immersive Collaborative Environment Using Motion Capture, Head Mounted Display, and Cave |
US20100083038A1 (en) * | 2008-09-30 | 2010-04-01 | David Barnard Pierce | Method and systems for restarting a flight control system |
US20100228468A1 (en) | 2009-03-03 | 2010-09-09 | D Angelo Giuseppe Maria | Method of collision prediction between an air vehicle and an airborne object |
US20120035799A1 (en) | 2010-01-13 | 2012-02-09 | Meimadtek Ltd. | Method and system for operating a self-propelled vehicle according to scene images |
US20120212399A1 (en) | 2010-02-28 | 2012-08-23 | Osterhout Group, Inc. | See-through near-eye display glasses wherein image light is transmitted to and reflected from an optically flat film |
US7988154B1 (en) | 2010-03-11 | 2011-08-02 | Regan Jr James I | Air actuated ball game |
US8025293B1 (en) | 2010-03-26 | 2011-09-27 | Crawford Timothy D | Air hockey table |
US20120188078A1 (en) | 2011-01-21 | 2012-07-26 | Soles Alexander M | Damage detection and remediation system and methods thereof |
US20130328927A1 (en) | 2011-11-03 | 2013-12-12 | Brian J. Mount | Augmented reality playspaces with adaptive game rules |
US9061102B2 (en) | 2012-07-17 | 2015-06-23 | Elwha Llc | Unmanned device interaction methods and systems |
US20160078759A1 (en) | 2012-08-06 | 2016-03-17 | Cloudparc, Inc. | Tracking a Vehicle Using an Unmanned Aerial Vehicle |
US8909391B1 (en) * | 2012-12-28 | 2014-12-09 | Google Inc. | Responsive navigation of an unmanned aerial vehicle to a remedial facility |
US20150063610A1 (en) | 2013-08-30 | 2015-03-05 | GN Store Nord A/S | Audio rendering system categorising geospatial objects |
US20160299506A1 (en) | 2013-12-04 | 2016-10-13 | Spatial Information Systems Research Limited | Method and apparatus for developing a flight path |
US20150209659A1 (en) | 2014-01-30 | 2015-07-30 | Airblade Technologies Llc | Surface with airflow |
US20150323931A1 (en) | 2014-05-12 | 2015-11-12 | Unmanned Innovation, Inc. | Unmanned aerial vehicle authorization and geofence envelope determination |
US20150346722A1 (en) * | 2014-05-27 | 2015-12-03 | Recreational Drone Event Systems, Llc | Virtual and Augmented Reality Cockpit and Operational Control Systems |
US20160035224A1 (en) | 2014-07-31 | 2016-02-04 | SZ DJI Technology Co., Ltd. | System and method for enabling virtual sightseeing using unmanned aerial vehicles |
US20170251323A1 (en) | 2014-08-13 | 2017-08-31 | Samsung Electronics Co., Ltd. | Method and device for generating and playing back audio signal |
US9442485B1 (en) | 2014-08-13 | 2016-09-13 | Trace Live Network Inc. | Pixel based image tracking system for unmanned aerial vehicle (UAV) action camera system |
US20160217698A1 (en) | 2014-09-05 | 2016-07-28 | SZ DJI Technology Co., Ltd | Context-based flight mode selection |
US20170061813A1 (en) * | 2014-09-30 | 2017-03-02 | SZ DJI Technology Co., Ltd. | System and method for supporting simulated movement |
US20160091894A1 (en) | 2014-09-30 | 2016-03-31 | SZ DJI Technology Co., Ltd | Systems and methods for flight simulation |
US20160117931A1 (en) | 2014-09-30 | 2016-04-28 | Elwha Llc | System and method for management of airspace for unmanned aircraft |
US20160205654A1 (en) * | 2015-01-09 | 2016-07-14 | Fresh Digital, Inc. | Systems and methods for providing location specific content and notifications utilizing beacons and drones |
US20160253908A1 (en) | 2015-01-22 | 2016-09-01 | Zipline International Inc. | Unmanned aerial vehicle management system |
US20160291593A1 (en) | 2015-03-03 | 2016-10-06 | PreNav, Inc. | Scanning environments and tracking unmanned aerial vehicles |
US20160257001A1 (en) | 2015-03-03 | 2016-09-08 | Toyota Motor Engineering & Manufacturing North America, Inc. | Push constraint using robotic limbs |
US20170372617A1 (en) | 2015-07-15 | 2017-12-28 | Harris Corporation | Process and System to Register and Regulate Unmanned Aerial Vehicle Operations |
US20170069214A1 (en) | 2015-07-29 | 2017-03-09 | Dennis J. Dupray | Unmanned aerial vehicles |
US20170039859A1 (en) | 2015-08-03 | 2017-02-09 | Amber Garage, Inc. | Planning a flight path by identifying key frames |
US20170158353A1 (en) * | 2015-08-07 | 2017-06-08 | Mark Schmick | Remote Aerodrome for UAVs |
US20170053169A1 (en) | 2015-08-20 | 2017-02-23 | Motionloft, Inc. | Object detection and analysis via unmanned aerial vehicle |
US20170116723A1 (en) | 2015-10-23 | 2017-04-27 | The Boeing Company | Pattern-based camera pose estimation system |
US9605926B1 (en) | 2016-01-07 | 2017-03-28 | DuckDrone, LLC | Drone-target hunting/shooting system |
US10062292B2 (en) | 2016-03-08 | 2018-08-28 | International Business Machines Corporation | Programming language for execution by drone |
US20170295446A1 (en) | 2016-04-08 | 2017-10-12 | Qualcomm Incorporated | Spatialized audio output based on predicted position data |
US20180039262A1 (en) * | 2016-08-04 | 2018-02-08 | International Business Machines Corporation | Lost person rescue drone |
US20180046187A1 (en) | 2016-08-12 | 2018-02-15 | Skydio, Inc. | Unmanned aerial image capture platform |
US20180046560A1 (en) | 2016-08-12 | 2018-02-15 | Dash Robotics, Inc. | Device-agnostic systems, methods, and media for connected hardware-based analytics |
US20180095461A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Uav positional anchors |
US20180093768A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Uav battery form factor and insertion/ejection methodologies |
US20180093781A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Unmanned aerial vehicle movement via environmental interactions |
WO2018063594A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Course profiling and sharing |
US20180094931A1 (en) | 2016-09-30 | 2018-04-05 | Michael Taylor | Steering assist |
US20180096455A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Course profiling and sharing |
US20180096611A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Collision detection and avoidance |
US20180095433A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Mechanical effects by way of software or real world engagement |
US20180098052A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Translation of physical object viewed by unmanned aerial vehicle into virtual world object |
US20180095714A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Proximity based noise and chat |
US20180093171A1 (en) | 2016-09-30 | 2018-04-05 | Sony Interactive Entertainment Inc. | Unmanned aerial vehicle movement via environmental airflow |
US10067736B2 (en) | 2016-09-30 | 2018-09-04 | Sony Interactive Entertainment Inc. | Proximity based noise and chat |
Non-Patent Citations (16)
Title |
---|
Fujii, Katsuya; Higuchi, Keita; Rekimoto, Jun; "Endless Flyer: A Continuous Flying Drone with Automatic Battery Replacement", 2013 IEEE 10th International Conference on Ubiquitous Intelligence & Computing and 2013 IEEE 10th International Conference on Autonomic & Trusted Computing, pp. 216-223. |
PCT Application No. PCT/US2017/048064 International Search Report and Written Opinion dated Nov. 7, 2017. |
U.S. Appl. No. 15/393,855 Final Office Action dated Oct. 12, 2018. |
U.S. Appl. No. 15/393,855 Office Action dated May 16, 2018. |
U.S. Appl. No. 15/394,267 Office Action dated Aug. 24, 2018. |
U.S. Appl. No. 15/394,285 Office Action dated Aug. 3, 2018. |
U.S. Appl. No. 15/394,313 Office Action dated Oct. 18, 2017. |
U.S. Appl. No. 15/394,313, Michael Taylor, Proximity Based Noise and Chat, filed Dec. 29, 2016. |
U.S. Appl. No. 15/394,329 Office Action dated Aug. 7, 2018. |
U.S. Appl. No. 15/394,473, Dennis Castleman, UAV Battery Form Factor and Insertion/Ejection Methodologies, filed Dec. 29, 2016. |
U.S. Appl. No. 15/711,695 Office Action dated Oct. 5, 2018. |
U.S. Appl. No. 15/711,695, Dominic S. Mallinson, Unmanned Aerial Vehicle Movement Via Environmental Airflow, filed Sep. 21, 2017. |
U.S. Appl. No. 15/711,961 Office Action dated Oct. 5, 2018. |
U.S. Appl. No. 15/711,961, Dominic S. Mallinson, Unmanned Aerial Vehicle Movement Via Environmental Interactions, filed Sep. 21, 2017. |
U.S. Appl. No. 16/121,441, Michael Taylor, Proximity Based Noise and Chat, filed Sep. 4, 2018. |
Williams, Elliot; "Real-life Space Invaders with Drones and Lasers," Hackaday, Sep. 19, 2016. |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10540746B2 (en) | 2016-09-30 | 2020-01-21 | Sony Interactive Entertainment Inc. | Course profiling and sharing |
US10692174B2 (en) | 2016-09-30 | 2020-06-23 | Sony Interactive Entertainment Inc. | Course profiling and sharing |
US10357709B2 (en) | 2016-09-30 | 2019-07-23 | Sony Interactive Entertainment Inc. | Unmanned aerial vehicle movement via environmental airflow |
US10377484B2 (en) | 2016-09-30 | 2019-08-13 | Sony Interactive Entertainment Inc. | UAV positional anchors |
US10410320B2 (en) | 2016-09-30 | 2019-09-10 | Sony Interactive Entertainment Inc. | Course profiling and sharing |
US10416669B2 (en) | 2016-09-30 | 2019-09-17 | Sony Interactive Entertainment Inc. | Mechanical effects by way of software or real world engagement |
US10336469B2 (en) | 2016-09-30 | 2019-07-02 | Sony Interactive Entertainment Inc. | Unmanned aerial vehicle movement via environmental interactions |
US11288767B2 (en) | 2016-09-30 | 2022-03-29 | Sony Interactive Entertainment Inc. | Course profiling and sharing |
US11222549B2 (en) | 2016-09-30 | 2022-01-11 | Sony Interactive Entertainment Inc. | Collision detection and avoidance |
US10850838B2 (en) | 2016-09-30 | 2020-12-01 | Sony Interactive Entertainment Inc. | UAV battery form factor and insertion/ejection methodologies |
US10679511B2 (en) | 2016-09-30 | 2020-06-09 | Sony Interactive Entertainment Inc. | Collision detection and avoidance |
US10317207B2 (en) * | 2017-03-09 | 2019-06-11 | Moxa Inc. | Three-dimensional trace verification apparatus and method thereof |
US10979672B1 (en) * | 2020-10-20 | 2021-04-13 | Katmai Tech Holdings LLC | Web-based videoconference virtual environment with navigable avatars, and applications thereof |
US11290688B1 (en) | 2020-10-20 | 2022-03-29 | Katmai Tech Holdings LLC | Web-based videoconference virtual environment with navigable avatars, and applications thereof |
Also Published As
Publication number | Publication date |
---|---|
US20180095463A1 (en) | 2018-04-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10210905B2 (en) | Remote controlled object macro and autopilot system | |
US11222549B2 (en) | Collision detection and avoidance | |
US11125561B2 (en) | Steering assist | |
US11755041B2 (en) | Objective-based control of an autonomous unmanned aerial vehicle | |
US11861892B2 (en) | Object tracking by an unmanned aerial vehicle using visual sensors | |
US11307584B2 (en) | Applications and skills for an autonomous unmanned aerial vehicle | |
US20240062663A1 (en) | User Interaction With An Autonomous Unmanned Aerial Vehicle | |
US11288767B2 (en) | Course profiling and sharing | |
CN205263655U (en) | A system, Unmanned vehicles and ground satellite station for automatic generation panoramic photograph | |
US10377484B2 (en) | UAV positional anchors | |
US20180098052A1 (en) | Translation of physical object viewed by unmanned aerial vehicle into virtual world object | |
US20190079722A1 (en) | Proximity Based Noise and Chat | |
CN107087427A (en) | Control method, device and the equipment and aircraft of aircraft | |
CN105045279A (en) | System and method for automatically generating panorama photographs through aerial photography of unmanned aerial aircraft | |
CN113228103A (en) | Target tracking method, device, unmanned aerial vehicle, system and readable storage medium | |
JP6630939B2 (en) | Control device, imaging device, moving object, control method, and program | |
CN111433819A (en) | Target scene three-dimensional reconstruction method and system and unmanned aerial vehicle | |
CN114637306A (en) | Unmanned aerial vehicle visual navigation strategy method, device and medium | |
WO2022021028A1 (en) | Target detection method, device, unmanned aerial vehicle, and computer-readable storage medium | |
JP6547984B2 (en) | CONTROL DEVICE, IMAGING DEVICE, IMAGING SYSTEM, MOBILE OBJECT, CONTROL METHOD, AND PROGRAM | |
WO2022113482A1 (en) | Information processing device, method, and program | |
WO2024087024A1 (en) | Information processing method, information processing device, aircraft system and storage medium | |
JP6459012B1 (en) | Control device, imaging device, flying object, control method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: SONY INTERACTIVE ENTERTAINMENT INC., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CASTLEMAN, DENNIS DALE;CHEN, RUXIN;ZHAO, FRANK;AND OTHERS;SIGNING DATES FROM 20170123 TO 20170131;REEL/FRAME:041666/0163 |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |