US20210132626A1 - Robot and drone game system
- Publication number
- US20210132626A1 (application Ser. No. 17/146,484)
- Authority
- US
- United States
- Prior art keywords
- game
- drone
- robot
- virtual
- player
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Links
- 238000004891 communication Methods 0.000 claims abstract description 20
- 230000003190 augmentative effect Effects 0.000 claims abstract description 11
- 230000003993 interaction Effects 0.000 claims description 12
- 238000004458 analytical method Methods 0.000 claims description 4
- 230000000977 initiatory effect Effects 0.000 claims description 3
- 238000000034 method Methods 0.000 claims description 3
- 230000002457 bidirectional effect Effects 0.000 claims description 2
- 230000000007 visual effect Effects 0.000 claims description 2
- 238000010168 coupling process Methods 0.000 description 7
- 230000008878 coupling Effects 0.000 description 5
- 238000005859 coupling reaction Methods 0.000 description 5
- 230000003044 adaptive effect Effects 0.000 description 3
- 238000010586 diagram Methods 0.000 description 3
- 238000005516 engineering process Methods 0.000 description 2
- 230000007246 mechanism Effects 0.000 description 2
- 238000013473 artificial intelligence Methods 0.000 description 1
- 230000002452 interceptive effect Effects 0.000 description 1
- 238000012423 maintenance Methods 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 230000008569 process Effects 0.000 description 1
- 230000008685 targeting Effects 0.000 description 1
Images
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/08—Programme-controlled manipulators characterised by modular constructions
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0287—Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F9/00—Games not otherwise provided for
- A63F9/24—Electric games; Games using electronic circuits not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0084—Programme-controlled manipulators comprising a plurality of manipulators
- B25J9/0087—Dual arms
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/163—Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60L—PROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
- B60L50/00—Electric propulsion with power supplied within the vehicle
- B60L50/50—Electric propulsion with power supplied within the vehicle using propulsion power supplied by batteries or fuel cells
- B60L50/60—Electric propulsion with power supplied within the vehicle using propulsion power supplied by batteries or fuel cells using power supplied by batteries
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60L—PROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
- B60L53/00—Methods of charging batteries, specially adapted for electric vehicles; Charging stations or on-board charging equipment therefor; Exchange of energy storage elements in electric vehicles
- B60L53/80—Exchanging energy storage elements, e.g. removable batteries
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D57/00—Vehicles characterised by having other propulsion or other ground- engaging means than wheels or endless track, alone or in addition to wheels or endless track
- B62D57/02—Vehicles characterised by having other propulsion or other ground- engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
- B62D57/028—Vehicles characterised by having other propulsion or other ground- engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members having wheels and mechanical legs
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C37/00—Convertible aircraft
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C39/00—Aircraft not otherwise provided for
- B64C39/02—Aircraft not otherwise provided for characterised by special use
- B64C39/024—Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64F—GROUND OR AIRCRAFT-CARRIER-DECK INSTALLATIONS SPECIALLY ADAPTED FOR USE IN CONNECTION WITH AIRCRAFT; DESIGNING, MANUFACTURING, ASSEMBLING, CLEANING, MAINTAINING OR REPAIRING AIRCRAFT, NOT OTHERWISE PROVIDED FOR; HANDLING, TRANSPORTING, TESTING OR INSPECTING AIRCRAFT COMPONENTS, NOT OTHERWISE PROVIDED FOR
- B64F1/00—Ground or aircraft-carrier-deck installations
- B64F1/22—Ground or aircraft-carrier-deck installations installed for handling aircraft
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/0011—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement
- G05D1/0027—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot associated with a remote control arrangement involving a plurality of vehicles, e.g. fleet or convoy travelling
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/0088—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0238—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
- G05D1/0278—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0276—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
- G05D1/0285—Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using signals transmitted via a public communication network, e.g. GSM network
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0287—Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
- G05D1/0291—Fleet control
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F9/00—Games not otherwise provided for
- A63F9/24—Electric games; Games using electronic circuits not otherwise provided for
- A63F2009/2483—Other characteristics
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/0025—Means for supplying energy to the end effector
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60L—PROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
- B60L2200/00—Type of vehicles
- B60L2200/10—Air crafts
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60L—PROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
- B60L2200/00—Type of vehicles
- B60L2200/16—Single-axle vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60L—PROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
- B60L2200/00—Type of vehicles
- B60L2200/40—Working vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60L—PROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
- B60L2220/00—Electrical machine types; Structures or applications thereof
- B60L2220/40—Electrical machine applications
- B60L2220/46—Wheel motors, i.e. motor connected to only one wheel
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60L—PROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
- B60L2260/00—Operating Modes
- B60L2260/20—Drive modes; Transition between modes
- B60L2260/32—Auto pilot mode
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60L—PROPULSION OF ELECTRICALLY-PROPELLED VEHICLES; SUPPLYING ELECTRIC POWER FOR AUXILIARY EQUIPMENT OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRODYNAMIC BRAKE SYSTEMS FOR VEHICLES IN GENERAL; MAGNETIC SUSPENSION OR LEVITATION FOR VEHICLES; MONITORING OPERATING VARIABLES OF ELECTRICALLY-PROPELLED VEHICLES; ELECTRIC SAFETY DEVICES FOR ELECTRICALLY-PROPELLED VEHICLES
- B60L2260/00—Operating Modes
- B60L2260/20—Drive modes; Transition between modes
- B60L2260/34—Stabilising upright position of vehicles, e.g. of single axle vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/60—UAVs specially adapted for particular uses or applications for transporting passengers; for transporting goods other than weapons
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U30/00—Means for producing lift; Empennages; Arrangements thereof
- B64U30/20—Rotors; Rotor supports
- B64U30/29—Constructional aspects of rotors or rotor supports; Arrangements thereof
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U50/00—Propulsion; Power supply
- B64U50/30—Supply or distribution of electrical power
- B64U50/34—In-flight charging
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U50/00—Propulsion; Power supply
- B64U50/30—Supply or distribution of electrical power
- B64U50/37—Charging when not in flight
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/80—Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W84/00—Network topologies
- H04W84/02—Hierarchically pre-organised networks, e.g. paging networks, cellular networks, WLAN [Wireless Local Area Network] or WLL [Wireless Local Loop]
- H04W84/10—Small scale networks; Flat hierarchical networks
- H04W84/12—WLAN [Wireless Local Area Networks]
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/60—Electric or hybrid propulsion means for production processes
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/60—Other road transportation technologies with climate change mitigation effect
- Y02T10/70—Energy storage systems for electromobility, e.g. batteries
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/60—Other road transportation technologies with climate change mitigation effect
- Y02T10/7072—Electromobility specific charging systems or methods for batteries, ultracapacitors, supercapacitors or double-layer capacitors
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T90/00—Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation
- Y02T90/10—Technologies relating to charging of electric vehicles
- Y02T90/16—Information or communication technologies improving the operation of electric vehicles
Definitions
- the present disclosure relates to highly intelligent mobile robots, drone devices, and a hybrid robot-drone array utilized as laborers for multiple professions, and to an adaptive robot control system and AI game systems for controlling the handling of task objectives in work environments and in multi-game environments; the claimed inventions may combine, and thus relate to, one or more of these technologies.
- FIG. 1A schematically illustrates a robot and drone array 100 employed in a work environment in accordance with the present disclosure.
- FIG. 1B schematically illustrates a first mobile robot 101 for employment applications in accordance with the present disclosure.
- FIG. 1C schematically illustrates a first drone device 102(1) for employment applications in accordance with the present disclosure.
- FIG. 1D illustrates a first hybrid robot-drone 103(1) for employment applications in accordance with the present disclosure.
- FIG. 1E schematically illustrates a second hybrid robot-drone 103(2) for employment in a work environment in accordance with the present disclosure.
- FIG. 2 schematically illustrates a block diagram of a real-time game system 1900 of the robot and drone array 100 in accordance with an embodiment of the disclosure.
- FIG. 3 schematically illustrates a block diagram of a virtual reality game system 2000 of the robot and drone array 100 in accordance with an embodiment of the disclosure.
- FIG. 4 illustrates a block diagram of an augmented virtual game system 2100 for an E-Sports game platform in accordance with an embodiment of the disclosure.
- the present invention discloses a robot and drone array 100 comprising diverse mobile robots 101, drone devices 102, and hybrid robot-drones 103 characterized as having a level of artificial intelligence to accomplish the handling objectives 1000 pertaining to complex tasks 104 and to interactively work for users 105.
- Fundamentally controlled elements of the robot and drone array 100 include an AI system 1500 schematically arranged for achieving the handling of target objects 105 for a variety of professions including: domestic, medical, rescue, environment restoration, delivery, food service production, policing, military, industry, gameplay, and other tasks.
- FIG. 1A illustrates the robot and drone array 100 comprising different mobile robot types, for instance: a first mobile robot 101(1) depicted in FIG. 1B and a second mobile robot 101(2) illustrated in FIG. 2A, as well as different drone types: a drone device 102(1) depicted in FIG. 1C, a drone device 102(2) illustrated in FIG. 2C, and a hybrid robot-drone 103 illustrated in FIG. 1D.
- each task objective 104 is based on criteria associated with tasks allocated for the mobile robots 101(1)-101(2), an aquatic robot 101(3), and a space robot 101(4); criteria associated with tasks allocated for drone devices 102(1)-102(2) and an aquatic drone 102(3); and criteria associated with tasks allocated for the hybrid robot-drone 103.
- the robot and drone array 100 may receive user input 105a from a communication network 1517 via an AI system 1500 systematically configured with an Adaptive Robot Control System (ARCS) 1600, see FIG. 16, an Autonomous Coupling System 1700, see FIG. 17, an Autonomous Charging System 1800, see FIG. 18, and other subsystems disclosed herein.
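The routing of user input to these subsystems can be pictured with a minimal, hypothetical sketch. All class and method names below are illustrative, not taken from the specification; only the subsystem reference numerals (1600, 1700, 1800) come from the disclosure.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class AISystem:
    """Illustrative stand-in for the AI system 1500: a dispatcher that
    forwards user input arriving over the network 1517 to a subsystem."""
    handlers: Dict[str, Callable[[dict], str]] = field(default_factory=dict)

    def register(self, kind: str, handler: Callable[[dict], str]) -> None:
        self.handlers[kind] = handler

    def dispatch(self, user_input: dict) -> str:
        kind = user_input.get("kind", "")
        if kind not in self.handlers:
            return "rejected: no subsystem for " + repr(kind)
        return self.handlers[kind](user_input)

ai = AISystem()
# One handler per disclosed subsystem (behaviors are placeholders).
ai.register("task", lambda m: f"ARCS-1600 planning task {m['task_id']}")
ai.register("couple", lambda m: "Coupling-1700 latching sequence started")
ai.register("charge", lambda m: "Charging-1800 docking for battery swap")

print(ai.dispatch({"kind": "task", "task_id": 104}))
```

This is only one plausible reading of "systematically configured with" the named subsystems; the patent does not specify a dispatch mechanism.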
- FIG. 1B illustrates the first mobile robot 101(1), wherein the mobile robot comprises a body 201 configured with multiple sections of varied geometric shapes and dimensions; as depicted, a compartment 900 is configured with an upper yoke apparatus 300 comprising a landing platform 202 (LP), said landing platform 202 comprising one or more locking mechanisms configured for coupling via a latching sequence exampled in FIG. 17C.
- the landing platform provides a latching means 1700 for battery charging and for carrying a payload affixed thereon.
- the yoke apparatus further comprises one or more robotic arms having grippers or other tools; a section for a compartment 900, the compartment containing batteries, an appliance 901-910, and wired conduit; and a bottom section comprising a torso and a truck 604 or 605 comprising a set of opposing drive wheels 700, said drive wheel components including a hub wheel arrangement 701 or a track wheel arrangement 702 providing drive propulsion.
- FIG. 1C illustrates the first drone device 102(1) comprising a body 202(1), the drone body 202(1) configured with multiple component sections including: a fuselage 102a section configured with a built-in smartphone device 210 and flight hardware 102b-102d; a yoke apparatus 300 section comprising one or more robotic arms 303; a landing gear 606 arrangement comprising a steerable propulsion drive wheel 701-702; a prewired conduit 201a-201b array linking a motor control subsystem 1500A and component sections to battery banks 1811; and locking mechanisms configured for coupling to a mobile robot 101, exampled in FIG. 12, FIG. 14A, and FIG. 14B.
- a drone device 102(2) is also configured with the multiple component sections of drone device 102(1); however, the yoke apparatus 300 section comprises an interactive head module 400, wherein the head module 400 comprises a PC display 308 and a drone device camera 1514, and may comprise one or more compartments 900 containing an appliance 901, and a truck 606 providing propulsion.
- FIG. 1E illustrates a hybrid robot-drone 103(2) arrangement comprising a coupling process configured by an Autonomous Coupling System 1700, said Autonomous Coupling System 1700 being calibrated for integrating a drone device body 202 with a mobile robot body, thus creating the hybrid robot-drone array 103.
- the hybrid robot-drone body 103(B) contains an array of conduit 201a-201d linking the upper and lower body sections together.
- the coupling process is calculated by trajectory algorithms with self-adapting programming calibrated via ARCS 1600 for controlling the mobile robot 101 and the drone device 102 to couple and to separate.
- the hybrid robot-drone 103 comprises sensoring system components 1501a-1501h and processor components 1516a-1516d, and a Flight Control System 1400 configured for transporting the hybrid robot-drone in aerial flight, as exampled in FIG. 14; the drone device flight plan processes 1401-1420 allow the hybrid robot-drone 103 to fly and transport a payload, the payload accordingly being one or more target objects 105-106.
- the drone device 102 of the hybrid robot-drone 103 is configured to comprise large retractable rotors 102b calibrated for handling the weight of the payload 105.
- Respectively, the hybrid robot-drone 103, when actively engaged for takeoff, is controlled by said flight control subsystem 1400, illustrated in FIGS. 14A-14E, which example a drone device 102 transporting an active or deactivated mobile robot 101.
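The couple, takeoff, transport, and separate cycle described above can be sketched as a small state machine. The states and allowed transitions below are an illustrative reading of the Autonomous Coupling System (1700) and Flight Control System (1400), not the patent's actual control logic.

```python
# Hypothetical transition table: which states each state may move to.
ALLOWED = {
    "separated": {"aligning"},
    "aligning": {"latched", "separated"},   # latching sequence may abort
    "latched": {"airborne", "separated"},   # takeoff, or decouple again
    "airborne": {"landed"},
    "landed": {"separated", "airborne"},
}

class HybridRobotDrone:
    """Toy model of the hybrid robot-drone 103 coupling/flight cycle."""
    def __init__(self) -> None:
        self.state = "separated"

    def transition(self, target: str) -> bool:
        """Move to `target` if the transition table permits it."""
        if target in ALLOWED[self.state]:
            self.state = target
            return True
        return False

unit = HybridRobotDrone()
for step in ("aligning", "latched", "airborne", "landed", "separated"):
    assert unit.transition(step), step
print(unit.state)  # back to "separated" after a full transport cycle
```

Guarding transitions this way captures the disclosure's point that coupling and separation are deliberate, sequenced operations rather than free-form commands; the specific state names are assumptions.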
- FIG. 2 illustrates a game system 1900, wherein users 1900a play in a real-time game environment 1901-1927 comprising multiple players and a user interface computing system 1902, said user interface computing system being electronically coupled to a game processor 1903; a playing field 1904 linking the user 1900a with a wireless controller device 1905 to a cloud-based analytics platform 1906 to visualize a stereoscopic image 1907 in the playing field 1904; wherein the user interface computing system 1902 is linked with a wireless controller device 1905 configured as a PC laptop 1905a, a Bluetooth smartphone 1905b, a virtual reality headset device 1905c, or a handheld device 1905d, or a combination thereof, the wireless controller device 1905 accordingly being electronically coupled to the processor 1903; wherein the user interface computing system 1902 is compatible with Android, iOS, and Windows; wherein the plurality of actions 1908 executed by the electronic controller device from a real-time game environment includes interaction 1908a,
- game environments are categorized as rooms or game fields, which include land-based sport arenas, race and track fields, aerial arenas, and structures situated in and on land, ocean, and space; a match 1917 is controlled by a user interface electronic controller device configured for controlling gameplay elements on a game playing field; an identifier scanning system of the one or more mobile robot and drone players 1910-1912 comprises image recognition software programmed to receive and track the motion data of game playing field elements and target objects, and to store motion tracking data in memory linking to a cloud-based analytics platform 1906 to visualize a stereoscopic image 1907 in the playing field 1904; wherein the mobile robot and drone players 1910-1912 comprise a built-in smartphone device linking the game to a wireless communication network 1913, the built-in smartphone device of the mobile robot, the drone device, and the hybrid robot-drone receiving high-level programming command instructions linking to a motor control subsystem processor of the AI system 1500; a processor configured to automatically instruct a motor control subsystem processor to summon
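A minimal sketch of the real-time session described above: users pair with one of the disclosed controller types (1905a-1905d), robot and drone players (1910-1912) register on the field, and a match (1917) can begin. The class, method names, and the two-player threshold are hypothetical illustration, not claim language.

```python
# Controller types corresponding, illustratively, to 1905a-1905d.
SUPPORTED_CONTROLLERS = {"laptop", "smartphone", "vr_headset", "handheld"}

class GameSession:
    """Toy model of the real-time game system 1900's session setup."""
    def __init__(self) -> None:
        self.players = []   # (user, controller) pairs on the field 1904
        self.robots = []    # registered robot/drone players 1910-1912

    def join(self, user: str, controller: str) -> bool:
        """Admit a user only with a supported wireless controller device."""
        if controller not in SUPPORTED_CONTROLLERS:
            return False
        self.players.append((user, controller))
        return True

    def register_robot(self, robot_id: str) -> None:
        self.robots.append(robot_id)

    def can_start_match(self) -> bool:
        # Assumed rule: a match needs two users and one robot player.
        return len(self.players) >= 2 and len(self.robots) >= 1
```

The gate in `join` mirrors the disclosure's enumeration of compatible controller devices; everything else is scaffolding to make the flow concrete.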
- FIG. 3 illustrates the Virtual Reality Game System 2000, wherein users 2000a play in a computer-generated game environment 2001-2020 comprising: a user interface computing system 2002 electronically coupled to a processor 2003; a virtual game field 2004 linking the user 2000a with a wireless controller device 2005 to a cloud-based analytics platform 2006 to visualize a stereoscopic image 2007 in the virtual game field 2004 comprising virtual players (avatars); wherein the user interface computing system 2002 is linked with a wireless controller device configured as a PC laptop 2005a, a Bluetooth smartphone 2005b, a virtual reality headset device 2005c, or a handheld device 2005d, or a combination thereof, the wireless controller device 2005 accordingly being electronically coupled to the processor 2003; wherein the user interface computing system 2002 is compatible with Android, iOS, and Windows; wherein the plurality of actions 2008 executed by the electronic controller device from a virtual game environment includes interaction 2008a, navigation 2008b, and a game architecture module 2009 to control a virtual drone device avatar 2010
- FIG. 3 illustrates the augmented virtual game system 2100 , wherein users 2100 a play in an augmented virtual game environment 2101 - 2132 comprising: a user interface computing system 2102 , and a user interface electronic controller device 2103 , wherein the user interface computing system 2102 linking the user interface electronic controller device 2103 to a cloud-based analytics platform 2104 ; and the user interface computing system 2102 for automatically analyzing received input data 2105 ; the input data 2105 from; video input 2106 , audio input 2107 and image input 2108 , configured for generating visual data of target object elements 2109 in an augmented virtual game environment 2101 ; the target objects include; a drone device avatar 2110 , a mobile robot avatar 2111 , a hybrid robot-drone avatar 2112 , and a space robot avatar 101 ( 4 ); a storage reader 2113 for reading a game application from a storage medium 2114 ; a memory 2115 storing a communications client application 2116 ; a network interface 2117 for receiving data 2118 from
Abstract
A robot and drone game comprising an electronic game system configured with software programming for a live reality match including game players, which include one or more of: a semiautonomous or autonomous mobile robot player; a game match configured with an array of drone device avatars, mobile robot avatars, and robot-drone avatars and target objects competing in said match via multiple players on a packet-based communication network; a virtual reality game system with a game processor of said AI system, wherein a user of a virtual reality device is electronically coupled to said game processor to visualize a stereoscopic image of said virtual game environment configured with said virtual environment game field; and an augmented game system (AGSP) comprising game application software programming for an augmented virtual game environment configured with multiple users playing a match, wherein said user interface computing system links the user interface electronic controller device to a cloud-based analytics platform.
Description
- This application is a continuation-in-part of patent application Ser. No. 15/993,609, filed May 31, 2018, titled “Robot and Drone Array”.
- The present disclosure relates to highly intelligent mobile robots, drone devices, and a hybrid robot-drone array utilized as laborers for multiple professions, and relates to an adaptive robot control system and AI game systems for controlling the handling of task objectives in work environments and in multi-game environments; the claimed inventions may combine and thus relate to one or more of these technologies.
- As the demand to employ highly intelligent robots and drones increases, present-day robots and drones have limited mobility to handle most job requirements. For example, today's mobile robots can vacuum, mow lawns, and repetitively work with one skill in factories and farms, while for gameplay, robot competitors superfluously try to destroy one another and drones merely fly about in racing matches. Advantageously, what is needed are peer robots and drones providing improved tactile mobility and integrated handling skills to complete complex tasks when employed for targeting objects to clean, cook, gather, produce, and maintain, or to entertain users by playing physically automatic and virtually innovative mobile robot and drone game applications.
- Ideally, what is essential for the advancement of autonomous mobile robot and drone technology is to provide highly intelligent and adaptive mobile robots and drone devices capable of collaborating as integrated hybrid robot-drones to handle a plurality of task objectives in environments relating to multiple professions and mixed-reality game applications; a user or multiple users utilizing a video game console providing a central processing unit to calculate various aspects of the game and control how the game responds to user input, processing the game's instructions and handling game logic in the form of movement or interaction with objects of an electronic game system based on a real-time game platform; a Real-Time Game System comprising mobile robot players, drone device players, and hybrid robot-drone players; and, more so, a Virtual Reality Game System and an Augmented Virtual Game System played with mobile robot avatars, drone device avatars, hybrid robot-drone avatars, and game accessories.
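For illustration only, the console behavior described above — a central processing unit reading user input, processing the game's instructions, and handling game logic as movement or interaction with objects — can be sketched as a minimal game-step function. All names here (`GameObject`, `step`, the command vocabulary) are hypothetical and are not part of the disclosed invention.

```python
from dataclasses import dataclass

@dataclass
class GameObject:
    """A toy stand-in for a robot, drone, or hybrid robot-drone player."""
    name: str
    x: float = 0.0
    y: float = 0.0

def step(obj: GameObject, command: str) -> str:
    """Apply one user command: movement commands update position,
    any other command is handled as an interaction with objects."""
    moves = {"up": (0, 1), "down": (0, -1), "left": (-1, 0), "right": (1, 0)}
    if command in moves:
        dx, dy = moves[command]
        obj.x += dx
        obj.y += dy
        return f"{obj.name} moved to ({obj.x}, {obj.y})"
    return f"{obj.name} interacts: {command}"
```

A real-time game loop would repeat this per frame for every player on the field; the sketch shows only the input-to-logic dispatch the passage describes.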
-
FIG. 1A schematically illustrates a robot and drone array 100 employed in a work environment in accordance with the present disclosure. -
FIG. 1B schematically illustrates a first mobile robot 101 for employment applications in accordance with the present disclosure. -
FIG. 1C schematically illustrates a first drone device 102(1) for employment applications in accordance with the present disclosure. -
FIG. 1D illustrates a first hybrid robot-drone 103(1) for employment applications in accordance with the present disclosure. -
FIG. 1E schematically illustrates a second hybrid robot-drone 103(2) for employment in a work environment in accordance with the present disclosure. -
FIG. 2 schematically illustrates a block diagram of a real-time game system 1900 of the robot and drone array 100 in accordance with an embodiment of the disclosure. -
FIG. 3 schematically illustrates a block diagram of a virtual reality game system 2000 of the robot and drone array 100 in accordance with an embodiment of the disclosure. -
FIG. 4 illustrates a block diagram of an augmented virtual game system 2100 for an e-sports game platform in accordance with an embodiment of the disclosure. - Hereinafter, the present invention discloses a robot and
drone array 100 comprising diverse mobile robots 101, drone devices 102, and hybrid robot-drones 103 characterized as having a level of artificial intelligence to accomplish the handling objectives 1000 pertaining to complex tasks 104 and to interactively work for users 105. Fundamentally, controlled elements of the robot and drone array 100 include an AI system 1500 schematically arranged for achieving the handling of target objects 106 for a variety of professions including domestic, medical, rescue, environment restoration, delivery, food service production, policing, military, industry, gameplay, and other tasks. - In greater detail
FIG. 1A illustrates the robot and drone array 100 comprising different mobile robot types, for instance: a first mobile robot 101(1) depicted in FIG. 1B; and a second mobile robot 101(2) illustrated in FIG. 2A; as well as different drone types: a drone device 102(1) depicted in FIG. 1C; a drone device 102(2) illustrated in FIG. 2C; and a hybrid robot-drone 103 illustrated in FIG. 1D. Respectively, the robot and drone array's AI system 1500 data processor 1505 and processors 1516a-1516d determine spatial locations of the one or more mobile robots 101 in an environment 107 and control positions and trajectories of objects in the environment 107; the objects are characterized as a “person,” or user 105, a “thing,” or “payload,” referenced as a target object 106, and a “place,” referenced as a work environment 107a or a game environment 107b. Primarily, environments and target objects involve robot types 101 and drone types 102 physically obtaining items like a box 106a, a tool 106b, a container 800, and an appliance 901-910 contained within a compartment 900. Respectively, each task objective 104 is based on criteria associated with tasks allocated for the mobile robots 101(1)-101(2), an aquatic robot 101(3), and a space robot 101(4); criteria associated with tasks allocated for drone devices 102(1)-102(2) and an aquatic drone 102(3); and criteria associated with tasks allocated for the hybrid robot-drone 103. In some contexts, the robot and drone array 100 may receive user input 105a from a communication network 1517 via an AI system 1500 systematically configured with an Adaptive Robot Control System (ARCS) 1600, see FIG. 16, an Autonomous Coupling System 1700, see FIG. 17, an Autonomous Charging System 1800, see FIG. 18, and other subsystems disclosed herein. - In greater detail
FIG. 1B illustrates the first mobile robot 101(1), wherein the mobile robot comprises a body 201, the body configured with multiple sections of varied geometric shapes and dimensions; as depicted, a compartment 900 is configured with an upper yoke apparatus 300 comprising a landing platform 202(LP), said landing platform 202 comprising one or more locking mechanisms configured for coupling via a latching sequence exampled in FIG. 17C, the landing platform providing a latching means 1700 for battery charging and for carrying a payload affixed thereon. The yoke apparatus further comprises one or more robotic arms having grippers or other tools; a section for a compartment 900, wherein the compartment contains batteries, an appliance 901-910, and wired conduit; and a bottom section comprising a torso and a section comprising a truck 604 or truck 605 comprising a set of opposing drive wheels 700, said drive wheel components including a hub wheel arrangement 701 or a track wheel arrangement 702 providing drive propulsion. - In greater detail
FIG. 1C illustrates the first drone device 102(1) comprising a body 202(1), the drone body 202(1) configured with multiple component sections including: a fuselage 102a section configured with a built-in smartphone device 210 and flight hardware 102b-102d; a yoke apparatus 300 section comprising one or more robotic arms 303; a landing gear 606 arrangement comprising a steerable propulsion drive wheel 701-702; a prewired conduit 201a-201b array linking a motor control subsystem 1500A and component sections to battery banks 1811; and locking mechanisms configured for coupling to a mobile robot 101, exampled in FIG. 12, FIG. 14A, and FIG. 14B. - As shown in
FIG. 1D, a drone device 102(2) is also configured with the multiple component sections of drone device 102(1); however, the yoke apparatus 300 section comprises an interactive head module 400, wherein the head module 400 comprises a PC display 308 and a drone device camera 1514, and may comprise one or more compartments 900 containing an appliance 901, and a truck 606 providing propulsion. - In greater detail
FIG. 1E illustrates a hybrid robot-drone 103(2) arrangement comprising a coupling process configured by an Autonomous Coupling System 1700, said Autonomous Coupling System 1700 calibrated for integrating a drone device body 202 to a mobile robot body, thus creating the hybrid robot-drone array 103. The hybrid robot-drone body 103(B) contains an array of conduit 201a-201d linking the upper and lower body sections together. The coupling process is calculated by trajectory algorithms calibrated with self-adapting programming via ARCS 1600 for controlling the mobile robot 101 and the drone device 102 to couple and to separate. In various aspects, the hybrid robot-drone 103 comprises sensoring system components 1501a-1501h and processor components 1516a-1516d, and a Flight Control System 1400 configured for transporting the hybrid robot-drone in aerial flight; as exampled in FIG. 14, the drone device flight plan processes 1401-1420 allow the hybrid robot-drone 103 to fly and transport a payload, the payload accordingly being one or more target objects 105-106. In various aspects, the drone device 102 of the hybrid robot-drone 103 is configured to comprise large retractable rotors 102b calibrated for handling the weight of the payload 105. Respectively, the hybrid robot-drone 103, when actively engaged for takeoff, is controlled by said flight control subsystem 1400, illustrated in FIG. 14A-14E exampling a drone device 102 transporting an active or deactivated mobile robot 101. - In greater detail
FIG. 2 illustrates a game system 1900, wherein users 1900a play in a real-time game environment 1901-1927 comprising multiple players and a user interface computing system 1902, said user interface computing system being electronically coupled to a game processor 1903; a playing field 1904 linking the user 1900a with a wireless controller device 1905 to a cloud-based analytics platform 1906 to visualize a stereoscopic image 1907 in the playing field 1904; wherein the user interface computing system 1902 is linked with a wireless controller device 1905 which is configured as a PC laptop 1905a, a Bluetooth smartphone 1905b, a virtual reality headset device 1905c, or a handheld device 1905d, or a combination thereof; accordingly, the wireless controller device 1905 is configured to be electronically coupled to the processor 1903, wherein the user interface computing system 1902 is compatible with Android, iOS, and Windows; wherein the plurality of actions 1908 executed by the electronic controller device from a real-time game environment includes interaction 1908a, navigation 1908b, and a game architecture module 1909 to control a drone device 1910 “player,” a mobile robot 1911 “player,” and a hybrid robot-drone 1912 “player” in a real-time game environment 1901 (e.g., the mobile robot, the drone device, and the hybrid robot-drone are configured with an AI system input device: a built-in smartphone, an identifier scanning device, and processors for receiving high-level programming command instructions to engage strategy in a real-time game system playing field environment); wherein the game architecture module 1909 executes a plurality of actions using the drone device player 1910, the mobile robot player 1911, and the hybrid robot-drone player 1912 over a wireless communication network 1913 configured within the real-time game environment 1901; wherein the plurality of actions played by the drone device player 1910, the mobile robot player 1911, and the hybrid robot-drone player 1912 are linked to an audiovisual display or headset 1915 comprising an audio speaker 1916 based on a status of the real-time game environment 1901; the wireless communication network 1913 linking with one or more drone device players 1910, mobile robot players 1911, and hybrid robot-drone players 1912; the playing field 1904, e.g., game environments categorized as rooms or game fields, which include land-based sport arenas, race and track fields, aerial arenas, and structures situated in and on land, ocean, and space; a match 1917 controlled by a user interface electronic controller device configured for controlling gameplay elements on a game playing field; an identifier scanning system of the one or more mobile robot and drone players 1910-1912 comprising image recognition software programmed to receive data from and to track the motion data of game playing field elements and target objects, and to store motion tracking data in memory linking to a cloud-based analytics platform 1906 to visualize a stereoscopic image 1907 in the playing field 1904; wherein the mobile robot and drone players 1910-1912 comprise a built-in smartphone device linking the game to a wireless communication network 1913, wherein the built-in smartphone device of the mobile robot, the drone device, and the hybrid robot-drone receives high-level programming command instructions linking to a motor control subsystem processor of the AI system 1500; a processor configured to automatically instruct a motor control subsystem processor to summon the mobile robot, to summon the drone device, and to summon the hybrid robot-drone to play a match 1917, the match 1917 determined by a user via the user interface computing system 1902; a processor configured to automatically instruct a motor control subsystem 1918 processor to navigate the one or more drone device players 1910, mobile robot players 1911, and hybrid robot-drone players 1912 by a user 1900a via the user interface computing system 1902; a processor 1903 configured for receiving high-level command instructions 1919 from a wireless controller device 1905 relaying executing software routines 1920 and strategy instructions 1921 related to the high-level command instruction 1919; a high-level command instruction 1919 to autonomously perform actions 1922 in a programmable manner in the playing field 1904 according to the respective strategy instructions 1921 directed by a user 1900a; a high-level command instruction 1919 to autonomously perform actions in a programmable manner in the playing field according to the respective software routines 1920; a user directing the execution of a sequence of autonomous actions 1922 and achieving performance actions 1922 with or without further command instructions from the user 1900a, allowing the wireless controller device 1905 to simultaneously control one or more target objects 1923 and game accessories 1924 on a playing field 1904, and other robot and drone players and game-related accessories 1924 in the autonomous mobile robot's vicinity, in the autonomous drone device's vicinity, and in the playing field 1904; at least one remote-controlled drone device player 1910, mobile robot player 1911, and hybrid robot-drone player 1912 to receive high-level command instructions to directly control drive wheels and joint actuators in such a way that the wireless controller device 1905 actively steers the drone device player 1910, the mobile robot player 1911, and the hybrid robot-drone player 1912, and actively controls the game-related accessories 1924 and target objects 1923 in the playing field 1904; one or more high-level command instructions to allow the remote-controlled mobile robot to avoid obstacles, to seek out and engage other opposing drone device players 1910, mobile robot players 1911, and hybrid robot-drone players 1912, to interact with one or more game-related accessories in the playing field, and to complete task objectives within the context of the game environment while under the direct control of the wireless controller device 1905 via the game architecture module 1909; a memory transmitting game information of the state 1925 of the drone device player 1910, mobile robot player 1911, and hybrid robot-drone player 1912 to a wireless controller device 1905, which information is displayed on the display of the wireless controller device 1905; wherein the transmitted information includes status information 1926 on a game state information 1925, thereby allowing an operator of a remote controller unit to monitor the status information 1927 of a drone device player 1910, the status information 1928 of the mobile robot player 1911, and the status information 1929 of the hybrid robot-drone player 1912 with respect to their physical status and game parameters 1927, and the status information 1929 of one or more target objects 1923, the status information 1930 of the game accessories, and the status information 1931 of the user 1900a; and the AI system 1500 to generate motion tracking data 1932, via a receiver 1933 and a server 1934 with game logic programmed to control aspects of the game based on said motion tracking data 1935 via a receiver 1933, the receiver receiving by the game application 1936, the game application comprising a processor 1903, a memory 1937, and a storage device 1939 for receiving a plurality of data throughput ratings 1940 from a plurality of user interface computing systems 1902, each associated with a user interface computing system 1902 of the plurality, and based, at least in part, on throughput analysis data 1941 associated with the storage device 1939 of the corresponding user interface computing system 1902; and the processor 1903 with the user status information 1931 determining the presence of the one or more users playing a match 1942. - In greater detail
FIG. 3 illustrates the Virtual Reality Game System 2000, wherein users 2000a play in a computer-generated game environment 2001-2020 comprising: a user interface computing system 2002 electronically coupled to a processor 2003; a virtual game field 2004 linking the user 2000a with a wireless controller device 2005 to a cloud-based analytics platform 2006 to visualize a stereoscopic image 2007 in the virtual game field 2004 comprising virtual players, or avatars; wherein the user interface computing system 2002 is linked with a wireless controller device which is configured as a PC laptop 2005a, a Bluetooth smartphone 2005b, a virtual reality headset device 2005c, or a handheld device 2005d, or a combination thereof; accordingly, the wireless controller device 2005 is configured to be electronically coupled to the processor 2003, wherein the user interface computing system 2002 is compatible with Android, iOS, and Windows; wherein the plurality of actions 2008 executed by the electronic controller device from a virtual game environment includes interaction 2008a, navigation 2008b, and a game architecture module 2009 to control a virtual drone device avatar 2010, a virtual mobile robot avatar 2011, and a virtual hybrid robot-drone avatar 2012 in a virtual game environment 2001; the game architecture module 2009 to execute a plurality of actions using the drone device avatar 2010, the mobile robot avatar 2011, and the hybrid robot-drone avatar 2012 from the virtual game environment 2001, wherein the plurality of actions 2011 comprise audiovisual content to a display headset 2015 and audio speaker 2016 based on a status of the virtual game environment 2001; wherein the game architecture module 2009 comprises methods detecting an event that occurs within broadcast data content 2012 while a plurality of users 2000a are viewing the broadcast data content and, when the event is detected, automatically capturing user feedback data 2013 with a user computer interaction capturing device 2014 from each user 2000a; wherein the game architecture module 2009 executes a plurality of actions by the drone device avatar 2010, the mobile robot avatar 2011, and the hybrid robot-drone avatar 2012 in a virtual reality environment, and issues the audio-visual content to the display headset 2015 and audio speaker 2016 based on a status of the virtual game environment 2001 and based on a magnitude and direction of the virtual force 2017; wherein the one or more processors 2003 are configured to project the drone device avatar 2010, the mobile robot avatar 2011, and the hybrid robot-drone avatar 2012 in the virtual environment game field 2001a; wherein the one or more processors are configured to receive a user-originated command 2018, modify a status of an avatar of a drone device based on the user-originated command and a status of the virtual game environment 2001, and control the drone device avatar based on the modified status of the drone device avatar; wherein the one or more processors 2003 are configured to receive a user-originated command, modify a status of an avatar of a mobile robot based on the user-originated command and a status of the virtual game environment 2001, and control the mobile robot avatar based on the modified status of the mobile robot avatar; wherein the one or more processors 2003 are configured to receive a user-originated command via the user interface electronic controller device, to modify a status of an avatar, for example of a hybrid robot-drone avatar 2012, based on the user-originated command and a status of the virtual game environment 2001, and control the hybrid robot-drone avatar 2012 based on the modified status of the hybrid robot-drone avatar 2012; wherein the one or more processors are configured to, upon determining that a virtual force 2017 has been applied onto the avatars, compare the virtual force to a threshold force 2017a and, based on the virtual force 2017 exceeding the threshold force 2017a, control the virtual motor 2018-2019 based on a magnitude and direction of the virtual force 2017; a virtual motor 2018 of the drone device avatar 2010, a virtual motor 2019 of the mobile robot avatar 2011, and multiple virtual motors 2018-2019 of the hybrid robot-drone avatar 2012 or a space robot avatar 101(4), controlled based on a magnitude and direction of the virtual force 2017; and the AI system 1500 to generate motion tracking data 2125; a game application 2126 comprising a game application server 2020 with game logic programmed to control aspects of the game based on said motion tracking data 2124; a receiver 2128, the receiver receiving by the game application server 2020 comprising a processor 2003, a memory 2021, and a storage device 2022 for receiving a plurality of data throughput ratings 2023 from a plurality of user interface computing systems 2002, each associated with a user interface computing system 2002 of the plurality, and based, at least in part, on throughput analysis data 2024 associated with the storage device 2022 of the corresponding user interface computing system 2002; and the processor 2003 selecting and initiating a game match 2025. - In greater detail
FIG. 4 illustrates the augmented virtual game system 2100, wherein users 2100a play in an augmented virtual game environment 2101-2132 comprising: a user interface computing system 2102 and a user interface electronic controller device 2103, wherein the user interface computing system 2102 links the user interface electronic controller device 2103 to a cloud-based analytics platform 2104; the user interface computing system 2102 for automatically analyzing received input data 2105, the input data 2105 from video input 2106, audio input 2107, and image input 2108, configured for generating visual data of target object elements 2109 in an augmented virtual game environment 2101; the target objects include a drone device avatar 2110, a mobile robot avatar 2111, a hybrid robot-drone avatar 2112, and a space robot avatar 101(4); a storage reader 2113 for reading a game application from a storage medium 2114; a memory 2115 storing a communications client application 2116; a network interface 2117 for receiving data 2118 from a remote user 2100b via a packet-based communication network 2119; and processing apparatus 2120 coupled to memory 2115 in a network interface 2117, wherein a communication client 2121 is programmed to establish bidirectional video communications 2122 via said network interface 2117 and packet-based communication network 2119, including receiving video data from one or more users 2100a and remote users 2100b via a plurality of user interface computing systems 2123; and image recognition software 2124 programmed to receive video data 2106a from a client application 2116, to recognize a predetermined image element in the received video data 2106a, and to track the motion of target object elements 2109 to generate motion tracking data 2125; a game application 2126 comprising a game application server 2127 with game logic programmed to control aspects of the game based on said motion tracking data 2124; a receiver 2128, the receiver receiving by the game application server 2127 with a storage device 2129 receiving a plurality of data throughput ratings 2130 from a plurality of user interface computing systems 2123, each associated with a user interface computing system 2123 of the plurality, and based, at least in part, on throughput analysis data 2131 associated with the storage device 2129 of the corresponding user interface computing system 2123; and the receiver 2128 selecting and initiating a game match 2132 selected by the game application server 2127 and by the interface computing system 2123, which is configured to display an augmented virtual reality game system 2100 configured with an array of drone device avatars 2110, mobile robot avatars 2111, and hybrid robot-drone avatars 2112 and other target objects to compete in a game match via multiple players on a packet-based communication network 2119. - Accordingly, it is to be understood that the above description is intended to be illustrative and not restrictive. Many embodiments and applications other than the examples provided would be apparent upon reading the above description. Although the disclosure has been set forth in its preferred form(s), the scope should not be determined in a limiting sense from the specific embodiments disclosed and illustrated, because numerous variations are possible.
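For illustration only, the image-recognition and motion-tracking step described for the systems above — per-frame positions of a recognized target object element reduced to motion tracking data that game logic then acts on — can be sketched as follows. The function names and the toy direction rule are hypothetical, not part of the disclosed invention.

```python
def motion_tracking_data(positions):
    """Convert per-frame (x, y) positions of a recognized target element
    into frame-to-frame displacements (the motion tracking data)."""
    return [(x1 - x0, y1 - y0)
            for (x0, y0), (x1, y1) in zip(positions, positions[1:])]

def game_logic(track):
    """Toy game-logic rule: report the dominant horizontal direction
    of the tracked avatar from its accumulated displacements."""
    net_dx = sum(dx for dx, _ in track)
    return "right" if net_dx > 0 else "left" if net_dx < 0 else "stationary"
```

In the systems described, the positions would come from the image recognition software operating on received video data, and the resulting tracking data would be stored in memory and forwarded to the game application server.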
Claims (20)
1. A robot and drone game comprising:
a user or multiple users utilizing a video game console providing a central processing unit to calculate various aspects of the game and control how the game responds to user input, processing the game's instructions and handling game logic in the form of movement or interaction with objects of an electronic game system based on a real-time game platform;
software programming for a live reality match including one game player or multiple game players, and a user interface system comprising a wireless controller device linking to game players which include one or more of:
a semiautonomous and/or autonomous mobile robot player;
a semiautonomous and/or autonomous drone device player and/or a hybrid robot-drone player, or a combination thereof;
a processor and a memory configured for receiving and transmitting status information of said game players to a wireless controller device in which player information is displayed and managed; wherein said transmitted player information includes customized physical attributes and status information on a state of each game player, thereby allowing an operator of said wireless controller device to monitor said game players with respect to their physical status, to determine the presence of said game players within parameters on a game field, to determine a game match category, and to communicate the determined data of the match to the AI game system, said user interface system, and
memory and Cloud network.
2. The robot and drone game of claim 1, further comprising:
an environment having one or more users and a user interface, wherein said user interface computing system is electronically coupled to said processor, having a playing field linking the user with said wireless controller device to a cloud-based analytics platform to visualize a stereoscopic image in said playing field, wherein said user interface includes a computing system linked with a wireless controller device which is configured as:
a PC laptop and a Bluetooth smartphone, a virtual reality headset device, or a handheld device, or a combination thereof, said wireless controller device is further configured to be electronically coupled to said processor, wherein said user wireless controller device communicates with said user interface system.
3. The robot and drone game of claim 1, further comprising:
a plurality of actions executed by said wireless controller device from a real-time game environment includes interaction, navigation, and a game architecture module to control a drone device player, a mobile robot player, and a hybrid robot-drone player in a real-time game environment, said mobile robot player, said drone device player, and said hybrid robot-drone player are configured with an AI system input device;
a built-in smartphone and an identifier scanning device, and processors for receiving high level programming command instructions to engage strategy in said real-time game system playing field environment.
4. The robot and drone game of claim 1, wherein the game architecture module executes a plurality of actions using said drone device player, said mobile robot player, and said hybrid robot-drone player via a wireless communication network configured within said real-time game environment.
5. The robot and drone game of claim 1, wherein the plurality of actions played by said drone device player, said mobile robot player, and said hybrid robot-drone player are linked to an audio-visual display or headset comprising an audio speaker based on a status of said real-time game environment; said wireless communication network linking with one or more of said drone device players, said mobile robot players, and said hybrid robot-drone players;
said playing field is categorized as: rooms, game fields, land-based sport arenas, race and track fields, aerial arenas, and structures situated in and on land, ocean, and space;
a match controlled by said user interface electronic controller device, which is configured for controlling gameplay elements on said game playing field;
an identifier scanning system of the one or more said mobile robot and drone players comprising image recognition software programmed to receive data from and to track the motion data of said game playing field elements and target objects, and to store motion tracking data in a memory linking to a cloud-based analytics platform to visualize a stereoscopic image of said playing field;
said mobile robot and drone players comprising a built-in smartphone device linking said game to a wireless communication network, wherein said built-in smartphone device of said mobile robot, said drone device, and said hybrid robot-drone device receives high level programming command instructions linking to a motor control subsystem processor of said AI system.
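The motion-tracking pipeline of claim 5 (identifier scanning, image recognition, and cloud-linked storage of motion data) can be illustrated with a minimal sketch. All class and method names here are hypothetical; the claim does not specify an implementation:

```python
from dataclasses import dataclass, field

@dataclass
class TrackedObject:
    """One playing-field element or target object seen by the scanner."""
    object_id: str                                   # identifier read by the scanning system
    positions: list = field(default_factory=list)    # (time, x, y) motion samples

class MotionTracker:
    """Accumulates motion-tracking data for playing-field elements."""
    def __init__(self):
        self.objects = {}

    def observe(self, object_id, t, x, y):
        # Record one recognized position sample for an object
        obj = self.objects.setdefault(object_id, TrackedObject(object_id))
        obj.positions.append((t, x, y))

    def export(self):
        # Snapshot that would be stored in memory and pushed to the
        # cloud-based analytics platform for stereoscopic visualization
        return {oid: obj.positions for oid, obj in self.objects.items()}

tracker = MotionTracker()
tracker.observe("drone-1", 0.0, 1.0, 2.0)
tracker.observe("drone-1", 0.1, 1.2, 2.1)
```

The actual system would feed `observe` from image-recognition output rather than hand-coded samples.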
6. The robot and drone game of claim 1, further comprising:
a processor configured to automatically instruct said motor control subsystem processor to summon said mobile robot device and to summon said drone device and to summon said hybrid robot-drone to play a match, the match outcome determined by said user via said user interface computing system;
yet another processor configured to automatically instruct a motor control subsystem processor to navigate one or more of said drone device players, mobile robot players, and hybrid robot-drone players by a user via said user interface computing system;
still another processor configured for receiving high level command instructions from a wireless controller device, relaying and executing software routines and strategy instructions related to said high-level command instruction;
said high-level command instruction executing autonomously to perform actions in a programmable manner on said playing field according to said respective strategy instructions directed by a user;
said high-level command instruction further executing autonomously to perform actions in a programmable manner on said playing field according to said software routines;
said user directing to execute a sequence of autonomous actions and achieve performance actions with or without further command instructions from the user, allowing said wireless controller device to simultaneously control one or more target objects and game accessories on said playing field, using other said robot and drone players and game related accessories in the vicinity of said autonomous mobile robot devices, said autonomous drone device, and said autonomous hybrid robot-drone on said playing field.
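Claim 6's scheme, in which one high-level command expands into a sequence of autonomous actions executed without further user input, can be sketched as a routine table. The command names and action primitives below are illustrative assumptions, not terms from the claims:

```python
# Hypothetical routine table: each high-level command expands into a
# sequence of autonomous actions the motor control subsystem performs.
ROUTINES = {
    "patrol": ["navigate_waypoint", "scan_field", "navigate_waypoint"],
    "fetch_target": ["locate_target", "approach", "grip", "return_base"],
}

def execute(command):
    """Expand a high-level command and run each action autonomously.

    In a real system each action would be dispatched to the motor
    control subsystem processor; here we just record the sequence.
    """
    performed = []
    for action in ROUTINES.get(command, []):
        performed.append(action)   # stand-in for a motor-subsystem call
    return performed
```

A single `execute("fetch_target")` call thus stands in for the user issuing one instruction and the player completing the whole routine on its own.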
7. The robot and drone game of claim 1, further comprising remote controlled drone device players, mobile robot players, and hybrid robot-drone players receiving high level command instruction to directly control drive wheels and joint actuators in such a way that said wireless controller device actively steers said drone device player, said mobile robot player, and said hybrid robot-drone player, and controls said game related accessories and said target objects on said playing field.
8. The robot and drone game of claim 1, further comprising: one or more high level command instructions to allow said remote controlled mobile robot to avoid obstacles, to seek out and engage other opposing drone device players, mobile robot players, and hybrid robot-drone players, to interact with one or more of said game related accessories in the playing field, and to complete said task objectives within the context of said game environment while under the direct control of said wireless controller device via said game architecture module.
9. The robot and drone game of claim 1, further comprising: a memory transmitting game information on the state of said drone device player, mobile robot player, and hybrid robot-drone player to said wireless controller device, displayed on said display of said wireless controller device; wherein said transmitted information includes status information on a game state, thereby allowing said user of a remote controller unit to monitor said status information of said drone device player.
10. The robot and drone game of claim 1, further comprising: said mobile robot player and said hybrid robot-drone player with respect to their physical status and game parameters; and said status information of said one or more target objects and said status information of said game accessories, and the status information of said user, said status information for determining the presence of the one or more users playing said game.
11. A robot and drone game comprising:
a video game console providing a central processing unit to calculate various aspects of the game and control how the game responds to user input, processing the game's instructions and handling game logic in the form of movement or interaction with objects, in an electronic game system based on a virtual reality game system;
a game processor of said AI system; said game processor comprising game application software programming for said virtual reality game;
a user of a virtual reality device to be electronically coupled to said game processor to visualize a stereoscopic image of said virtual game environment configured with said virtual environment game field;
a user interface electronic controller device including: a PC laptop, a Bluetooth smartphone, a virtual reality headset device, or a handheld device configured to be electronically coupled to said game processor, wherein said electronic controller device is a wireless smart device.
12. The robot and drone game of claim 11, wherein the plurality of actions executed by said electronic interface controller device in said virtual environment includes interaction, navigation, and a game architecture module to control a virtual drone device, a virtual mobile robot avatar, and a virtual hybrid robot-drone avatar in a virtual game environment;
wherein the game architecture module permits executing the plurality of actions using said drone device avatar, said mobile robot avatar, and said hybrid robot-drone avatar from said virtual game environment, wherein said plurality of actions comprises audio-visual content directed to a display headset and audio speaker based on a status of said virtual game environment.
13. The robot and drone game of claim 11, further comprising:
a game architecture module comprising methods for detecting an event that occurs within broadcast data content while said plurality of users view the broadcast data content, and, when the event is detected, automatically capturing user feedback data via a user-computer interaction capturing device from each said user;
wherein said game architecture module executes a plurality of actions by said drone device avatar, said mobile robot avatar, and said hybrid robot-drone avatar in said virtual game environment, and is capable of issuing the audio-visual content to said display headset and audio speaker based on a status of said virtual game environment and based on a magnitude and direction of said virtual force.
14. The robot and drone game of claim 11, further comprising one or more avatar processors configured to:
project said drone device avatar, said mobile robot avatar, and said hybrid robot-drone avatar in the virtual game environment game field;
receive a user-originated command, modify a status of said avatar of a drone device based on the user-originated command and a status of the virtual game environment, and to control said drone device avatar based on the modified status of said drone device avatar;
receive said user-originated command, modify a status of said avatar of a mobile robot based on the user-originated command and a status of said virtual game environment, and control said mobile robot avatar based on the modified status of said mobile robot avatar;
receive a user-originated command via said user interface electronic controller device, to modify a status of said avatar of a hybrid robot-drone based on the user-originated command and a status of said virtual game environment, and to control said hybrid robot-drone avatar based on the modified status of said hybrid robot-drone avatar.
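Claim 14's receive-modify-control step, in which a user-originated command updates an avatar's status subject to the status of the virtual game environment, can be sketched as a pure function over status dictionaries. The specific commands and status keys (`boost`, `speed`, `airborne`) are illustrative assumptions:

```python
def modify_status(avatar_status, command, environment):
    """Return a new avatar status after applying a user-originated
    command, gated by the virtual game environment's status (sketch)."""
    status = dict(avatar_status)            # do not mutate the input
    if command == "boost" and environment.get("boost_allowed", True):
        status["speed"] = status.get("speed", 1.0) * 2.0
    elif command == "land":
        status["airborne"] = False
    # Unknown commands leave the status unchanged; the controlling
    # processor would then drive the avatar from the modified status.
    return status

updated = modify_status({"speed": 1.0, "airborne": True}, "boost",
                        {"boost_allowed": True})
```

The same function shape applies to the drone device avatar, the mobile robot avatar, and the hybrid robot-drone avatar, differing only in which status keys each supports.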
15. The robot and drone game of claim 11, wherein the one or more processors are configured to, upon determining that a virtual force has been applied onto said avatars, compare said virtual force to a threshold force and, based on said virtual force exceeding said threshold force, control said virtual motor based on a magnitude and direction of said virtual force.
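The threshold test of claim 15 reduces to a simple computation: find the magnitude and direction of the applied virtual force, and emit a motor command only when the magnitude exceeds the threshold. A minimal 2-D sketch (the claims do not fix a dimensionality or units):

```python
import math

def apply_virtual_force(fx, fy, threshold):
    """Compare an applied virtual force against a threshold force.

    Returns None when the force does not exceed the threshold;
    otherwise returns the magnitude and direction with which the
    virtual motor would be driven (illustrative sketch).
    """
    magnitude = math.hypot(fx, fy)           # |F| = sqrt(fx^2 + fy^2)
    if magnitude <= threshold:
        return None                          # below threshold: no motor command
    direction = math.atan2(fy, fx)           # heading in radians
    return {"magnitude": magnitude, "direction": direction}

# A (3, 4) force has magnitude 5.0, which exceeds a threshold of 2.0
command = apply_virtual_force(3.0, 4.0, 2.0)
```

Claim 16 extends the same comparison to multiple virtual motors on the hybrid robot-drone avatar; each motor would receive a command derived from the same magnitude and direction.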
16. The robot and drone game of claim 11, further comprising a virtual motor of said drone device avatar, said virtual motor of the mobile robot avatar, and multiple virtual motors of said hybrid robot-drone avatar, and controlling said avatars based on a magnitude and direction of said virtual force in said virtual game environment.
17. A robot and drone game comprising:
a video game console providing a central processing unit to calculate various aspects of the game and control how the game responds to user input, processing the game's instructions and handling game logic in the form of movement or interaction with objects, in an electronic game system based on an augmented game system;
an augmented game system processor (AGSP) utilizing a processor of an AI system, the AGSP comprising game application software programming for an augmented virtual game environment configured with multiple users playing a match;
a user interface computing system, and a user interface electronic controller device;
wherein said user interface computing system links the user interface electronic controller device to a cloud-based analytics platform.
18. The robot and drone game of claim 17, further comprising:
a user interface computing system for automatically analyzing received input data such that the input data from one of video input, audio input, and image input is configured for generating visual data of target object elements in said augmented virtual game environment, wherein the target objects include:
a drone device avatar, a mobile robot avatar, and a hybrid robot-drone avatar;
a storage reader for reading a game application from a storage medium; a memory storing a communications client application; a network interface for receiving data from a remote user via a packet-based communication network; and a processing apparatus coupled to the memory and the network interface;
wherein said communication client is programmed to establish bidirectional video communications via said network interface and packet-based communication network, including receiving video data from one or more users and remote users via a plurality of user interface computing systems;
an image recognition software programmed to receive video data from said client application, and to recognize a predetermined image element in the received video data, and track the motion of target object elements to generate motion tracking data.
19. The robot and drone game of claim 17, further comprising: a game application device comprising said game application server with game logic programmed to control aspects of the game based on the motion tracking data.
20. The robot and drone game of claim 17, further comprising:
a receiver, the receiver receiving at the game application server a plurality of data throughput ratings from a plurality of user interface computing systems;
a user interface computing system selected from said plurality of user interface computing systems based, at least in part, on throughput analysis data associated with a storage device of said corresponding user interface computing system;
a receiver selecting and initiating a game match via said game application server and said interface computing system, configured to display a virtual reality game system playing a game match configured with an array of drone device avatars, mobile robot avatars, and hybrid robot-drone avatars and other target objects to compete in said match via multiple players on a packet-based communication network.
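The selection step of claim 20, where the game application server receives throughput ratings from many user interface computing systems and picks one for the match, can be sketched as a highest-rating selection. The rating units and tie-breaking rule below are assumptions not stated in the claims:

```python
def select_match_host(ratings, minimum=0.0):
    """Pick the user interface computing system with the highest
    data-throughput rating, ignoring systems below `minimum`.

    `ratings` maps a system identifier to its reported throughput
    rating (e.g. Mbit/s). Returns None if no system qualifies.
    """
    eligible = {name: r for name, r in ratings.items() if r >= minimum}
    if not eligible:
        return None
    # max() with key=eligible.get returns the name of the best system
    return max(eligible, key=eligible.get)

ratings = {"console-A": 12.5, "console-B": 48.0, "console-C": 30.1}
host = select_match_host(ratings)
```

Once a host is selected, the server would initiate the match and the chosen system would display the virtual reality playing field to the competing players.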
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/146,484 US20210132626A1 (en) | 2018-05-31 | 2021-01-11 | Robot and drone game system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/993,609 US10890921B2 (en) | 2018-05-31 | 2018-05-31 | Robot and drone array |
US17/146,484 US20210132626A1 (en) | 2018-05-31 | 2021-01-11 | Robot and drone game system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/993,609 Continuation US10890921B2 (en) | 2018-05-31 | 2018-05-31 | Robot and drone array |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210132626A1 true US20210132626A1 (en) | 2021-05-06 |
Family
ID=68692627
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/993,609 Active 2039-02-12 US10890921B2 (en) | 2018-05-31 | 2018-05-31 | Robot and drone array |
US17/146,484 Abandoned US20210132626A1 (en) | 2018-05-31 | 2021-01-11 | Robot and drone game system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/993,609 Active 2039-02-12 US10890921B2 (en) | 2018-05-31 | 2018-05-31 | Robot and drone array |
Country Status (3)
Country | Link |
---|---|
US (2) | US10890921B2 (en) |
JP (1) | JP2021533425A (en) |
WO (1) | WO2019231477A1 (en) |
Families Citing this family (85)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10078330B2 (en) * | 2016-03-25 | 2018-09-18 | International Business Machines Corporation | Coordinating robotic apparatus deliveries |
US10987804B2 (en) * | 2016-10-19 | 2021-04-27 | Fuji Xerox Co., Ltd. | Robot device and non-transitory computer readable medium |
CA3070300A1 (en) | 2017-07-28 | 2019-01-31 | Nuro, Inc. | Food and beverage delivery system on autonomous and semi-autonomous vehicle |
CN109427502A (en) * | 2017-08-25 | 2019-03-05 | 深圳市道通智能航空技术有限公司 | Remote controler |
WO2019065303A1 (en) * | 2017-09-29 | 2019-04-04 | 本田技研工業株式会社 | Service provision system, service provision method, and management device for service provision system |
JP6944354B2 (en) * | 2017-11-22 | 2021-10-06 | 川崎重工業株式会社 | Robot system and how to make things using it |
KR102465066B1 (en) * | 2017-12-18 | 2022-11-09 | 삼성전자주식회사 | Unmanned aerial vehicle and operating method thereof, and automated guided vehicle for controlling movement of the unmanned aerial vehicle |
CN110138519A (en) | 2018-02-02 | 2019-08-16 | 索尼公司 | Device and method, computer readable storage medium in wireless communication system |
US11173605B2 (en) * | 2018-02-26 | 2021-11-16 | dogugonggan Co., Ltd. | Method of controlling mobile robot, apparatus for supporting the method, and delivery system using mobile robot |
US10860953B2 (en) | 2018-04-27 | 2020-12-08 | DISH Technologies L.L.C. | IoT drone fleet |
JP7155660B2 (en) * | 2018-06-26 | 2022-10-19 | セイコーエプソン株式会社 | Robot controller and robot system |
JP7396691B2 (en) * | 2018-07-13 | 2023-12-12 | ラブラドール システムズ インコーポレイテッド | Visual navigation for mobile devices that can operate in various environmental lighting conditions |
MX2021000277A (en) * | 2018-07-23 | 2021-09-08 | Airgility Inc | System of play platform for multi-mission application spanning any one or combination of domains or environments. |
US11292678B2 (en) * | 2018-07-31 | 2022-04-05 | Uatc, Llc | Palette system for cargo transport |
US11340625B2 (en) | 2018-08-08 | 2022-05-24 | Uatc, Llc | Autonomous vehicle compatible robot |
CN112423945B (en) * | 2018-08-10 | 2023-11-24 | 川崎重工业株式会社 | Information processing device, robot operating system, and robot operating method |
US11090806B2 (en) * | 2018-08-17 | 2021-08-17 | Disney Enterprises, Inc. | Synchronized robot orientation |
US11689707B2 (en) * | 2018-09-20 | 2023-06-27 | Shoppertrak Rct Llc | Techniques for calibrating a stereoscopic camera in a device |
US11136120B2 (en) * | 2018-10-05 | 2021-10-05 | Aurora Flight Sciences Corporation | Ground operations for autonomous object pickup |
US11161245B2 (en) * | 2018-10-25 | 2021-11-02 | Wells Fargo Bank, N.A. | Systems and methods for secure locker feeders |
US20220024395A1 (en) * | 2018-11-13 | 2022-01-27 | ADA Innovation Lab Limited | Autonomous driving system for a racing car or other vehicle |
US11471741B2 (en) * | 2018-12-21 | 2022-10-18 | SurfaSense LLC | Adaptive tennis ball machine |
US11500393B2 (en) * | 2019-01-03 | 2022-11-15 | Lg Electronics Inc. | Control method of robot system |
US10834523B1 (en) * | 2019-01-14 | 2020-11-10 | Accelerate Labs, Llc | Identification of delivery zones for autonomous vehicles, rovers, and drones |
US11521432B2 (en) * | 2019-01-17 | 2022-12-06 | Florian Pestoni | System and method for adaptive diagnostics and data collection in a connected robot-cloud environment |
CN112809709B (en) * | 2019-01-25 | 2022-12-02 | 北京妙趣伙伴科技有限公司 | Robot, robot operating system, robot control device, robot control method, and storage medium |
FR3092416B1 (en) * | 2019-01-31 | 2022-02-25 | Univ Grenoble Alpes | SYSTEM AND METHOD FOR INTERACTING WITH ROBOTS IN MIXED REALITY APPLICATIONS |
US11203120B1 (en) * | 2019-02-06 | 2021-12-21 | Intrinsic Innovation Llc | Mobile robotics frame system |
JP7196323B2 (en) * | 2019-02-08 | 2022-12-26 | 株式会社センストーン | Virtual code-based unmanned mobile control system, method and program thereof, control device thereof, and control signal generation means thereof |
EP3730374A1 (en) * | 2019-04-26 | 2020-10-28 | Tusimple, Inc. | Auditory assistant module for autonomous vehicles |
WO2020227694A1 (en) * | 2019-05-08 | 2020-11-12 | Agility Robotics, Inc. | Systems and methods for mixed-use delivery of people and packages using autonomous vehicles and machines |
EP3985951A4 (en) * | 2019-06-14 | 2023-01-11 | Robo Garage Co., Ltd. | Thin portable communication terminal, and control method and control program thereof |
WO2021010502A1 (en) * | 2019-07-12 | 2021-01-21 | 엘지전자 주식회사 | Robot and method for managing article by using same |
US11548140B2 (en) * | 2019-08-15 | 2023-01-10 | Covidien Lp | System and method for radio based location of modular arm carts in a surgical robotic system |
US11829162B2 (en) | 2019-08-15 | 2023-11-28 | Teledyne Flir Detection, Inc. | Unmanned aerial vehicle locking landing pad |
US11740670B2 (en) * | 2019-09-17 | 2023-08-29 | Dish Network L.L.C. | Systems and methods for automated battery replacement |
US11958183B2 (en) * | 2019-09-19 | 2024-04-16 | The Research Foundation For The State University Of New York | Negotiation-based human-robot collaboration via augmented reality |
JP7226240B2 (en) * | 2019-10-16 | 2023-02-21 | トヨタ自動車株式会社 | Article transport robot |
US11403891B2 (en) * | 2019-11-01 | 2022-08-02 | Gm Cruise Holdings Llc | Autonomous setup and takedown of calibration environment for vehicle sensor calibration |
US11809200B1 (en) * | 2019-12-06 | 2023-11-07 | Florida A&M University | Machine learning based reconfigurable mobile agents using swarm system manufacturing |
US20210283783A1 (en) * | 2019-12-18 | 2021-09-16 | Carla R. Gillett | Modular robotic service vehicle |
CN111003162B (en) * | 2019-12-25 | 2023-04-07 | 南京维景数据工程有限公司 | Surveying and mapping unmanned aerial vehicle and surveying and mapping method thereof |
US20210284357A1 (en) * | 2020-03-06 | 2021-09-16 | Joby Elevate, Inc. | System and Method for Robotic Charging Aircraft |
CN111444790B (en) * | 2020-03-13 | 2022-07-01 | 北京理工大学 | Pulse-level intelligent identification method for multifunctional radar working mode sequence |
CN115088001A (en) | 2020-03-23 | 2022-09-20 | 纽诺有限公司 | Method and apparatus for autonomous delivery |
CN111447599B (en) * | 2020-03-26 | 2023-09-12 | 上海有个机器人有限公司 | Robot communication method and system based on multilink multiplexing |
JPWO2021210063A1 (en) * | 2020-04-14 | 2021-10-21 | ||
CN111488105B (en) * | 2020-04-17 | 2021-07-30 | 北京如影智能科技有限公司 | Method and device for generating motion flow of mechanical arm |
CN111572760B (en) * | 2020-05-07 | 2022-11-11 | 重庆交通大学 | System and method for damping control of landing frame for unmanned aerial vehicle |
US20210349478A1 (en) * | 2020-05-11 | 2021-11-11 | Soter Technology Inc | Methods, systems, apparatuses, and devices for facilitating managing of paths for unmanned vehicles |
US20210387808A1 (en) | 2020-06-11 | 2021-12-16 | Nimble Robotics, Inc. | Automated Delivery Vehicle |
CN111722646B (en) * | 2020-06-24 | 2022-01-25 | 汕头大学 | Maritime search method and system based on cooperation of unmanned ship group and unmanned ship group |
DE102020125472A1 (en) | 2020-09-30 | 2022-03-31 | Bayerische Motoren Werke Aktiengesellschaft | Vehicle with gripping device for gripping an object |
US11934981B2 (en) * | 2020-10-06 | 2024-03-19 | Dish Network L.L.C. | Systems and methods for drones as a service |
CN112278106B (en) * | 2020-11-10 | 2022-05-10 | 北京理工大学 | Wheel-foot composite walking mechanism |
CN112340042A (en) * | 2020-11-16 | 2021-02-09 | 中山大学 | Multifunctional unmanned aerial vehicle |
US11797896B2 (en) | 2020-11-30 | 2023-10-24 | At&T Intellectual Property I, L.P. | Autonomous aerial vehicle assisted viewing location selection for event venue |
US11443518B2 (en) | 2020-11-30 | 2022-09-13 | At&T Intellectual Property I, L.P. | Uncrewed aerial vehicle shared environment privacy and security |
US11726475B2 (en) | 2020-11-30 | 2023-08-15 | At&T Intellectual Property I, L.P. | Autonomous aerial vehicle airspace claiming and announcing |
DE102020132329A1 (en) * | 2020-12-04 | 2022-06-09 | Rheinmetall Electronics Gmbh | Mobile robotic unit |
US11915172B2 (en) * | 2021-01-19 | 2024-02-27 | Ford Global Technologies, Llc | Robot-assisted package delivery with integrated curbside parking monitoring and curb operations optimization |
CN113183816B (en) * | 2021-02-01 | 2023-04-28 | 东风(十堰)汽车液压动力有限公司 | Battery pack quick-change and locking system of electric heavy truck |
US11225325B1 (en) * | 2021-03-12 | 2022-01-18 | DroneUp, LLC | Payload delivery and drop device for unmanned aerial vehicle for automatic release upon surface contact |
JP2022148261A (en) * | 2021-03-24 | 2022-10-06 | トヨタ自動車株式会社 | Article recovery system, article recovery robot, article recovery method, and article recovery program |
KR20230174231A (en) * | 2021-04-13 | 2023-12-27 | 샛챌 림츠마 | Drones with articulated limbs |
US11642978B2 (en) * | 2021-04-16 | 2023-05-09 | Lixiong Wu | Apparatus and method for electric vehicle battery resource sharing |
CN113110590B (en) * | 2021-04-30 | 2023-10-10 | 北京天航创联科技发展有限责任公司 | Multi-machine distributed collaborative simulation control platform and control method |
CN113086173B (en) * | 2021-05-12 | 2022-10-18 | 复旦大学 | Multi-functional unmanned aerial vehicle undercarriage and unmanned aerial vehicle |
FR3123095B1 (en) * | 2021-05-18 | 2023-06-09 | Habot Jean Christophe | Offset compact multi-support mounting system |
CN215097225U (en) * | 2021-06-07 | 2021-12-10 | 上海峰飞航空科技有限公司 | Unmanned aerial vehicle take-off and landing platform |
CN215476898U (en) * | 2021-06-07 | 2022-01-11 | 上海峰飞航空科技有限公司 | Unmanned aerial vehicle transport case |
US20230015540A1 (en) * | 2021-07-14 | 2023-01-19 | Cindy Jingru Wang | Unmanned Flying Vaccine Administration System |
CN113459738B (en) * | 2021-07-22 | 2022-07-26 | 燕山大学 | Amphibious quadruped robot based on deformable floating legs and driving method thereof |
CN113778113B (en) * | 2021-08-20 | 2024-03-26 | 北京科技大学 | Pilot auxiliary driving method and pilot auxiliary driving system based on multi-mode physiological signals |
CN113766267B (en) * | 2021-09-15 | 2023-10-24 | 深圳市道通智能航空技术股份有限公司 | Multi-path video live broadcast method, system, equipment and storage medium based on unmanned aerial vehicle |
WO2023110818A1 (en) * | 2021-12-14 | 2023-06-22 | Eeve Bv | A multi-functional robot with integrated container and extendable tool system |
CN114313232A (en) * | 2022-01-25 | 2022-04-12 | 复旦大学 | Multifunctional unmanned aerial vehicle parallel chassis |
CN114394238A (en) * | 2022-02-08 | 2022-04-26 | 山西工程职业学院 | Intelligent oiling robot and method for unmanned aerial vehicle throwing |
CN114326822B (en) * | 2022-03-09 | 2022-05-10 | 中国人民解放军66136部队 | Unmanned aerial vehicle cluster information sharing method based on evolutionary game |
CN114415731B (en) * | 2022-03-25 | 2022-06-17 | 季华实验室 | Multi-flying robot cooperative operation method and device, electronic equipment and storage medium |
CN115297308B (en) * | 2022-07-29 | 2023-05-26 | 东风汽车集团股份有限公司 | Surrounding AR-HUD projection system and method based on unmanned aerial vehicle |
CN115655102A (en) * | 2022-10-10 | 2023-01-31 | 广州里工实业有限公司 | Autonomous robot with size measuring system and workpiece measuring method |
CN116382328B (en) * | 2023-03-09 | 2024-04-12 | 南通大学 | Dam intelligent detection method based on cooperation of multiple robots in water and air |
CN115946145B (en) * | 2023-03-10 | 2023-07-14 | 成都时代星光科技有限公司 | Special handle, grabbing mechanism and grabbing system for intelligent battery of unmanned aerial vehicle |
CN116088540B (en) * | 2023-04-07 | 2023-06-27 | 电子科技大学中山学院 | Unmanned aerial vehicle and unmanned aerial vehicle cooperated cable channel inspection method and medium |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180104573A1 (en) * | 2016-10-17 | 2018-04-19 | Aquimo, Llc | Method and system for using sensors of a control device for control of a game |
US20190102684A1 (en) * | 2017-09-29 | 2019-04-04 | Sony Interactive Entertainment Inc. | Mobile and autonomous personal companion based on an artificial intelligence (ai) model for a user |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4465440B2 (en) * | 2000-04-20 | 2010-05-19 | 独立行政法人 宇宙航空研究開発機構 | Near-flying space robot and spacecraft operation system using the space robot |
US10331136B2 (en) * | 2006-02-27 | 2019-06-25 | Perrone Robotics, Inc. | General purpose robotics operating system with unmanned and autonomous vehicle extensions |
US8205820B2 (en) * | 2009-02-03 | 2012-06-26 | Honeywell International Inc. | Transforming unmanned aerial-to-ground vehicle |
US8342440B2 (en) * | 2009-12-10 | 2013-01-01 | Regents Of The University Of Minnesota | Miniature robotic vehicle with ground and flight capability |
US9586471B2 (en) | 2013-04-26 | 2017-03-07 | Carla R. Gillett | Robotic omniwheel |
CA2929674A1 (en) * | 2013-11-04 | 2015-05-07 | Sprig, Inc. | Methods and systems for distributing items |
US9335764B2 (en) * | 2014-05-27 | 2016-05-10 | Recreational Drone Event Systems, Llc | Virtual and augmented reality cockpit and operational control systems |
US10163177B2 (en) * | 2014-07-31 | 2018-12-25 | Emmett Farris | System and method for controlling drone delivery or pick up during a delivery or pick up phase of drone operation |
KR101592904B1 (en) * | 2014-08-21 | 2016-02-18 | 두산중공업 주식회사 | a maintenance unit for wind turbine and a maintenance method using it |
US9371133B2 (en) * | 2014-11-07 | 2016-06-21 | Paccar Inc | Drone systems for pre-trip inspection and assisted backing |
US9656805B1 (en) * | 2014-12-12 | 2017-05-23 | Amazon Technologies, Inc. | Mobile base utilizing transportation units for receiving items |
US9915956B2 (en) * | 2015-01-09 | 2018-03-13 | Workhorse Group Inc. | Package delivery by means of an automated multi-copter UAS/UAV dispatched from a conventional delivery vehicle |
US20160260049A1 (en) * | 2015-03-06 | 2016-09-08 | Wal-Mart Stores, Inc. | Shopping facility assistance systems, devices, and methods to dispatch and recover motorized transport units that effect remote deliveries |
EP4001111A3 (en) * | 2015-11-10 | 2022-08-17 | Matternet, Inc. | Methods and system for transportation using unmanned aerial vehicles |
US10535195B2 (en) * | 2016-01-06 | 2020-01-14 | SonicSensory, Inc. | Virtual reality system with drone integration |
US10228694B2 (en) * | 2016-03-04 | 2019-03-12 | Animusoft Corporation | Drone and robot control systems and methods |
2018
- 2018-05-31 US US15/993,609 patent/US10890921B2/en active Active
- 2018-06-07 JP JP2020514665A patent/JP2021533425A/en active Pending
- 2018-06-07 WO PCT/US2018/036535 patent/WO2019231477A1/en active Application Filing
2021
- 2021-01-11 US US17/146,484 patent/US20210132626A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
JP2021533425A (en) | 2021-12-02 |
US20190369641A1 (en) | 2019-12-05 |
US10890921B2 (en) | 2021-01-12 |
WO2019231477A1 (en) | 2019-12-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210132626A1 (en) | Robot and drone game system | |
US11845187B2 (en) | Transferable intelligent control device | |
Hansen et al. | The use of gaze to control drones | |
US20120238366A1 (en) | Robot Game for Multiple Players that is Remotely Controlled over a Network | |
US20190072949A1 (en) | Network based operation of an unmanned aerial vehicle based on user commands and virtual flight assistance constraints | |
US20130204465A1 (en) | Autonomous Behaviors For A Remote Vehicle | |
EP2205333A1 (en) | System to control semi-autonomous robots in interactive robot gaming | |
US20160243455A1 (en) | Multiplayer Game Platform For Toys Fleet Controlled By Mobile Electronic Device | |
Aggravi et al. | Decentralized control of a heterogeneous human–robot team for exploration and patrolling | |
Prassler et al. | Domestic robotics | |
GB2598345A (en) | Remote operation of robotic systems | |
US20200036609A1 (en) | Secondary robot commands in robot swarms | |
Lee et al. | Collaborative control of uav/ugv | |
Zhao et al. | The effects of visual and control latency on piloting a quadcopter using a head-mounted display | |
US20210217245A1 (en) | System and Method of Competitively Gaming in a Mixed Reality with Multiple Players | |
Kennel-Maushart et al. | Interacting with multi-robot systems via mixed reality | |
Gromov et al. | Guiding quadrotor landing with pointing gestures | |
KR20150114058A (en) | On/off line 3D printing robot battle game system invoked augmented reality and Robot battle game method using the same | |
US10456682B2 (en) | Augmentation of a gaming controller via projection system of an autonomous personal companion | |
Lamberti et al. | Robotquest: A robotic game based on projected mixed reality and proximity interaction | |
KR101989184B1 (en) | System for providing augmented reality service | |
Testart et al. | A real-time hybrid architecture for biped humanoids with active vision mechanisms | |
Corrente et al. | Cooperative robotics: Passes in robotic soccer | |
Kasai et al. | Development of a Teleoperated Play Tag Robot with Semi-Automatic Play | |
Martínez-Carranza et al. | On combining wearable sensors and visual SLAM for remote controlling of low-cost micro aerial vehicles |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |