WO2019025872A2 - Autonomous city transportation means with artificial telepathy - Google Patents


Info

Publication number
WO2019025872A2
Authority
WO
WIPO (PCT)
Prior art keywords: autonomous, iat, telepathy, vehicles, data
Application number
PCT/IB2018/001355
Other languages
French (fr)
Other versions
WO2019025872A3 (en)
Inventor
Wasfi Alshdaifat
Original Assignee
Wasfi Alshdaifat
Application filed by Wasfi Alshdaifat filed Critical Wasfi Alshdaifat
Priority to PCT/IB2018/001355 priority Critical patent/WO2019025872A2/en
Publication of WO2019025872A2 publication Critical patent/WO2019025872A2/en
Publication of WO2019025872A3 publication Critical patent/WO2019025872A3/en


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60FVEHICLES FOR USE BOTH ON RAIL AND ON ROAD; AMPHIBIOUS OR LIKE VEHICLES; CONVERTIBLE VEHICLES
    • B60F5/00Other convertible vehicles, i.e. vehicles capable of travelling in or on different media
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M15/00Arrangements for metering, time-control or time indication ; Metering, charging or billing arrangements for voice wireline or wireless communications, e.g. VoIP
    • H04M15/68Payment of value-added services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/24Accounting or billing

Definitions

  • This invention relates to an autonomous city with its transportation means, such as unmanned aerial and land vehicles, capable of telecommunicating via artificial telepathy with each other or with other machines, robots, structures...
  • this target level may be achieved in-between 2025 and 2030, while the unmanned aerial vehicles are still not put to wide use other than taking photos or recording videos; meanwhile, the unmanned aerial vehicles are still not carrying passengers, not carrying small containers of 100 kg to 1 ton, and not carrying out civil services for a city.
  • An autonomous city transportation means based on UAVs, autonomous vehicles and interchangeable autonomous aero-land vehicles (AALV), interactive and collaborative in-between themselves and with other machines, robots, buildings, structures, and humans via intelligent artificial telepathy (IAT) devices.
  • AALV autonomous aero-land vehicle
  • the autonomous aero-land vehicles are of compact and easily interchangeable shapes, by building and installing over smooth-shaped vehicles' roofs a smooth aerodynamic body containing two side ducted propellers or small jet propulsion engines, while at the middle an engine or electric motor is installed to drive a stowable top propeller located over the smooth body; meanwhile, side propellers are built-in vertically or horizontally inside the vehicle, pushed out vertically or horizontally when put in use, and rotated conventionally to take a horizontal or vertical configuration while they are fully extended out of the vehicle body.
  • the (AALV) is a modified new shape of Volkswagen Sedric, wherein the horizontal side ducted propellers are built-in inside the bottom part of the vehicle and conventionally pushed out sidewards.
  • IAT devices are based on vastly modified RFID (Radio-Frequency Identification) chips, wherein, in addition to the uploaded data parameters related to GPS spatial location coordinates, altitude, latitude, longitude, direction, and a roughly estimated occupied space envelope, and in addition to the motion parameters (speed, acceleration, deceleration) obtained using many devices, an additional parameter is added related to the gridded actual shape, geometry, topography, texture, tilt angles and dimensions of each constituent of this shape, to assist in perfectly locating the space occupied by each part of it, in order to engage, synchronize, fit, or cooperate with it in a perfect manner with the least number of devices, which are combined into one, in addition to providing it with other telepathic applications and technical features to simulate actual telepathy.
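The added geometry parameter can be pictured as one record per body, combining the conventional position and motion fields with a gridded occupied-space set. The following Python sketch is purely illustrative; all field names are assumptions, not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class IATRecord:
    """Illustrative record an IAT device might broadcast: conventional
    position/motion parameters plus the added gridded-geometry parameter."""
    gps: tuple              # (latitude, longitude)
    altitude_m: float
    heading_deg: float
    speed_mps: float
    acceleration_mps2: float
    # The novel parameter: occupied space as a set of small grid cells
    # (voxel indices) describing the body's actual shape, not just a
    # rough bounding envelope.
    occupied_cells: set = field(default_factory=set)

    def envelope(self):
        """Rough axis-aligned bounding box derived from the gridded shape."""
        xs = [c[0] for c in self.occupied_cells]
        ys = [c[1] for c in self.occupied_cells]
        zs = [c[2] for c in self.occupied_cells]
        return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

body = IATRecord(gps=(25.20, 55.27), altitude_m=30.0, heading_deg=90.0,
                 speed_mps=12.0, acceleration_mps2=0.0,
                 occupied_cells={(0, 0, 0), (1, 0, 0), (1, 1, 0)})
print(body.envelope())  # ((0, 0, 0), (1, 1, 0))
```

The coarse envelope of the prior art falls out of the detailed grid, while the grid itself carries the per-part shape data the text describes.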
  • RFID Radio-Frequency Identification
  • FIG. 1 (A- F) Illustrates multiple 3-D views for a first embodiment of the interchangeable Autonomous Aero-Land Vehicle (AALV).
  • FIG. 2 (A-E) Illustrates multiple 3-D views for a second embodiment of the interchangeable Autonomous Aero-Land Vehicle (AALV) with multiple configurations for ducted propellers or armed foldable blades.
  • FIG. 3 (A-F) Illustrates multiple 3-D views for an (AAVL) modified smooth body carrying a baseball shape body, capsule, or container for civil service use.
  • FIG. 4 Illustrates multiple 3-D views for (AALV) modified smooth bodies in two parallel aero-convoy configuration.
  • FIG. 5 Illustrates multiple 3-D views for the second embodiment of the interchangeable Autonomous Aero-Land Vehicle (AALV) in aero-convoy configuration.
  • FIG. 6 Illustrates multiple 3-D views for (AALV) modified smooth body picking up an aero-land vehicle with its propellers in.
  • FIG. 7 Illustrates dynamic bodies' movement and engagement in relation to static natural and urban bodies by observing the track via pre-updated data about geometrical topography positioning and occupied space borders.
  • Fig.8 Illustrates (PMGW) microprocessor and the input data parameters.
  • FIG. 9 Illustrates flow chart for (PMGW) and artificial intelligence working principle
  • FIG. 10 Illustrates 3D views for the aerial facility room.
  • Fig. 11 Illustrates a full city view for positioning and artificial telepathy collaboration lines in-between transportation means, structures, machines, natural obstacles... with a neighborhood server and (CATS).
  • Fig. 12 Illustrates a list of a three-stage method for making an autonomous city transportation means with artificial telepathy.
  • Figure 13 illustrates a rough distribution map of the firefighting tools and vehicles spread over part of a forest.
  • Figure 14 Illustrates an aerial delivery of a capsule to a vehicle under the control of (IAT) devices collaboration.
  • FIG. 15 Illustrates a manipulator picking up an item and filling an autonomous trolley under the control of (IAT) devices collaboration.
  • Figure 16 Illustrates a motorized robotic manipulator pushes a package out of an autonomous trolley into an autonomous vehicle under the control of (IAT) devices collaboration.
  • FIG. 17 Illustrates an autonomous trolley getting inside an autonomous vehicle via the tilted smooth stair (tilted gate).
  • Figure 18 Illustrates a manipulator picking up a parcel from in-between shelves and handling it to a UAV.
  • Figure 19 Illustrates 3D views for window aerial delivery mechanisms.
  • Figure 21 Illustrates NVH data collectors, detectors and meters distributed in a vehicle.
  • Figure 22 Illustrates autonomous vehicle customized inner equipment.
  • Figure 23 Illustrates a flow chart briefing all devices, control units, microprocessors, data manipulators, servers, and autonomous machines collaborations.
  • Figure 24 Illustrates a multiple-task foldable flexible screen UAV.

Detailed description for carrying out the Invention:
  • aerial carrying and delivery machines, wherein these machines are provided with intelligent artificial telepathy (IAT) devices 21 to cooperate with unmanned aerial vehicles (UAVs) 22 and similar machines, both in the air and on or in land structures 23, buildings 24, machines 25, robots 26, and humans 26, so that identity recognition, route tracking, load handling, and engaging to carry out a task are easily optimized and approached clearly, without using plenty of navigating devices, without probabilities of hitting obstacles (hills, trees, posts, protruding parts...) 28, and without failing to achieve the assigned tasks.
  • the autonomous aero-land vehicles (AAVL) 20 are of compact and easily interchangeable shapes, aero-vehicle into land vehicle and vice versa; in a first embodiment, a normal smooth vehicle 29 available in the market, e.g. a Toyota Previa 29, is to be modified indoors before sale (Fig.
  • VTOL vertical take-off or landing vehicle
  • a smooth aerodynamic body 31 containing two side ducted propellers 32 or small jet propulsion engines 33, with their ducting 34 extending from the front to the rear exhaust nozzles 35, while at the middle an engine or electric motor 36 is installed to drive a stowable propeller 37 located over the smooth body 31; meanwhile, side propellers 38 are built-in vertically or horizontally inside the vehicle 29, pushed out vertically or horizontally when put in use, and rotated conventionally to take a horizontal or vertical configuration while they are fully extended out of the vehicle body 29.
  • the tail 40 is made of two parts; each part contains in its top side a small propeller 41 to balance the reverse rotation of the main top propeller 37.
  • the (AALV) 20 is a modified shape of a Volkswagen Sedric 42 or any similar electric mini-bus, wherein three configurations are provided: a- the horizontal side ducted propellers are built-in inside the bottom part of the vehicle and conventionally pushed out sidewards when put in use (Fig.
  • b- side rotor blades are built underneath the vehicle and pushed out via telescopic arms (Fig. 2-D), or c- expanded bigger right and left side armed rotor blades are installed in a foldable mode into grooves (not shown) in the roof sides when not in use, while the front and rear rotor blades are stored in the bottom; in this combined configuration, the bottom rotor blades' telescopic mechanism pushes them out, and the right/left side rotor blade arms (or wing-shaped ones) are unfolded downward when put in use (Fig. 2-E).
  • the rotor blades too can be rotated from horizontal to vertical configuration. It is obvious here that many configurations can be drawn from such embodiments and configurations.
  • these (AAVL) 20 smooth aerodynamic body 31 flying mechanisms can be used and modified by adding two foldable wings and two vertical side ducted propellers (Fig. 3 A-B), each ending with one or two ducted branches, for carrying baseball-shaped containers, capsules, or normal containers 44 (Fig. 3 C-F), or tanks, or to take small shapes like drones to carry parcels, wherein the containers 44, tanks, capsules... etc. can be used in sequential firefighting, fueling other machines, irrigation... (Fig. 3).
  • unmanned aero-carriers can be collected in parallel series of convoys carrying out effective firefighting, wherein their vessels are either filled by dipping them inside a nearby lake, or carrying the filled ones from a nearby station, or provided over trucks, then carrying out firefighting or cooling the ash which is carried by the wind, returning back, dropping the empty ones for refilling, and carrying filled ones... and so on in a continuous cycle (Fig. 4).
  • AAVL 20 units can be attached or connected conventionally to be set to fly in aerial convoys 45, as can UAVs (Fig.
  • such an applicable flight configuration provides safe, compact, low-air-traffic, and well-organized semi-train-carriage flight, wherein if one engine or propeller goes idle, all of the other AAVL 20 units will support a safe flight for the defective unit until the next destination or a safe emergency landing, or until a modified smooth flying body 46 makes an aerial (recovery) pick-up of the idle unit from a convoy (FIG. 6) with its propellers in, while other parts are not shown for clarity. It is obvious that the AAVL 20 second embodiment can be equipped with a parachute over its roof for emergencies.
  • Aero-convoy configurations are not limited to passenger aero-land vehicles; aero-land vehicle convoys loaded with parcels, luggage or fluids can be created too, while aero-carriers (with top aerodynamic bodies) 31 loaded with parcels, luggage or fluids (firefighting powder, water, fuel) can be arranged in aero-convoys, wherein an aero-carrier firefighting convoy can carry firefighting containers, capsules, or baseball-like vessels to carry out sequential firefighting of fire frontiers, return to unload the empty used ones, and pick up newly filled vessels to go back and extinguish the fire, forming an aero-convoy firefighting series and cycle.
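The sequential firefighting cycle described above (pick up a filled vessel, fly out, discharge, return, drop the empty vessel, repeat) can be sketched as a simple state machine. The states and transitions below are illustrative assumptions, not terms from the patent:

```python
# Hypothetical sketch of the sequential firefighting cycle a convoy
# unit follows; state names and transitions are illustrative only.
CYCLE = {
    "pick_up_filled_vessel": "fly_to_fire_frontier",
    "fly_to_fire_frontier": "discharge_on_fire",
    "discharge_on_fire": "return_to_station",
    "return_to_station": "drop_empty_vessel",
    "drop_empty_vessel": "pick_up_filled_vessel",
}

def run_cycle(start, steps):
    """Walk `steps` transitions of the refill/extinguish loop."""
    state, visited = start, [start]
    for _ in range(steps):
        state = CYCLE[state]
        visited.append(state)
    return visited

trace = run_cycle("pick_up_filled_vessel", 5)
print(trace[-1])  # back at "pick_up_filled_vessel" after one full loop
```

Because the last transition closes the loop, the convoy repeats the cycle continuously, as the text describes.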
Telepathy device principle of operation

  • As illustrated in (Fig. 7), different bodies are distributed throughout a specific cubic part of a city and the nearby mountains' space; each part refers to an identified physical body, and the geometries used simulate these physical bodies. They are numbered in sequence for reference: body 1 is assumed to be located and moving at road level, body 2 is a mountain, body 3 is a building compound with body 4 located over it, wherein body 4 is a parcel, while body 5 is flying in the space in-between all of them.
  • the conventional location data parameters can be obtained from the prior art and available technologies using the following devices: a global positioning system (GPS) unit 52, an RFID (radio-frequency identification) unit 53, a remote sensing unit 54, and altitude meters 55 to know their heights; their conventional motion data parameters can also be measured via speedometers 56 and accelerometers 57. Their tilt angles are measured via tilt meters 58, and their orientations and angular velocities are measured via gyroscopes 59. So, the thus-defined instant motion, position (location), tilt and orientation parameters can be measured via available devices; in this invention, these devices are to be compacted into one device.
  • The full geometrical shape of body 4 with its full dimensions to scale, including all of its protrusions, textures, and topography; the same applies to body 5 itself, which means body 5 too should know all details related to its geometry, shape and topography.
  • each body should have an identified shape with full details about its dimensions to scale, plus all of its protrusions, texture, and topography, while the current devices in the art do not provide such data in advance; so these data too should be identified and titled as geometrical parameters.
  • the innovative solution behind the telepathy device 21 working principle is based on providing the compact (PMGW) device 60 with preset downloaded data related to its geometry parameters, wherein its whole geometry, including all of its parts, is gridded into tiny spatial cubes, wherein each big, small, very small or tiny 3D part of the geometry also has identified geometrical data to help in recognizing where each small part is located inside this 3D geometry; in another way, each physical body or prohibited flying zone will by itself provide its occupied space data, saying: Hello, I am here! Occupying these particular space boundaries!
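The gridding into tiny spatial cubes and the "Hello, I am here!" broadcast can be illustrated with a toy voxel occupancy set; `grid_occupancy` and `is_free` are hypothetical names, and the box grid is a crude stand-in for the detailed geometry the patent describes:

```python
def grid_occupancy(origin, size, cell=1.0):
    """Grid a body's bounding extent into unit cells (voxels).
    `origin` and `size` are (x, y, z) in metres; a toy stand-in
    for gridding the body's full detailed geometry."""
    ox, oy, oz = origin
    sx, sy, sz = size
    return {(ox + i, oy + j, oz + k)
            for i in range(int(sx / cell))
            for j in range(int(sy / cell))
            for k in range(int(sz / cell))}

# A building announces: "Hello, I am here! Occupying these boundaries!"
building_cells = grid_occupancy(origin=(10, 10, 0), size=(3, 2, 2))

def is_free(point, occupied):
    """A flying body checks a waypoint against the shared occupancy
    data instead of scanning with radar or lidar."""
    return tuple(int(v) for v in point) not in occupied

print(is_free((10.5, 10.5, 1.0), building_cells))  # False: inside the building
print(is_free((20.0, 10.0, 1.0), building_cells))  # True: clear space
```

Any receiver holding the shared cell set can test a candidate position with a plain set lookup, which is the sense in which no observation device is needed.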
  • the (PMGW) device 60 will not be randomly installed inside a physical body, but it will be installed such that its focal point is synchronized with the focal point of the mother body, in addition to mating its position direction, tilting... with that of the mother body, such that after installing the (PMGW) device 60 into a specific preselected and calculated safe position inside the physical body, the digital geometrical shape including its inner constituents, dimensions, texture, topography and even coloring, is providing (PMGW) device 60 data that is 100.0% mating and synchronized with the real body (PMGW) data, that is to mean, the (PMGW) device 60 is providing to other similar devices, its mother body digital (PMGW) data simulating perfectly its mother body real (PMGW) data.
  • the (PMGW) device 60 can provide position data, altitude, and tilt angle data for not only the general shape of the building; it even provides a specific window's altitude, direction, tilt angles, shape, dimensions, frame dimensions, and coloring, and the same applies to its lifts, lights, protrusions... etc., which are already available via the 3D architectural designs and their amendments for the same building.
  • the (PMGW) device 60 can provide position data, motion data, altitude and tilt angle data, in addition to its shape, dimensions, coloring, and topography... etc., and similar details about each part of it. So, when a (PMGW) device 60 is installed inside a UAV 22 and its data is shared and sent while flying via a data emitter to a remote data receiver, the data receiver, without using radars to scan the space, will know not only the global position, altitude, speed, acceleration and direction of the UAV 22, but also its actual shape with the actual dimensions, topography, texture and location of each part of it, big, small, or tiny.
  • the flying body 5, which is assigned to pick up body 4, will inform the server 61 about its task; the server will create an imaginary spatial cube in the space which encloses all cooperating bodies and will share their (PMGW) data in-between them, especially the moving bodies inside the imaginary cube, such that each body shares its (PMGW) device 60 data with body 4 and vice versa. Then, according to the distance in-between body 5 and body 4, and considering the obstacles, which are known not via lidar, radar, cameras, distance sensors, or TV monitoring, but according to the data provided by the (PMGW) devices 60 of the bodies located in-between or nearby the route from body 5 to body 4, the (PMGW) device 60 of the flying body 5 will receive, over a server 61, data from the mountain body 2 (PMGW) device 60 and the building body 3 (PMGW) device 60.
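The server-side step, enclosing the cooperating bodies in an imaginary cube and sharing (PMGW) data among the bodies inside it, might be sketched as follows. Body numbers follow Fig. 7; the coordinates and margin are invented for illustration:

```python
def bounding_cube(positions, margin=5.0):
    """Server-side: enclose the cooperating bodies in one imaginary
    axis-aligned cube (illustrative sketch)."""
    lo = tuple(min(p[i] for p in positions) - margin for i in range(3))
    hi = tuple(max(p[i] for p in positions) + margin for i in range(3))
    return lo, hi

def inside(pos, cube):
    lo, hi = cube
    return all(lo[i] <= pos[i] <= hi[i] for i in range(3))

# body ids -> invented positions; body 5 must reach body 4 past bodies 2 and 3
bodies = {1: (0, 0, 0), 2: (40, 10, 60), 3: (80, 20, 50),
          4: (80, 20, 62), 5: (5, 5, 50)}
cube = bounding_cube([bodies[4], bodies[5]])
# share (PMGW) data only among bodies whose position falls inside the cube
sharing_group = sorted(b for b, p in bodies.items() if inside(p, cube))
print(sharing_group)  # [2, 3, 4, 5] - body 1 at road level is outside
```

The cube acts as a filter: only the bodies it encloses need to exchange their occupancy data for this task.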
  • the flying body 5 should create a flight route toward body 4 without hitting the mountain body 2 (including its trees) or the building body 3, which occupy space inside the imaginary cube in-between it and body 4; it should also variably adjust its speed, acceleration, altitude, coordinates... until approaching body 4, wherein it will not use cameras to assist a remote human controller in making trials to engage the clamps (hooks) 62 of body 5 with the handles 63 protruding over body 4, nor use a laser emitter or distance sensors to adjust itself so that its clamps engage the handles 63 of body 4; rather, its clamps' (Sub-PMGW) data are manipulated and compared with the handles' 63 (Sub-PMGW) data, which are collected from both devices (Refer Fig.
  • the flying body 5 does not need to use specific devices to inspect whether the engagement is done successfully; all it should do is follow its route towards body 4 according to its body's (PMGW) device 60 data, considering not to go in a straight line, as it will not be allowed to cross the imaginary geometrical shape boundaries of the building body 3, which is recognized by it without observatory tools but according to the shared (PMGW) data of body 3.
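The clamp-to-handle engagement by comparing (Sub-PMGW) coordinates, rather than by cameras or distance sensors, reduces to computing an alignment offset between the two shared positions. A minimal sketch with invented coordinates:

```python
def engagement_offset(clamp_pos, handle_pos):
    """Compare the clamp (Sub-PMGW) coordinates of body 5 with the
    handle (Sub-PMGW) coordinates of body 4 and return the movement
    needed to align them. Coordinates are illustrative (x, y, z) metres."""
    return tuple(h - c for c, h in zip(clamp_pos, handle_pos))

clamp = (80.0, 20.1, 63.0)   # clamp under flying body 5
handle = (80.0, 20.0, 62.4)  # handle on top of parcel body 4
dx, dy, dz = engagement_offset(clamp, handle)

# after applying the offset, the residual misalignment is negligible
moved = tuple(c + d for c, d in zip(clamp, (dx, dy, dz)))
aligned = all(abs(v) < 1e-9 for v in engagement_offset(moved, handle))
print(aligned)
```

Both devices report their own sub-part coordinates, so the whole manoeuvre is arithmetic on shared data rather than sensing.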
  • the next route in-between body 5 and body 1 is manipulated in the same way. Once the flying body 5 approaches body 1, it will follow the preset order to get inside it via a back window 65; again, it will not use observatory tools to search for the window 65 of body 1, as body 1 will provide it with its window 65 (Sub-PMGW) data, and again, the flying body 5 will not move in a straight line towards the window 65, as it would hit body 1 according to body 1's (PMGW) data, which identifies the imaginary spatial body shape, dimensions and coordinates occupied by body 1, wherein no machine is allowed to cross it randomly, as it would hit body 1; so, the flying body 5 will artificially calculate and manipulate its shortest safe route toward the window 65. Meanwhile, body 1, which is updated about body 5's (PMGW) data, recognizes that body 5 is now near, so it will instruct the window 65 to open to give access to body 5, and body 5 will land inside with body 4 on the specified landing location, according to the landing location (Sub-PMGW) data which are provided
  • body 1, which is updated via body 5's (PMGW) data that body 5 is currently inside it, closes the window 65 and moves to pass in-between the building body 3 and the mountain body 2; it will move autonomously according to the (PMGW) data received from both the building and the mountain, such that it navigates its route in-between them without crossing their occupied imaginary geometrical shape boundaries.
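The shortest-safe-route calculation against shared occupancy boundaries can be illustrated with a breadth-first search over a coarse grid; the patent's routes are 3-D and continuous, so this is only a toy analogue:

```python
from collections import deque

def safe_route(start, goal, occupied, size):
    """Breadth-first search for the shortest route on a coarse 2-D
    grid, treating cells from shared (PMGW) occupancy data as walls."""
    frontier, came = deque([start]), {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < size and 0 <= nxt[1] < size
                    and nxt not in occupied and nxt not in came):
                came[nxt] = cell
                frontier.append(nxt)
    return None  # no safe route exists

# a wall of occupied cells (e.g. the building's shared boundary)
wall = {(2, y) for y in range(4)}          # blocks column x=2, y=0..3
route = safe_route((0, 0), (4, 0), wall, size=5)
print(route is not None and wall.isdisjoint(route))  # True: route detours
```

The straight line from start to goal crosses the shared boundary, so the planner detours around it, which mirrors the "shortest safe route" behaviour in the text.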
  • since the (PMGW) data device is used here to share data which is used to know, approach, handle, communicate, cooperate, engage... in addition to the further technical additions, modifications and applications which will be demonstrated throughout this invention, it will be titled the intelligent artificial telepathy (IAT) device 21 instead of the (PMGW) device 60.
  • (FIG. 9) Illustrates a flow chart briefing the (IAT) device 21, including the (PMGW) microprocessor 60 data use, for route and engagement collaboration management.
  • weather meters are not necessarily installed inside each (IAT); their data can also be collected via external sensors or meters.
  • a normal old vehicle or new autonomous vehicle 66, via its (IAT) device 21, will send by itself information and data to a cloud server over a network, to be shared with the current and expected neighborhood vehicles for up to the next 24 hours of navigation plans; all needed data for each autonomous vehicle, and most of it for non-autonomous vehicles, will include but not be limited to the following:
  • NVH (noise, vibration and harshness) data.
  • the vehicle's repeated historical behavior data and the analysis of its probabilities, such as attending the work location, visiting restaurants, visiting the cinema, and returning home, relative to the road map and timings.
  • the category or status of the vehicle: ambulance, firefighter, police car, school bus, municipality maintenance, road construction, caravan, bank, fuel, presidential parade, wedding procession, official procession, parade of consolation, car race, public (passengers / luggage), or private: salon, hatchback, sporty, 4WD, pickup, truck...
  • the vehicle driving performance and maintenance status e.g: low tire pressure, faulty devices, faulty sensors.
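A neighborhood-cloud payload combining the items above might look like the following; every key name and value is an assumption for illustration, not a defined standard:

```python
import json

# Illustrative payload a vehicle's IAT device might share with the
# neighborhood cloud server; keys are assumptions, not a standard.
payload = {
    "vehicle_id": "AV-0042",
    "category": "public/passengers",
    "navigation_plan_24h": [
        {"destination": "work", "depart": "07:30"},
        {"destination": "home", "depart": "17:10"},
    ],
    "repeated_history": {"restaurant_fridays": 0.8},
    "maintenance": {"low_tire_pressure": False, "faulty_sensors": []},
}

encoded = json.dumps(payload)   # sent over the network
decoded = json.loads(encoded)   # received by the server
print(decoded["maintenance"]["low_tire_pressure"])  # False
```

The server can merge such records from autonomous, semi-autonomous and non-autonomous vehicles alike, since the schema does not depend on the vehicle type.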
  • City urban areas include, but are not limited to, all physical matter existing over the barren surface of the Earth's soil, rock and water; as a result, the following, as a whole and in part, should be scanned and mapped via 3D mapping + time and computer-vision cameras to locate them for the flying and land vehicles, such that the flying and driven machines (AALVs 20, UAVs 22) do not need many tools and devices to recognize and observe these urban facilities and infrastructure; the city facilities tell and educate, while the (AALVs) 20 and (UAVs) 22 know and react.
  • the city facilities, which will be divided into geographical zones within tens or hundreds of meters, will be provided with a single built-in (IAT) device 21 and a set of distributed (Sub-IAT) devices 67 for minor static facilities, generally including but not limited to the following: 1- All types of infrastructure urban sites' 3D mapping and site plans.
  • 2- Buildings, towers, warehouses, stores, factories: shapes over and underground, geometry, topography, dimensions, including accessories, occupied space, GPS location and coordinates, perfect direction, levels, walls, ceilings, service rooms, tanks and windows dimensions, and all sides' tilt angles, in addition to protrusions, posts, road signs, columns, and windows' and doors' swept volumes; that is to mean, all (PMGW) data.
  • the perfect and accurate (PMGW) data for the autonomous aerial delivery machines 22 shapes, dimensions, including accessories, occupied space, GPS location and coordinates, perfect direction and height. (Refer to prior cited art)
  • the perfect and accurate (PMGW) data for the flats', offices', villas'... autonomous aerial delivery windows 75: shapes, dimensions, including accessories, occupied space, GPS location and coordinates, perfect direction and height. These windows' 75 address details are to be provided when an online order is made, or the address is mentioned as an in-home P.O. Box to receive aerial deliveries, wherein the window's 75 (Sub-IAT) device 67 will provide its full location details to the aerial delivery UAV 22: street no. (IAT) device 21 location and data, block no. (IAT) device 21 location and data, building no. (IAT) device 21 location and data, level no. (PMGW) location and data, flat or office (PMGW) location and data, window 75 no.
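The hierarchical address chain above (street, block, building, level, flat, window) amounts to walking nested (Sub-)IAT records down to the window's data. A toy sketch with invented names and values:

```python
# Hypothetical walk of the address chain a delivery UAV resolves:
# street -> block -> building -> level -> flat -> window.
city = {
    "street 12": {"block 3": {"building 7": {
        "level 4": {"flat 402": {"window 75": {
            "gps": (25.2001, 55.2710), "altitude_m": 14.5}}}}}},
}

def resolve(root, *chain):
    """Follow each (Sub-)IAT hop down to the target window record."""
    node = root
    for hop in chain:
        node = node[hop]
    return node

window = resolve(city, "street 12", "block 3", "building 7",
                 "level 4", "flat 402", "window 75")
print(window["altitude_m"])  # 14.5
```

Each hop corresponds to one (IAT) or (PMGW) record in the chain the text lists, so the UAV never searches; it only dereferences shared data.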
  • 11- Aerial facility rooms 68 full (IAT) device 21 data, including GPS location data and spatial coordinates; these smooth-shaped rooms are to be installed normally over high-rise buildings and crowded low-level buildings, wherein each room will have three gates 69; each gate opens to a firefighter extinguisher drone set 70 located on shelves, or aerobatic façade cleaners 71, or a local pick-up drone set 72 to unload parcels 73 from bigger aero-carriers 74 and deliver them to the target windows 75 (Fig.
  • IAT device 21 data including shapes, dimensions, all data related to their GPS locations.
  • Figure 11 illustrates a 3D view for samples of autonomous aero-land vehicles (AAVL) 20, the flying Sedric vehicle 42, container aero-carriers 105, autonomous vehicles 66, UAVs 22, flying robots 26 (either winged, carried on a flyboard, or provided with a jetpack), manipulators 107, vehicle aero-carriers, UAV aero-carriers 74, a local pick-up drone 72, an aerial facility room 68, a container aero-carrier 105...
  • the server 61 and the city central artificial telepathy station (CATS) 112 collect instant data from all of these, and communicate data back to each of them, updating it with (PMGW) data about each part it needs to cooperate with, or to have a route near to, in addition to expanded artificial intelligence data manipulations to facilitate their movement, transportation and collaboration.
  • the current tests and trials on autonomous vehicles 66 do not reflect the actual case when tens or hundreds of autonomous vehicles 66 of different types from different manufacturers are moving on part of the streets' lanes in-between tens or hundreds of other normal self-driving vehicles 77 of different types, sizes, and uses, with driving styles and manners that differ from country to country, from city to city, and from road to road.
  • a clear vision proposing creative visions and solutions ahead of such negative probabilities should not be limited to the prior art related to collaboration in-between the autonomous vehicles 66 themselves, without including the self-driving vehicles 77 in the collaboration and modification; the real vision is achieved by creating methods and devices that match these autonomous and normal machines in a synchronized manner, and by bridging the communication in-between them in an HSE systematic harmony. Here, the following methods will be followed to create technical solutions, technical methods, technical bridging and synchronization to let the autonomous vehicles 66 and normal vehicles 77 adapt to each other, in search of solving such an issue as an urgent prior need, while many governmental authorities and transportation means manufacturers have to hold many international summits, conferences, meetings, researches, inventions, regulations... to formalize a unified system for (IAT) devices 21 and any other facilities to let different transportation means from different manufacturers collaborate and adapt to any country's regulations.
  • (IAT) devices 21 in all existing self-driving (non-autonomous) vehicles 77 which are using the roads, including: bikes, sporty vehicles, hatchback vehicles, salon vehicles, 4WD vehicles, trucks, lorries, caravans, trains, and towed boats or containers... etc.
  • (IAT) devices 21 should be programmed, set, and uploaded with all of the possible data already listed in this invention, in addition to (PMGW) data microprocessors 78, integrated circuits 79, micro-meters 80 and micro-devices 81, and in addition to being connected with the vehicle's instrument cluster 82 and navigation system 83; they will either be provided with an assistant (IAT) display 84 or, if available, use the existing vehicle displays 84. Each should receive all data about the self-driving motion parameters: speed, acceleration, deceleration, direction, body tilt coordinates, steering angle, turn signal lights, reverse and tail lights, fog light, braking, and fault messages, while also being connected with the semi-autonomous vehicles' 85 and fully autonomous vehicles' 66 (IAT) devices 21 via a cloud server receiving data from a neighborhood group of vehicles of three types (autonomous 66, semi-autonomous 85 and non-autonomous vehicles 77).
  • the self-driving vehicle's 77 (IAT) device 21 will issue the instructions on the display screen 84 and announce them through speakers 86 to the driver, who can communicate with it through a voice recognition system 87; during this stage the vehicle can also run simple-level dialogs with the driver or passengers, related fully to road issues and simply to their personal needs; these messages include but are not limited to the following:
  • the (IAT) devices 21 in the self-driving vehicles 77 are set to receive data entered by the drivers about their daily destinations and timings, driving method, and speed mode; these data are to be used by the servers to carry out assessments of the short-term, mid-term and long-term traffic on each street and road, depending on data received from all types of vehicles equipped with (IAT) devices 21, not only the autonomous ones 66. That is, to recommend to the drivers in advance the best road to reach their destinations, wherein the driver should inform the system whether he accepts the recommended road or not, while the server will be controlling the autonomous vehicles 66 to follow its road map instructions after they are offered to the passengers.
  • the (IAT) devices 21 installed inside the non-autonomous vehicles 77 will greatly assist the autonomous vehicles 66 in knowing which parking space is empty, as the current art requires an autonomous vehicle 66 to use all of its devices to scan, observe and search for a parking space in-between other non-autonomous vehicles 77; but when the non-autonomous vehicles share their exact location, with shape spatial dimensions and coordinates, with the autonomous vehicles 66, then the autonomous vehicle 66 will know directly where the available parking space is and move directly towards it without circling and searching. This is also very helpful where two or more yards are completely full: the autonomous vehicle 66 does not need to waste time searching inside them for a parking space; in another way, it can know if a self-driving vehicle 77 has just started leaving its space, as the (IAT) device 21 inside an autonomous vehicle 66 will track the occupied space of all vehicle types and will be updated about any self-driving vehicle 77, or any of the other types, leaving its occupied parking space; but if there are many autonomous vehicles 66 waiting, then
• the parking issue is solved for the self-driving vehicles 77 too, as these also receive shared data from a local server about the empty parking spaces.
• A further option can be added wherein two vehicles, one autonomous 66 and one non-autonomous 77, with relatives or colleagues inside them, are looking for two nearby spaces in a car park: the autonomous vehicle 66 is guided automatically while the non-autonomous one 77 is guided via instructions issued to the driver, such that both approach two parking spaces close to each other.
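The shared-occupancy parking lookup described in the bullets above can be sketched as follows. This is a minimal illustration only: the data structures, function names and the nearest-pair search for two related vehicles are assumptions for this sketch, not part of the disclosure.

```python
from itertools import combinations

def free_spaces(all_spaces, occupied_reports):
    """Spaces not covered by any (IAT)-reported occupancy record,
    so a vehicle can head straight to one without searching."""
    occupied = {r["space_id"] for r in occupied_reports}
    return [s for s in all_spaces if s["id"] not in occupied]

def nearest_pair(spaces):
    """Pick the two free spaces closest to each other, for two
    related vehicles that want to park side by side."""
    def dist2(a, b):
        return (a["x"] - b["x"]) ** 2 + (a["y"] - b["y"]) ** 2
    return min(combinations(spaces, 2), key=lambda p: dist2(*p))
```

A server holding the yard layout plus the occupancy reports from all (IAT) devices could answer both the single-space and the two-nearby-spaces queries with these two calls.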
• the collaboration between the autonomous vehicles 66 and self-driving vehicles 77 is governed not only by the (IAT) devices 21, which should be created according to international standards binding all vehicle manufacturers so as to unify the collaboration methods, but also by telepathic intelligent learning systems (TILS) 88 related to bridging such collaborations, which should be installed in all (IAT) devices 21 and neighborhood servers 61 to learn, analyze, educate, guide and instruct either drivers or autonomous vehicles 66 in the same way. b- Traffic lights 89: wherever a set of traffic lights 89 at a road crossing knows in advance how many vehicles are approaching from each side, it can control the lighting such that the highest number of vehicles pass in the least time, in a fully optimized manner. To this end, (IAT) devices 21 installed inside traffic lights 89 can calculate how many non-autonomous 77, semi-autonomous 85 and autonomous 66 vehicles are moving toward them, as all these vehicle types are tracked via their (IAT) devices 21, which share and track their own vehicles' location, motion and direction parameters... In such a way,
  • the driver can answer by saying:
• the server 61 will assist him by providing a space in-between the autonomous vehicles 66, once he turns on his turn signal lights 90.
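The count-based signal timing described under b- Traffic lights 89 can be sketched as a proportional green-time split across approaches. The cycle length, minimum green interval and function name below are illustrative assumptions, not values from the disclosure.

```python
def split_green_time(approach_counts, cycle_s=90, min_green_s=8):
    """Allocate a fixed signal cycle across approaches in proportion
    to the number of vehicles the (IAT) devices report approaching
    from each side, never dropping below a minimum green interval."""
    total = sum(approach_counts.values())
    n = len(approach_counts)
    spare = cycle_s - n * min_green_s   # time left after minimum greens
    if total == 0:
        return {a: cycle_s / n for a in approach_counts}
    return {a: min_green_s + spare * c / total
            for a, c in approach_counts.items()}
```

With counts {"N": 30, "S": 10, "E": 0, "W": 0} the busy approach receives most of the spare time while empty approaches keep only the minimum green; a real controller would also bound phase transitions and pedestrian intervals.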
• 2- Non-autonomous vehicle 77 drivers are also offered to join convoys of non-autonomous vehicles.
• Non-autonomous vehicles can pass their messages (signal lights, braking, high beam) to autonomous vehicles 66 not by being observed via the devices available in the art, which are installed on autonomous vehicles 66 to observe the physical lights, but via the local server 61, which shares these data received from the (IAT) devices 21 of the non-autonomous vehicles 77. Meanwhile, the autonomous vehicles 66 pass their messages to the non-autonomous vehicle's 77 driver via lighting and, if there is no response within a preset time, via voice messages; otherwise the driver may receive a fine for a traffic violation according to a point-accumulation system, depending on the seriousness of the case.
• Non-autonomous governmental vehicles 91 (police, ambulance...) are to be defined and identified by both autonomous vehicles 66 and private or public non-autonomous vehicles 77 according to the type of the vehicle: ambulance, police car, presidential vehicle, road construction machinery, school bus... etc. For example, both private autonomous 66 and non-autonomous vehicles 77 are warned ahead about a police car 91 approaching from behind, whereupon the non-autonomous vehicle 77 driver should take action, while the autonomous vehicle 66 responds directly.
• the driver will be warned directly via the location of the truck 91, shared by its (IAT) device 21 with the local server 61, which shares it with the (IAT) device 21 of the speeding non-autonomous vehicle 77.
• an action may be instructed by the (IAT) device 21 and carried out by a semi-autonomous 85 or non-autonomous vehicle 77 by automatically activating engine cut-off via the digital motor electronics (DME) or engine control unit, or by activating the ABS, in addition to activating the signal lights or hazard lights to warn other vehicles of this situation, and by instructing the driver to return to his lane.
• the (IAT) devices 21 in autonomous 66 and non-autonomous 77 vehicles not only provide data about spaces in car parks, but also about how near an area is to a fast-food restaurant, bank... or whatever else is currently crowded, and how many persons are waiting in the queue.
• f- Street radars 94: installing the (IAT) devices 21 in existing non-autonomous vehicles 77 allows monitoring of the vehicle speed, wherein exceeding the speed limits, crossing wrong lines, parking in wrong or non-allowed places, and racing in-between radars are all monitored. Warning voice and written instructions are given to the drivers, and traffic fines should be issued instantly and scaled by ratio depending on the location; on whether the violation was momentary, mandatory and temporary due to a serious situation, or without reason; and considering too the duration of the violation, disturbing the autonomous vehicles 66, crossing a traffic light 89, or violating the instruction to open the lane in advance for police cars 91, ambulances... etc.
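The scaled fining described under f- can be sketched as below: the fine grows with how far over the limit the (IAT)-reported speed trace goes and for how long, and is waived for a justified momentary manoeuvre. The base fine, 1-second sampling and scaling rule are assumptions for illustration only.

```python
def speeding_fine(speed_trace, limit_kmh, base_fine=100.0,
                  justified=False):
    """Fine scaled by the worst overspeed ratio and the violation
    duration in an (IAT)-reported (time_s, speed_kmh) trace; a
    violation flagged as a momentary, justified manoeuvre is waived."""
    over = [(t, v) for t, v in speed_trace if v > limit_kmh]
    if not over or justified:
        return 0.0
    worst_ratio = max(v for _, v in over) / limit_kmh
    duration_s = len(over)              # assuming 1-second samples
    return round(base_fine * (worst_ratio - 1.0) * duration_s, 2)
```

A trace peaking at 140 km/h for two seconds in a 100 km/h zone would thus cost 80.0 under these assumed constants, while the same trace during a justified evasive manoeuvre costs nothing.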
• h- Adaptive headlight control: the headlights in new vehicle models turn right or left automatically with the turning of the steering wheel. This may be helpful at very low speeds, but as the speed increases, the light turning lags behind the steering wheel, which means the driver cannot see and observe the turn before turning the wheel. Through the (IAT) devices 21, specifically when the road map is selected and the street has one lane, or all lanes turning together in one direction, these lights can be turned right or left just before the steering wheel is turned, to navigate the road, or even by keeping one light consistent with the steering wheel while the other turns ahead of it, depending on the pre-selected road, turn locations, and speed.
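The route-ahead headlight aiming described under h- can be sketched by pointing the beam at the point on the pre-selected route that the vehicle will reach in a fixed lookahead time, so the aim leads the steering at speed. The 1-m route spacing, lookahead time and function name are assumptions of this sketch.

```python
import math

def headlight_angle(route, pos_idx, speed_mps, lookahead_s=1.5):
    """Beam offset (degrees, left positive) from the current heading
    toward the route point the vehicle will reach in `lookahead_s`
    seconds; `route` is assumed to be 1-m-spaced (x, y) points."""
    ahead = min(pos_idx + max(1, int(speed_mps * lookahead_s)),
                len(route) - 1)
    x0, y0 = route[pos_idx]
    x1, y1 = route[pos_idx + 1]          # heading from current segment
    xa, ya = route[ahead]                # lookahead target point
    heading = math.atan2(y1 - y0, x1 - x0)
    target = math.atan2(ya - y0, xa - x0)
    return math.degrees(target - heading)
```

On a straight stretch the offset is zero; approaching a corner at speed, the lookahead point is already around the bend, so the beam swings before the steering wheel does.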
• i- Autonomous city airport: during the first stage, a multiple-phase project should be launched to establish an autonomous city airport, starting with a feasibility study, then selecting a site and carrying out urban planning for four terminals: an Autonomous Aero-Land Vehicles (AALV) 20 terminal, a UAVs 22 terminal, a logistics terminal, and an autonomous aerial reception and delivery terminal, up to starting construction of the infrastructure.
• the UAVs 22 civil service implementation will then start; it will not cover every part of the city, but will be limited to the following, which are provided with (IAT) devices 21:
• UAVs 22 services can cover parcel 73 delivery, firefighting and facade cleaning.
• the UAVs 22 should also be provided with (IAT) devices 21. The handling of parcels to the UAVs 22 can be manual at first, then under human control, then automated, while delivery will be to autonomous ground stations (refer to prior cited art) provided with (IATs) and located inside the reception area for easy access by residents or employees, as easily installable units requiring no modifications to the buildings.
• New buildings are to be provided with autonomous aerial delivery conveyors, delivering parcels received on the building roof or ground floor, through an elevator, via a flat- or office-specific window 75 into a specific chamber (refer to prior art).
• Firefighters carried by drones (UAVs) 22: here, every 2.0 km, a firefighter drone, normally carrying a set of three fire extinguishers, is located at the right place in a forest where it can easily access a fire. Within around one minute after the first warning is received, four firefighter drone sets 70 can attend to carry out firefighting, and within 5 minutes around 15 firefighter drone sets 70 can attend, while according to the distance distribution in figure 13, within the first hour hundreds of these will be attacking the fire and surrounding it from all sides; this arrangement lowers the chances of the fire extending to extreme conditions.
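The response-time reasoning above (a 2.0 km drone grid, with more drones arriving as the reachable radius grows) can be sketched by counting grid-parked drones within flight range of a fire. The square-grid layout, drone speed and function name are assumptions of this sketch, not figures from the disclosure.

```python
import math

def reachable_drones(fire_xy, spacing_km, grid_n, speed_kmh, minutes):
    """Count firefighter drones, parked on an n-by-n square grid with
    the given spacing, that can reach the fire within `minutes` when
    flying straight at `speed_kmh`."""
    radius = speed_kmh * minutes / 60.0   # reachable flight radius, km
    fx, fy = fire_xy
    count = 0
    for i in range(grid_n):
        for j in range(grid_n):
            x, y = i * spacing_km, j * spacing_km
            if math.hypot(x - fx, y - fy) <= radius:
                count += 1
    return count
```

For a fire at the centre of a 2-km grid cell and an assumed 60 km/h drone, the four nearest drones are in range within two minutes, and the whole 6-by-6 neighbourhood within ten, which mirrors the "few at first, hundreds within the hour" pattern of the text.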
• Firefighter extinguisher drones 70 are to be distributed on highways every 10 km inside docking stations, where one or more standby sets of firefighter cylinders are kept and arranged properly, such that the drones can return their empty ones and pick up new full ones. That is, each firefighter docking station is provided with an extra set of filled extinguisher cylinders, so the drones can return and replace their empty ones (depending, of course, on its own (IAT) device 21).
• these aerocarriers 105 are provided with vertical take-off and landing capability (Fig. 3 (A-C, F) and Fig. 4). Thus, depending on their engines, either ducted propellers or jet propulsion or both with a top traditional propeller, they can approach the fires within less than ten minutes. If the weather is windy, they can play a further role by splashing fluids via pumps, or spreading powder through extending nozzles, over the flying ash to cool it before it falls and causes new fires nearby, as this flying ash is a main reason for fires spreading many kilometers, and sometimes tens of kilometers, away from the first spot.
• firefighting drone sets 70 every 10 km act as static providers, where new refilling technologies may be applied for fast refilling of the empty extinguishers; in addition, fire extinguishers are carried by trucks acting as dynamic providers located every 50 km, wherein the firefighter drone (IAT) devices communicate with the truck (IAT) to know where to meet to hand over their empty fire extinguishers and receive new ones.
• Aerocarrier 105 empty vessels 44 refilling centers are also to be provided every 50 km.
• FIG. 13 illustrates a rough distribution map of the firefighting tools spread over part of a forest; these tools are not drawn to scale and are not repeated on each part of the map, as the map is for illustration only.
• the symbols have the following meanings (the graphic symbols themselves are not reproduced here):
• Firefighter drone set 70.
• Firefighter cylinders warehouse, full vessels 44, or vessel refilling center.
• Firefighter truck with a top drone-set arrangement, carrying a firefighting fluid hose.
• (IAT) devices 21 are an umbrella term for a wide range of solutions: bridging and linking the different vehicles, accustoming drivers of old vehicles over the long term to cooperating with them and being ready to use them, and even solving many other road safety and comfort issues, while inspiring inventors to find more applications for them in the near future.
• a- Computer vision panorama cameras 95: the obstacle to using them in autonomous vehicles 66 is that, even though under some conditions they are relatively capable of reconstructing real-time 3D views, in the case of autonomous or non-autonomous vehicles 77 an absolute, or at least better, reconstruction of each 3D object in the views is a necessity. To overcome this issue, it is proposed here that these computer vision cameras 95 be created as panorama cameras installed on top of autonomous 66 and semi-autonomous 85 vehicles.
• the local server 61 should be upgraded with such technologies to process, assign, title and identify all similar static and moving bodies appearing on a set of vehicles' computer vision cameras, so as to know how to deal with them via commands sent to the vehicles.
• the server 61 will identify the person and his trolley and assign them an identity (Unit (x): moving person + trolley), as if the person and his trolley were equipped with a recognized (Sub-IAT) device 67. This artificial 3D view reconstruction can also be supported by installing multiple panoramic computer vision cameras 95 on the most critical parts of a road, to pick up views which perfectly reconstruct actual 3D views from all sides with the actual dimensions, to be shared with the local (neighborhood) server 61. This whole system can be called collaborative reconstructive computer vision (CRCV) 98, wherein no more old normal-vision cameras are used, while the new ones start to occupy new installations and replace old ones.
• These cameras 95 can be a better substitute for the bulky Lidars and other observation devices (radar, sonar, laser vision, distance sensors...) during this stage too. This is not only because the cameras are compact, cheaper, need less service, have no motorized mechanisms and are less likely to be faulty, and not only because these cameras 95 show real-time live video, but because the computer vision technology will process the instant images sent from the live real-time camera 95 views to a microprocessor, which processes these images via computer vision algorithms recognizing the shapes and objects within them, their dimensions, and even their names, predicted habits and motion style, depending on technologies available in the art, for every physical item on the road, moving or static: vehicles, bikes... humans, livestock...
• in non-autonomous vehicles 77 they will relatively assist with autonomous braking when a driver is about to hit an object which has an (IAT) device 21, such as vehicles identifying their occupied space, dimensions and distance, so that they will not enter the space of the observed vehicle plus a safety distance enveloping its 3D shape/geometry, e.g. 0.75 m.
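The occupied-space-plus-safety-distance check above can be sketched as an axis-aligned box inflated by the margin and tested for overlap. Representing the shared geometry as a 2D bounding box, and the function names, are simplifying assumptions of this sketch.

```python
def inflate(box, margin):
    """Grow an axis-aligned occupied-space box (xmin, ymin, xmax, ymax)
    by a safety margin on every side."""
    xmin, ymin, xmax, ymax = box
    return (xmin - margin, ymin - margin, xmax + margin, ymax + margin)

def violates_envelope(own_box, other_box, margin=0.75):
    """True if our geometry enters the other vehicle's occupied space
    plus the safety envelope (0.75 m in the example above)."""
    xmin, ymin, xmax, ymax = inflate(other_box, margin)
    oxmin, oymin, oxmax, oymax = own_box
    return not (oxmax < xmin or oxmin > xmax or
                oymax < ymin or oymin > ymax)
```

A braking assist would trigger as soon as the predicted own box at the current speed would make `violates_envelope` true for any (IAT)-reported neighbour.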
• It can also provide and share live real-time views ahead with non-autonomous vehicles 77 at the back, wherein a driver can select to view the road for the next 50 m, 250 m, 500 m or 1,000 m, via receiving live views shared from both autonomous 66 and non-autonomous 77 vehicle panorama cameras.
• These views can be viewed on the screen (display) 84, or projected on a head-up display 94, or, in a further device created in this invention, shown by installing a holographic dome 99 over the dashboard on the driver's side, to make the holographic real-time view inside the dome visible despite daylight or street light; the dome glass should be covered with a dark tinting film such that outer light does not get inside. As a result it will be dark inside, while the inner projected light creating the holographic 3D real-time views remains observable.
• the computer vision panorama cameras 95 will take the major role in monitoring and controlling the vehicle's motion parameters, rather than data collected from Lidars, radars, distance sensors... During this stage the panorama computer vision cameras 95 can be installed over the Lidars 49, replacing them later when more autonomous vehicles 66 are equipped with panorama cameras 95 plus their (IAT) devices 21 locating them, whether static or in motion. All of this is in addition to having live real-time updates about the road situation, not only from the autonomous vehicles but also from the non-autonomous vehicles 77, and to the data stored and updated each second, minute, day, week, month, year and beyond via the telepathic intelligent learning system (TILS) 88 in each local server 61, wherein the roads and their borders are scanned and viewed daily by the cameras 95 from all types of vehicles, which draws a fully detailed view of the road and each part of it, even road cracks, and each physical item on the road side: shops, barriers...
• UAVs 22 services can cover parcel 73 delivery, firefighting and facade cleaning. To carry out parcel 73 delivery, the UAVs 22 should also be provided with (IAT) devices 21; these devices are also installed inside each pick-up and delivery facility (building) to locate items to be picked up from belts 101, shelves 102 or windows 75, such that when an order is received online, it is sent electronically to an assigned UAV 22 along with the pick-up line, shelf 102 and belt 101 details, which are located via an (IAT) device 21 installed there, providing full data about the location's 3D spatial coordinates: location of warehouse, shelf set, shelf line, and exact location of the item on a shelf 102, belt 101 or window 75. The UAV 22 will fly to that specific part of the space following the preset route; it does not need to scan labels or bar codes on the item, nor to use laser communication with any device to confirm the right item; it just picks from the mapped location in space where the item is located.
• the whole shape of the warehouse is located in space according to GPS spatial coordinates or (PMGW) parameters, including all its dimensions, borders and surfaces; then, after the UAV 22 gets inside, the whole warehouse inner shape is provided, including shelves, lights... etc., perfectly in relation to their actual shape, dimensions and location; then, as the UAV 22 moves toward the specific shelves 102, the cell in the shelves 102, numbered with a reference number connected to its (PMGW) spatial location coordinates, is located.
• the whole warehouse, including every physical part inside it, even tiny protrusions, is provided in its actual 3D shape connected to its spatial coordinates. The whole is divided into 3D grids of 1 mm length or less, and these grids' spatial coordinates are defined, like in a screen-mesh search for spare parts, but here the search is through an actual 3D gridded system proceeding from bigger units to small, smaller, micro, tiny... such that when a UAV 22 flies toward a warehouse it navigates to it as a major grid; when it approaches, it enters via a gate which has a defined shape and dimensions, including its fine 3D grid dimensions in space; then the UAV (drone) 22 crosses the defined safe route towards the specific shelf 102, then once it approaches the item (parcel)
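The big-to-tiny grid search above can be sketched as hierarchical grid addressing: a point in millimetres resolves into a coarse (warehouse-scale) cell, then a shelf-scale cell, then a 1-mm cell. The specific level sizes and function name are assumptions of this sketch.

```python
def grid_address(point_mm, levels=((1_000_000, 1_000_000, 1_000_000),
                                   (10_000, 10_000, 10_000),
                                   (1, 1, 1))):
    """Resolve a point (x, y, z in mm) into coarse-to-fine grid
    indices: e.g. km-sized cells, then 10-m cells, then 1-mm cells,
    mirroring the major-grid-to-tiny-grid search described above."""
    address = []
    rest = list(point_mm)
    for cell in levels:
        idx = tuple(c // s for c, s in zip(rest, cell))     # this level
        rest = [c % s for c, s in zip(rest, cell)]          # remainder
        address.append(idx)
    return address
```

A UAV would navigate by matching addresses level by level: first the warehouse cell, then the shelf cell inside it, then the exact millimetre cell of the parcel.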
• the aerocarrier 74 can be guided in the same way near to a belt 101 or robot 26 to be filled with parcels 73, wherein each parcel 73, already identified with its previously occupied space on a shelf 102 cell, once picked up by a robot 26 or pushed onto a belt 101, has its location tracked by the (IAT) device 21 according to: 1- the time when it is pushed onto the belt 101 and the speed of the belt 101, hence the time when the UAV 22 approaches the belt 101; then the parcel's 73 sequence number when it is pushed inside the aerocarrier
• the artificial telepathy data communicated in-between the (IAT) devices 21, relating specific shapes from specific spatial coordinates to new spatial coordinates, with the distance crossed over a calculated time specifying their new locations, are enough to track, identify and locate a physical body.
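Dead-reckoning a parcel from push time and belt speed, as described above, reduces to simple kinematics; no label or barcode scan is involved. The function names and the assumption that the parcel waits at the belt end are illustrative.

```python
def parcel_position(push_time_s, belt_speed_mps, now_s, belt_len_m):
    """Where along the belt a parcel is, knowing only when it was
    pushed on and how fast the belt runs."""
    travelled = (now_s - push_time_s) * belt_speed_mps
    return min(travelled, belt_len_m)   # parked at the end once there

def pickup_eta(push_time_s, belt_speed_mps, belt_len_m):
    """Time at which the parcel reaches the belt end for UAV pick-up."""
    return push_time_s + belt_len_m / belt_speed_mps
```

The UAV's (IAT) device 21 can schedule its approach for `pickup_eta`, and the parcel's identity follows from its position in the push sequence rather than from any scan.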
• the Aerocarrier 74 or UAV 22 will leave the warehouse and engage a specific preset route towards its destination, always adjusting itself to stay on route. In case it might hit a bird, or an idle falling UAV 22, it can be provided with a panoramic computer vision camera 95 to avoid such rare incidents and to readjust its track back to its original route.
• the Aerocarrier 74 or UAV 22 loaded with parcels 73 should have its (IAT) device 21 in contact with both an (IAT) device 21 in the local pick-up drones 72 at the destination and an (IAT) device 21 in an aerial room 68 (for drones) or warehouse installed over a building, housing a local pick-up drone 72 set, a firefighter drone set 70, and a facade-cleaning drone set 71 (Fig. 10 (A-C)).
• the Aerocarrier 74 or UAV 22 (IAT) device 21 will be in contact with local servers 61 to share, and also receive, the spatial location and motion parameters, not only about the neighborhood UAVs 22 to avoid collision... etc., but also to receive data about all of the high-rise physical objects via their (IAT) devices 21.
• a tower's whole (PMGW), including: shape, dimensions, and coordinates of it as a whole and as partial grids in space.
• the Aerocarrier 74 moves directly toward specified spatial coordinates in space according to the gridding system where the local pick-up drones 72 are located. Once its (IAT) device 21 finds that it is close to the room 68, by measuring the distance in-between its body (geometry) spatial location and the UAVs 22 room 68 geometrical spatial location, it will slow down gradually according to the distance; it will adjust its direction toward the direction and coordinates of the room's 68 gate 69, as received from the room's (IAT) device 21, and keep adjusting its distance, speed and direction according to the spatial coordinates in-between them until it arrives, whereupon the gate 69 is opened. The pick-up drone 72 will receive signals about the spatial coordinates of the Aerocarrier 74 or UAV 22, and so it will adjust itself toward the gridded coordinates of the parcel 73, as received from its occupied gridded shape, dimensions, coordinates, height...
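The gradual slow-down by distance described above can be sketched as a speed command proportional to the remaining (IAT)-to-(IAT) distance: full speed far out, a linear taper inside a slow-down radius, and a full stop at the gate. All thresholds and names are assumptions of this sketch.

```python
import math

def approach_speed(drone_xyz, gate_xyz, v_max=8.0, slow_radius=20.0,
                   stop_radius=0.5):
    """Commanded speed (m/s) from the distance between the drone's
    shared geometry location and the room gate's coordinates."""
    d = math.dist(drone_xyz, gate_xyz)   # straight-line distance
    if d <= stop_radius:
        return 0.0                       # arrived at the gate
    if d >= slow_radius:
        return v_max                     # still far: cruise speed
    return v_max * (d - stop_radius) / (slow_radius - stop_radius)
```

Run in a control loop with fresh coordinates each cycle, this yields the "keep adjusting distance, speed and direction" behaviour of the text; a real controller would add heading control and acceleration limits.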
• the local pick-up drone set 72 receives data about each parcel 73 according to its specific gridded coordinates, so when a local pick-up drone 72 adjusts itself to pick up something beneath it, it knows that this thing is a parcel 73 referring to order no.:..., address:..., local pick-up drone 72 no.:... As a result, the pick-up drones 72 one by one start to unload the parcels 73, such that each one carries a parcel 73, adjusts its coordinates to point its front toward the gate 69 exit, and then, according to the delivery procedure, the local pick-up UAV (drone) 72 flies as follows:
• the drone 72 receives from the (IAT) device 21 in a building the exact gridded spatial coordinates... of the specific aerial delivery window 75, and adjusts itself to follow a safe, obstacle-free route toward that window at an artificially calculated speed. There, the window 75 opens for dropping the parcel 73 on a table or shelf, as instructed by the signals sent from the building's (IAT) device 21 or the (Sub-IAT) device 67 installed in the window 75, wherein the building (IAT) device 21 calculates the distance, angles, coordinates and height of the pick-up drone 72 according to location data received from the pick-up drone 72 itself for the delivery address details. When it finds that the drone's geometry is approaching very close to the window 75, it either sends a signal to the window 75 motorized mechanism to open, or instructs the (Sub-IAT) device 67 of the window 75 to do so; then, according to calculations of (PMGW) in-between both (IATs), it is computed by how much the local drone 72 should move: distance, direction, tilt, adjust
• the building telepathic device 21 will locate the spatial coordinates for the pick-up drone 72 to navigate toward the delivery station, conveyor elevator or belt, which may be provided with a (Sub-IAT) 67.
• the smoke and fire detectors and sensors communicate instantly with the building (IAT) device 21, which knows from preset data and connections their locations: which floor, which flat and even which room. As a result, the spatial coordinates of an office/flat/floor on fire are directly and precisely located, and a message is sent to a firefighter drone 70 set, which follows a route toward the spatial coordinates of the fire. The firefighter drone 70 set can either follow the stair space for inner fires, or be guided to fight the fire from outside the building; the firefighting is carried out according to heat sensors 103 sensing the hottest spots to be extinguished, or under monitoring from the civil defense station. These firefighter drones 70 can be provided with long metallic rods with metallic ball ends that can break a window 75 glass to get inside if necessary.
• the facade cleaners 71 can be preset for whole-building façade cleaning, and can additionally be booked for private cleaning of specific windows 75 of some offices. Given the whole building façade's spatial dimensions, including fully gridded details and spatial dimensions of the protrusions, the façade cleaner drone is guided in the same way down the façade from top to bottom; when it faces a protrusion it will not hit it, nor does it need distance sensors or cameras to distinguish it, nor remote control. It just receives the order, timing and specific façade part, moves there to connect to a retractable thin water hose, and starts cleaning; wherever it meets a protrusion, as specified and located by the building (IAT) device 21 compared to its own (IAT) device 21 data, it simply passes it by moving its geometry boundaries a specific measured distance away from the protrusion's occupied-space geometrical boundaries at the specific heights, and then back to the façade, according to a comparison and computation of distance/angles...
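The protrusion-avoidance rule above, step the drone's geometry away from the façade wherever the shared building model reports a protrusion at the current height, can be sketched as a stand-off profile. Representing protrusions as (top, bottom, depth) height bands and the clearance value are assumptions of this sketch.

```python
def facade_offset(height_m, protrusions, clearance_m=0.4):
    """Lateral stand-off from the facade at a given height: zero on a
    flat wall; wherever the building (IAT) model reports a protrusion
    (top_m, bottom_m, depth_m) spanning that height, the protrusion
    depth plus a clearance."""
    for top, bottom, depth in protrusions:
        if bottom <= height_m <= top:
            return depth + clearance_m
    return 0.0
```

Descending the façade, the cleaner drone would evaluate `facade_offset` at each height and shift its geometry boundary outward by that amount, with no onboard distance sensing needed.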
• the drones' warehouse (room) 68 is provided with outer top solar cells 104 for charging the inner drones' batteries (Fig. 10).
• c- Interchangeable Aero-Land vehicles 20: these too are provided with telepathy devices 21. In the air, these guide them and share data following the same procedure as for UAVs 22, while for landing in an autonomous city airport they book their landing field and spot in advance, communicating their arrival time and passing it to the main (IAT) device 21 managing the whole traffic over the autonomous airport, and to the (AALV) 20 terminal (Sub-IAT) devices 67, which provide spatial coordinates for them to carry out a safe landing; the same applies to take-off, whether for individual (AALVs) 20 or convoys. While moving on land roads, the (AALVs) 20 move exactly like land autonomous vehicles.
• d- Autonomous city airport: the autonomous city airport will be ready at the start of the second stage to receive (AALVs) 20, while during the next phases, via a shipping terminal, it will handle aerial containers carried by container-aerocarriers 105 (Fig. 3), wherein each terminal and its facilities, yards, and take-off and landing spots are provided with (IAT) devices 21 to control locating and moving the machines and containers autonomously.
• Autonomous 66, semi-autonomous 85 and non-autonomous 77 vehicle voice recognition will be upgraded from simple questions and dialogs to medium-level dialogs, wherein the vehicles can run a medium-level dialog with the driver or passengers related to their private requests, contact them while they are out to meet them for pick-up, or go and pick up some items.
• This stage is the final stage, the aim of which is to change the city to be fully autonomous: all physical dynamic items, plus the static items in contact with them, will be fully equipped with either (IAT) 21 or (Sub-IAT) 67 devices, while man-driven machines will be decreased to the minimum, such that: a- Non-autonomous vehicles 77 nearly disappear from all the roads in modern cities, while agricultural machines in the farms (harvesting, irrigating, spraying... the plants), construction trucks and machines, oil-field drilling machines, sea boats, ships, yachts, submarines, helicopters, aircraft, and maybe satellites and space stations, will be either fully autonomous, or at least equipped with suitable (IAT) 21 or (Sub-IAT) 67 devices.
• Autonomous vehicles 66 become more self-dependent; they can do tasks without carrying passengers. They just inform the owners or users, discuss with them, and arrange for substitute vehicles, either from an auto-dealer service, car rental, or other home vehicles that can substitute for them and work in their place when they are busy. Examples include, but are not limited to: 1- Fixing an appointment with an auto dealer for a periodic checkup, service, maintenance, repair, or update of its electronic programs: they do not need to carry passengers, they can visit the auto dealers by themselves, wherein the gates open autonomously for them and they know over which lifter to park; mostly robots 26 or manipulators 107 provided with (IAT) 21 or (Sub-IAT) 67 devices can move to the specific parts of the vehicle whose (IAT) device 21 locates and shares its spatial location, such that the robot can localize and approach specific plugs, data diagnostic connectors, defective parts, screws, batteries and wheel bolts according to the gridding system of the vehicle in space; at the end of service the
• the self-dependence of the autonomous vehicle includes the following: 1- moving by itself to receive a wireless electric charge from a nearby charging facility. 2- Carrying students to school and coming back alone, then bringing them back on time... 3- No more valet parking, as the vehicle 66 will find its clear way toward a free (empty) parking space without searching, and will come back once it is called, so no hired or employed drivers are required. 4- On the highways, at the end of the third stage, single autonomous vehicle 66 convoys can include hundreds of vehicles, wherein traffic smoothness will replace heavy traffic.
• 6- Autonomous ambulances 109 can communicate with UAVs 22 carrying an ambulance capsule 110, for loading and evacuating an injured person (horizontal capsule 110) or a stuck person (vertical capsule 110) from a nearby narrow place in-between buildings, mountains, a nearby island... deep valleys... or an unapproachable location which is under fire, and receive them at a specific agreed location (Fig. 14).
• 5- Autonomous vehicles 66 can go to the markets by themselves to receive online orders from an autonomous supermarket/restaurant, wherein the goods, parcels, luggage, packages or items are filled and handled, for example but not exclusively, by any of the following methods:
• the autonomous trolley 113 moves to face a specific part of an autonomous shelf 114 where a specific ordered item is located according to the (IAT) devices 21; then the autonomous trolley 113 adjusts itself up or down to receive the item, wherein the item is pushed to drop inside the trolley conventionally, for example as in vending machines, or via a piston-rod-plate set or any similar mechanism which pushes the selected item to drop inside the height-adjustable trolley.
• the autonomous trolley 113 itself is provided with robotic arms (manipulators) 107 to pick up specific items, so as to carry out autonomous shopping on behalf of remote persons making online orders. Once the trolley is full, it can tie/close/seal a major bag/carton inside it housing the whole order.
• the autonomous trolley 113 then moves toward the back door of an autonomous vehicle 66 waiting to receive the order. Via (IAT) device 21 communication in-between the autonomous vehicle 66 and the autonomous trolley 113, the vehicle opens the back door; then receiving the order can be carried out in multiple manners: after the trolley adjusts its height, either its front and rear vertical sides open and a motorized-drive manipulator 107 or hydraulic-robotic piston-rod-plate set faces the trolley 113 from the back, wherein the piston pushes the rod, which pushes the plate, which pushes the sealed order inside the autonomous vehicle 66, or a motorized robotic manipulator pushes the order inside the vehicle (Fig.
• the autonomous vehicle 66 itself is provided with its own autonomous trolley 113, which it drops off at a specific location near a shopping center, providing access for it over a tilted smooth stair (tilting gate); the autonomous trolley 113 can do the shopping as explained, then come back and get in via the tilted smooth stair (tilting gate) (Fig.
• the delivery of the order is done inside a shopping center parking area via a method similar to the capsule delivery in Fig. 14, wherein a motorized driven manipulator picks up, from in-between normal shelves 102, an order which was manually or autonomously prepared, and hands it to a UAV 22, which flies out and hands the order to an autonomous 66, semi-autonomous 85 or non-autonomous vehicle 77, for example via a specified opening in its roof (Fig. 18).
  • Autonomous city airports: these are expanded to be a business hub for the whole city, controlling the transportation of humans and goods, wherein extra terminals are opened with fulfillment centers receiving dividable containers from ports, industrial cities... either by air via aero-carriers 74 (carrying containers 44) or conventionally by land via autonomous trucks 111, such that the multi-level fulfillment centers divide the containers 44 into smaller dividable ones and distribute their items via autonomously driven belts through multiple elevators, levels and belts, to be handed to the aero-carriers 74 carrying the parcels 73. This process is not carried out via scanning labels or bar codes, but by arranging the items in a theoretically numbered sequence, assigning a number to each ordered item according to its location and tracking its location via the movement sequence it follows; for example, a parcel located in the 4th row, third column inside a container will already be assigned the number 43, so that a robot 26 unloading the container items, once it picks up this item, will know that this item, which occupies these spatial coordinates in the container 44, is item 43, referring to
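The label-free numbering scheme above can be sketched briefly. The following is an illustrative sketch only (function and variable names are not taken from the specification): each container slot's row/column position yields the item's number, so an unloading robot identifies items purely from the spatial coordinates it picks from.

```python
# Illustrative sketch: item identity derived from container position,
# not from a scanned label. Works for containers up to 9 columns wide.

def slot_number(row: int, column: int) -> int:
    """Item in the 4th row, 3rd column -> number 43."""
    return row * 10 + column

def unload_order(container_slots):
    """Yield (slot_number, item) pairs in the pick-up sequence,
    tracking identity by position instead of scanning a label."""
    for (row, column), item in sorted(container_slots.items()):
        yield slot_number(row, column), item

container = {(4, 3): "parcel-A", (1, 2): "parcel-B"}
picked = list(unload_order(container))
```

In this sketch the robot's pick-up sequence itself carries the identity information, matching the specification's idea of tracking items via the movement sequence they follow.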
  • the autonomous airports in the third stage will hand parcels 73 to UAVs 22 to manage city retail commerce, in addition to restaurants sending meals via UAVs 22 from one terminal inside the city airport, under one authority running the whole facility's departments and machines via artificial telepathy communications, rather than vast numbers of restaurants, shops, malls and couriers scattered randomly around a city with unorganized aerial delivery routes; meanwhile, the city autonomous airport authority (server 61 and CATS 112) will run the operations and routes of thousands of AALVs 20, UAVs 22 and aerial delivery rooms 68 in the air over a city, via communicating with their (IAT) devices 21, and will provide recovery service.
  • Robots 26 can approach their destinations easily, depending on artificial telepathy, wherein they can recognize each tiny part of anything in front of them equipped with an (IAT) device 21; for example, a robot sitting in front of a computer, depending on the geometrical positioning of its fingers in relation to the geometrical positioning of each key on the keyboard, can move and locate its two hands and fingers over the keyboard and start typing; in the same way it can serve, diagnose, service, repair and maintain either dynamic or static objects which have well-known spatial coordinates (geometrical topography) provided by their (IAT) devices 21, such that a robot 26 can carry out the job of a surgeon, a technician or even a pilot.
  • e- City facilities: during this stage, everything scanned, pictured and located by all kinds of vehicles and UAVs during stages 1 and 2, in addition to their movements through every point inside the city; these data, together with any uncovered ones, will be used to establish a city central artificial telepathy station (CATS) 112, or city mind, locating each physical dynamic or static part in the city, such that UAV 22 and robot 26 services can be extended to be comprehensive, covering every corner of the city without the danger of hitting anything.
  • communications between production lines, wherein the machines can communicate data to each other, such that: a- if a machine faces a problem, the others in the production line will not continue working blindly in a way that lets the fault approach or even damage them; b- the machines will know the current situation of a product and its specific location before it approaches them, or before it is received from the preceding machines in a faulty state.
  • f- Humans and livestock: even though the prior art shows examples of humans implanted with RFID chips (transponders) 53, in the third stage all humans have to be equipped with highly secured (IAT) devices 21, to be added to the recognized dynamic physical bodies which become observable and recognized by other machines and structures.
  • g- Police and traffic fines: policemen will disappear from the roads, and traffic fines will disappear too; the role of the police will be watching the security of these (IAT) 21 and (Sub-IAT) 67 devices, so as to be acknowledged about any failure of them, or any physical attack, hacking or cyber-attack.
  • the (TTCs) 115 can serve as a passport, identity card, ticket, gate pass, multiple-device pass, or credit card, updated with data and used to identify and charge their bearer just by being identified and communicated with via a telepathic device in any facility computer, gate, robot, cashier machine... etc.
  • the hostess in aircraft, trains... can easily verify on any display that the right number of passengers, and the correct ones, are seated in their own specific seats.
  • a person filling an autonomous trolley 113 or basket with items can be informed by the trolley of the value; he can just press accept, then he is charged directly and the value is deducted from his credit without being counted by a cashier.
  • the (TTC) 115 can be loaded with many applications related to health, job, security gate pass...
  • the artificial telepathy which is applied on land and in the air will be applied on and in seas and rivers, and even between remote space station tools, devices and machines, for carrying out perfect tasks and operations without unsafe, risky, inaccurate human interference.
  • i- Human to machine telepathy: the developments are to be extended during this stage to include many areas of human-to-machine telepathy, such that: 1- the currently and future-developed tiny electronic chips, which are to be swallowed so as to move inside a human body and perform the job of a tiny lab, will transfer their data to the (TTCs) 115, which too may be connected to the human nervous system, watching his pains and heartbeat, to be passed to his doctor, or even to the nearest hospital's (IAT) devices 21 in emergencies (during 2010, scientists in Canada managed to connect neuro-chips to many nervous cells in the brain).
  • the human (TTCs) 115 can read thoughts directly or indirectly using other tiny devices; (TTCs) 115 may receive verbal voice or telepathic orders from humans and pass these orders to an autonomous vehicle 77 to come to address:... at time:..., or order a robot 26 to do a task by receiving verbal voice or telepathic thoughts and orders via its (IAT) device 21, or even order some viewed items online via specific gestures, and select the method of delivery via specific gestures too.
  • j- Human to human telepathy: even though it looks so complicated, (TTCs) 115 wired or wirelessly connected to the nervous system can assist humans provided with synchronized, mating or identified telepathic chips, with a downloaded application on their smart mobiles, watches, lenses, virtual reality lenses... etc., with an option to make a telepathic phone call, wherein via such a call humans can share their viewed visual scenery, heard acoustic sounds, thoughts, discussions, emotions, and even share relatively adjusted pains with doctors, or perfect pain with artificially robotic doctors, to know, assess, and diagnose a disease, or even to train the patients.
  • a human's (TTC) 115 can receive data from a Nano-lab chip circulating inside his body about the ratios and concentrations of microbes, germs, bacteria cells, cancer cells, biochemical constituents... and recommend that he adjust his food, suggest a food list or diet, or report these remotely to a doctor or to an artificially intelligent robotic doctor...
  • the sceneries and sounds can be shared with the visual nerves in the brain of a blind person walking near a normal one, to navigate his road clearly.
  • dead humans' locations, or living humans' heartbeat spatial coordinates, can be recognized wherever they are trapped under buildings or structures collapsed due to earthquakes, volcanoes and floods, or even trapped inside fallen aircraft.
  • humans can pick up photos and scenery video, and record sounds and voices, without cameras or sound recorders, by passing these through from their eyes' visual nerves, ears... to the memory of their (TTCs) 115.
  • the voice of a preacher speaking to crowds is passed without mikes or speakers 86.
  • selected channel news at a specific timing is received via (TTCs) 115 without watching or listening to any device.
  • a coach can instruct his team or any player on the field via his and their (TTCs) 115.
  • a football game video assistant referee (VAR) can artificially locate a player who got into an offside position, by installing such (TTCs) 115 in the players' shoes and informing the main referee autonomously and instantly.
  • (TTCs) 115 can be installed inside a football to decide instantly if the ball crossed the goal line or other field lines.
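The chip-based goal-line decision above reduces to a simple geometric test. The following is a minimal, hedged sketch (coordinate frame, names and values are assumptions for illustration, not from the specification): a chip inside the ball reports its position, and the ball counts as "in" only when the whole ball has crossed the line.

```python
# Illustrative goal-line check from a position-reporting chip inside
# the ball. Assumed frame: goal line at x = 0, goal interior at x < 0.

GOAL_LINE_X = 0.0
BALL_RADIUS = 0.11  # metres, roughly a size-5 football

def ball_crossed_line(ball_x: float, radius: float = BALL_RADIUS) -> bool:
    """True only when the far edge of the ball is past the line,
    i.e. the whole of the ball has crossed it."""
    return ball_x + radius < GOAL_LINE_X
```

A ball centred just past the line but still touching its plane would correctly return False under this test.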
  • a reporter drone provided with a TV screen, mike and speakers 86 can navigate through the crowds to approach a specific corner depending on (IAT) devices 21.
  • a teacher can pass his telepathic lectures to his receiving students anywhere, wherein virtual reality boards or glasses can be used as the background for eye- and sense-projected views.
  • a person can know if his flat's or villa's (IAT) device 21 discovered an undefined person with an unidentified (TTC) 115 crossing its identified shape boundaries or dimensions, such as a thief attempting a theft.
  • a facility's outdoor built-in drone can be launched to track anybody if any mass shooting, attack or theft attempt is observed or heard by the building's (IAT) device 21, or when an unidentified person passes its barriers or boundaries.
  • personal weapons, pistols, guns... can be provided with (TTCs) 115 that may inform the police instantly if shooting occurs.
  • a person anywhere getting injured, having a heart attack or being killed can be approached according to messages received by the nearest medical center about his heartbeat... or other health safety parameters; a message can be passed to the police if the artificially analyzed parameters are related to an attack. Streets, homes or other facilities' lights can be operated according to the availability of (IAT) devices 21 or (TTCs) 115, according to their types and quantities in specific locations, without the need for sensors sensing them.
  • telepathically inspecting chips (nano-labs) (TIC) 116 can analyze and gather data from undersea or inside an oil well and pass it via nearby ones up to the deck on the top.
  • the third stage of the comprehensive autonomous city, or the artificially telepathic city, will not end by being limited to: removing non-autonomous vehicles 77, posts, columns, traffic lights, signboards and reflectors; removing many lights, or switching them off while roads are not in use, as they are no longer required for drivers to view the roads; fully solving high vehicle traffic on the roads and random human crowds in the markets and malls, long queues, and time wasted on driving or rounding between shelves, shops, authorities, farms... etc.; or using fewer cameras, less use of smart screen devices, no pollution, no ID cards, passports, bank cards, keys, or even wallets. It will also bring comprehensive, revolutionary, high-end HSE, training, education, security, energy saving, organized commerce, an organized low aero-space, aerial and land traffic infrastructure, and a better, fast understanding governing human, robot and machine communications.
  • Obstacles can be raised against human-machine-human artificial telepathy, but the current art has already provided the industry with successful trials passing neural signals via artificial devices from one part of the human body (brain or spinal cord) to the legs. Another major obstacle is that crossing thoughts in human minds, transferred to the other party, may disturb the whole communication. But nowadays there are millions of different wireless telecommunications and radio and TV stations... loaded over electromagnetic waves without interference, as these data are loaded on different wavelengths and frequencies, and filters are used in devices to filter out interference; the same principle can be applied to human crossing thoughts after further investigation, wherein only specific subject thoughts are communicated between human telepathic devices.
  • a modified flat, office, villa or warehouse window 75 with aerial reception capabilities, equipped with an (IAT) device 21, will follow up the route of the shipment or parcel 73 carried by the UAV 22 and can update the buyer...; once the UAV 22 arrives, the aerial delivery window 75 will already recognize the visitor (delivery drone 22 or local pick-up drone 72) and will automatically open to receive the shipment 73 without communication via scanning tools or laser beams; the artificial telepathy recognition is enough. According to its type, as demonstrated in Fig. 19 (A-C), the window 75 will open in one of the following manners:
  • a- Linear (sliding) reception mechanism 117: the window 75 glass is opened upwards, and a receiving box 118 with an open top slides on rails or is pushed conventionally outwards to receive the shipment 73; then its weight or load proportioning sensor 119 measures the weight or senses the load of the shipment 73, then the box 118 is pulled conventionally inward and the window 75 is closed (Fig. 19-A).
  • b- Swinging reception mechanism 120: the window 75 glass is opened upwards, and a reception basket 121 or box 118 with an open outer side is swung conventionally outwards to receive the shipment 73; then its weight or load proportioning sensor 119 measures the weight or senses the load of the shipment 73, then the basket 121 is swung conventionally inward and the window 75 is closed (Fig. 19-B).
  • c- Rotary reception mechanism 122: the window 75 glass is opened upwards, and a receiving box 118 with an open top is rotated conventionally from the window 75 sides outwards to receive the shipment 73; then its weight or load proportioning sensor 119 measures the weight or senses the load of the shipment 73, then the box 118 is rotated conventionally inward and the window 75 is closed.
  • the motorized mechanism of the window 75 can be harnessed too to pull the box or basket out via ropes, strings or belts in any suitable conventional arrangement.
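All three mechanisms above share the same reception sequence: open the glass, extend the box, confirm receipt via the weight/load sensor 119, retract and close. A minimal sketch of that sequence follows; the sensor reading is simulated and all names and tolerance values are assumptions for illustration.

```python
# Illustrative reception sequence for the aerial delivery window 75.
# The box is only retracted once the load sensor confirms a weight
# close enough to the expected shipment weight.

def receive_shipment(expected_kg: float, measured_kg: float,
                     tolerance: float = 0.05) -> list:
    """Return the ordered list of actions the window performs."""
    actions = ["open_glass", "extend_box"]
    if abs(measured_kg - expected_kg) <= tolerance * expected_kg:
        actions += ["retract_box", "close_glass"]
    else:
        actions += ["hold_and_alert"]  # weight mismatch: keep waiting
    return actions
```

The same skeleton applies whether the box slides, swings or rotates; only the extend/retract actuation differs.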
  • an aerial delivery transparent, smoothly curved duct 121 is installed vertically along a building side, facing a fixed glass of the building facade, to hand parcels 73 to flats on different floors, such that an opening is made in each flat's glass, open to the duct. The duct 121 receives parcels via a top room over the building and moves them down on a principle similar to an elevator; when a parcel over a horizontal plate moves down to face a glass opening, a vertical plate (which can be transparent glass too) is located behind the parcel on the transparent duct side, such that, via a retractable motorized string or rope, pulling the string inwards over a pulley on the front and back edges of the horizontal plate carrying the parcel 73 pulls the vertical plate forward, sliding on rails or through grooves from the bottom; the forward pull pushes the parcel forward inside a flat's receiving box 118, while a motor (located under the plate and connected from both sides to the belt/rope/string) will retard the
  • an (IAT) 21 device in a traffic light 89 at crossed roads receives the data from the (IATs) inside the vehicles and from the humans' (TTCs) 115; through algorithms and matrices, each traffic light 89 can evaluate the timing and number of vehicles or humans who will be coming from each way or side, and as a result it will manage a precisely organized flow of traffic in a neat and systematic manner.
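The per-approach evaluation above can be sketched as a simple proportional allocation. This is an illustrative sketch only (data shapes and names are assumptions, not from the specification): each (IAT)/(TTC) report names an approach direction, and the light shares out its cycle's green time in proportion to how much traffic each approach will bring.

```python
# Illustrative green-time allocation for a traffic light 89 from
# incoming (IAT)/(TTC) reports: list of (approach, eta_seconds).

def green_times(reports, cycle_s: float = 60.0) -> dict:
    """Return seconds of green per approach, proportional to the
    number of incoming vehicles/humans reported for it."""
    counts = {}
    for approach, _eta in reports:
        counts[approach] = counts.get(approach, 0) + 1
    total = sum(counts.values())
    if total == 0:
        return {}
    return {a: cycle_s * n / total for a, n in counts.items()}
```

A real controller would also weight by the arrival-time estimates; the count-proportional version is the minimal form of the idea.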
  • an autonomous city does not need traffic lights 89, as (IATs) 21 and (TTCs) 115 will cooperate and organize among themselves where to give priority to traffic flow, and where to stop to let walking people pass, via modified lighting or pointer systems.
  • (IAT) devices 21 are installed to prevent accidents to extreme ratios, and if any accident happens, to decrease its drawbacks to extreme ratios; whereas human drivers understand the intentions of other drivers depending on the lights (brake lights, turn signal lights), on acceleration or deceleration, by trying to read or guess their intentions, or even by assessing the capabilities of the other cars based on their types, the (IAT) devices 21 are far more intelligent than watching or observing to predict or guess, as each one can know in advance the following: a- The other vehicle's selected speed mode: low, medium or high, wherein those with a similar driving speed mode follow the same lanes.
  • an assessment of the numbers and ratios of each group of vehicles with similar driving speed modes is done to carry out a unique distribution of the vehicles over the road lanes, and even to adjust the different speed modes autonomously to converge when the number of lanes is not enough to distribute the vehicles with different speed modes over them.
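The distribution step above can be sketched as follows; this is a hedged illustration only (the grouping and merging policy is an assumption, not defined in the specification): lanes are shared out in proportion to speed-group sizes, and when there are fewer lanes than speed modes, the fastest modes are merged so their speeds converge.

```python
# Illustrative lane distribution by speed mode, with merging when
# lanes are scarce. counts: {mode: n_vehicles}, modes ordered slow->fast.

def assign_lanes(counts, lanes):
    """Return {mode: n_lanes}; merges the fastest modes into one
    shared lane group when there are fewer lanes than modes."""
    modes = list(counts)
    if lanes >= len(modes):
        total = sum(counts.values())
        # proportional share, at least one lane per mode
        return {m: max(1, round(lanes * n / total)) for m, n in counts.items()}
    # not enough lanes: merge the fastest modes together ("converge")
    merged = modes[: lanes - 1] + ["+".join(modes[lanes - 1:])]
    return {m: 1 for m in merged}
```

The merged key names the converged group, mirroring the specification's note that speed modes are adjusted to converge when lanes run out.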
  • the data for the next 24 hours' planned trips and planned handling of the autonomous vehicles 66 via a server 61 will be collected from the following resources: 1- It will be highly recommended for the private autonomous vehicle 66 riders to set in advance their editable driving plans for the next 24 hours. 2- To set in all of their vehicles their home location, job location, working hours and timings. 3- The public autonomous vehicles 66 too should be provided with such data from their companies, schools, institutions... operators. 4- All road service authorities should feed their next 24 hours' planned trips into their vehicles, in addition to any specific locations for road maintenance. 5- The traffic departments receive data about the existing physical road condition from the autonomous vehicles 66 which used the road in the past hours, in addition to data provided by the meteorological department about the expected weather conditions: rain, dust, ice, fog, snow, floods... etc.
  • the official regional artificial intelligence learning systems at the service providers carry out an intelligent assessment of the expected traffic levels and timings, and make pre-set programs for distributing the autonomous vehicles over the city roads for the next 24 hours in a fully optimized manner, with a factor of safety and considering the other non-planned trips and road users, where instant updating of the program is carried out while updating all autonomous vehicles 66, in addition to messaging their passengers about the timings at which they should be available inside their autonomous vehicles 66 to arrive at their target destinations smoothly and on time.
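The planning step above first needs the collected trips folded into a demand forecast. The following is a minimal sketch under assumed data shapes (the tuple format, function names and the per-hour granularity are illustrative, not from the specification): planned trips from the five listed sources are aggregated into a per-road, per-hour demand profile that the distribution program can optimize against.

```python
# Illustrative aggregation of next-24-hour planned trips on the
# server 61: planned_trips is an iterable of (road, departure_hour).

def hourly_demand(planned_trips):
    """Return {road: {hour: trip_count}} for the next 24 hours."""
    profile = {}
    for road, hour in planned_trips:
        road_hours = profile.setdefault(road, {})
        road_hours[hour % 24] = road_hours.get(hour % 24, 0) + 1
    return profile

def peak_hour(profile, road):
    """Hour at which the given road is expected to be busiest."""
    hours = profile.get(road, {})
    return max(hours, key=hours.get) if hours else None
```

The optimizer would then spread departures around each road's peak hour, messaging passengers their recommended timings.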
  • D- Human interference should be carried out instantly when, in a control center, it is seen via the vehicles' panoramic computer vision cameras 95 that the instant assessment of the state requires instant handling; as a result, a controller may initiate a specific system level to be activated by one click to handle the traffic autonomously. Such systems can: 1- activate a- drive carefully mode, b- slow down, c- fully stop driving, or d- change the road; 2- send a call message for all teams and autonomous aero-land vehicles 20, ambulance 109, police car and recovery service, in addition to UAVs 22 and a police drone for assisting in and assessing the accident, a firefighter drone 70, and a flying robotic road cleaner, to carry out any urgent tasks.
  • NVH (Noise-Vibration-Harshness) system: triple springs + triple shock absorbers.
  • the suspension comfort style should be developed further in at least the following aspects:
  • the current conventional spring-damper (spring-shock absorber) assembly is a single unit designed to handle different types of road textures: humps, potholes, rough surfaces and cracked surfaces of all types, sizes and dimensions.
  • Stage-1: road texture variations of 0-5.0mm are to be filtered, tuned and absorbed as a slight roughness in the texture.
  • Stage-2: road texture variations of 5.0-25.0mm are to be filtered, tuned and absorbed as a normal roughness in the texture.
  • Stage-3: road texture variations over 25.0mm, in addition to other humps, are to be absorbed as a high roughness in the texture... etc.
  • three spring-damper units should be used.
  • these are demonstrated in Figure 20 separately, while they can be reproduced in a compact single unit (multiple-stage spring-damper unit 124) with a varying number of coils per unit of distance and varying stiffness along its length, while the damper is provided with three more inner pistons instead of one, each with different damping; such a modification can provide unique handling of different road textures via one unit but with different damping levels.
  • the three spring-shock absorbers will operate as follows:
  • a- Stage-1 spring-damper 125: a small-diameter, small-width spring with extra coils and less stiffness, coiled around a small-diameter shock absorber; it is the only one starting to handle the 0.0-5.0mm rough texture.
  • Figure (20-A) demonstrates how this part handles a small texture roughness in a road alone.
  • Figure (20-B) is closer to an actual demonstration, wherein it shows how this part of a compact unified spring-damper unit takes its role to filter/absorb a small texture roughness in a road.
  • the overlapping multiple-mode frequency represents a part of a road with multiple modes of roughness at once, handled at once via the multiple-mode spring-damper system.
  • b- Stage-2 spring-damper 126: a medium-diameter, medium-width spring with a medium number of coils and medium stiffness, coiled around a medium-diameter shock absorber; it starts its operation by handling the 5.0-20.0mm rough texture.
  • Figure (20-A) demonstrates how this part handles a medium texture roughness in a road alone.
  • Figure (20-B) is closer to an actual demonstration, wherein it shows how this part of a compact unified spring-damper unit takes its role to filter/absorb a medium texture roughness in a road.
  • c- Stage-3 spring-damper 127: a conventional normal-diameter spring with the normal number of coils and stiffness, coiled around a normal-diameter conventional shock absorber; it starts its operation by handling the 20.0mm-and-above rough texture, humps, potholes...
  • Figure (20-A) demonstrates how this part handles a high texture roughness in a road alone.
  • Figure (20-B) is closer to an actual demonstration, wherein it shows how this part of a compact unified spring-damper unit takes its role to filter/absorb a large texture roughness in a road.
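The staging logic for units 125-127 can be summarized in a few lines. This is an illustrative sketch only; the thresholds follow the staging given for units 125-127 (0-5 mm, 5-20 mm, 20 mm and above), and the convention that softer stages remain engaged as rougher ones join in is an assumption drawn from the multiple-mode description above.

```python
# Illustrative engagement logic for the three-stage spring-damper:
# which stages take part depends on the road-texture amplitude.

def engaged_stages(amplitude_mm: float) -> list:
    """Return the reference numerals of the engaged stages."""
    stages = [125]                 # Stage-1 always handles fine texture
    if amplitude_mm > 5.0:
        stages.append(126)         # Stage-2 joins for medium roughness
    if amplitude_mm > 20.0:
        stages.append(127)         # Stage-3 joins for humps/potholes
    return stages
```

For a road section with multiple roughness modes at once, the union of the stages engaged for each mode reproduces the "multiple-mode" behaviour of Fig. 20-B.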
  • Semi-pneumatic tires 128 (refer to prior art): conventional run-flat tires will not be a suitable choice for autonomous vehicles, because once they get punctured, their sidewall reinforcements will not support any comfortable driving, nor support high-speed or long-distance driving; a better solution is to use semi-pneumatic tires 128. These tires can provide better load and heat distribution while driving, which minimizes vibrations and elongates tire life, and they lose only a very little amount of their pressure and can still be driven safely and comfortably until the tire is changed or repaired.
  • Noise, vibration and smoke: as the riders will be busy with other activities inside the autonomous vehicles 66, including sleeping, watching movies, working on machines... etc., they may not hear the noises resulting from a faulty part in the vehicle, especially suspension parts and other motors; mechanical parts, unlike electrical parts, cannot be diagnosed via reading the fault memories stored in the memory of their related control units, as mechanical parts diagnosis is based mostly on noise/vibration symptoms and visual checks.
  • a net of acoustic sound collectors 129 (sound-noise level meters) distributed all around the vehicle is installed with filtering tools; these aim to pick up and filter the abnormal sounds or noises related to faulty parts, rather than road noise, audio, or other optional machines used inside the vehicle compartments. These sounds and noises are compared with pre-programmed sounds and noises, such that an abnormal-sound microprocessor estimates and makes a diagnosis and assessment of the seriousness of the fault, upon which the passengers are warned and the speed and lane are adjusted; the (IAT) device 21 should share the diagnosis results with the (IAT) devices 21 of nearby autonomous cars, to make them aware that a serious fault may happen in such a car; meanwhile, the vehicle's (IAT) device 21 can contact an (IAT) device 21 in the closest autonomous recovery truck 108 to agree where to meet, while the same vehicle's (IAT) device 21 arranges with another rented autonomous vehicle 66, via its vehicle's (IAT) device 21 too, to come ahead of the autonomous recovery truck 108 to shift the
  • faulty electronic devices or wirings which are burnt and creating smoke can be diagnosed via smoke detectors 131.
  • faulty mechanical and electronic parts data are passed to the (IAT) device 21 to handle serious emergency cases in the same way, wherein in the case of smoke or burnt devices, the telepathy device sends a message to a telepathy device in a firefighter drone station 132 to send one or more firefighter drones 70 to handle the case if things go bad or out of control.
  • the net of acoustic sound collectors 129 and vibration meters can be used too for recording road noise to know the road type, texture and conditions, to adjust the vehicle's speed and to share these data with a data server 61 over a network, which shares them with other vehicles.
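The comparison against pre-programmed sounds described above amounts to grading the deviation of measured noise levels from stored baselines. The following is a minimal sketch under stated assumptions: the decibel thresholds, function name and the per-collector pairing are illustrative, not from the specification.

```python
# Illustrative abnormal-sound grading for the collectors 129:
# measured readings are compared to pre-programmed baseline levels
# (same collector order) and the worst deviation picks the response.

def fault_severity(measured_db, baseline_db):
    """Return 'ok', 'warn passengers', or 'call recovery' based on
    the largest per-collector excess over the stored baseline."""
    worst = max(m - b for m, b in zip(measured_db, baseline_db))
    if worst < 5.0:
        return "ok"
    if worst < 15.0:
        return "warn passengers"
    return "call recovery"
```

A production version would compare spectra rather than levels, but the grading-and-escalation structure (warn, then contact the recovery truck 108 via the (IAT) device 21) is the same.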
  • the autonomous vehicles 66 will diagnose the road conditions from all aspects using a set of visual or sensing devices.
  • the wheels' turning can be controlled via two side hydraulic piston-cylinder units 133, so no tie rods are required; in such a case it is easier to install wheel-turn units on the rear wheels too, such that four-wheel turning (steering) (4WT or 4WS) 134 electric commercial and passenger autonomous vehicles 66 are manufactured, which can be driven autonomously either forward or backward without noticing any difference, which is very helpful when accessing or exiting narrow locations.
  • a Sedric Volkswagen type or electric minibus is offered with inner facilities and equipment styles, creating a stylish comfort like: office style, restaurant or coffee shop style, trip style, aerial and land city cruise style, entertainment movie style (cinema style), comfort with bedroom (foldable sofa), diabetic comfort with toilet, business center style, sporty style with exercise facilities, tourist style, prayer style (church, mosque, synagogue... or any other temple-like selected option, or even yoga style), ambulance facilitated with robotic surgeons, delivery truck equipped with sorted autonomous shelving and delivery drones, or having a Jacuzzi, truck workshop style equipped with 3D printers or any other tools..., kitchen fast-food style, mini-showroom style, artist style...
  • an athlete's car is unlike an engineer's car,
  • a doctor's car is unlike an IT professional's car,
  • a businessman's car is unlike a student's car; each one is customized from the factory with built-in or optional facilities meeting the desires, hobbies, professions, or specific situations and selections.
  • seat belts may not be mandatorily worn during normal driving, but alarms to remain seated, or to return to seats and fasten seat belts, will be issued when necessary.
  • the airbag system will be customized for each autonomous vehicle 66 depending on its area of use.
  • the dialog management electronic control unit (DM-ECU) 135 will be updated with all human dialogs in each specific language, in addition to hugely yearly-edited texts referring to specific practical dialogs which should be communicated between the autonomous vehicle 66 and the owner. The following is an example of a dialog between an autonomous vehicle 66 and its owner, reflecting not only the dialog management capabilities of an autonomous vehicle 66, but also the technical features and applications of the new autonomous vehicles 66 provided with (IAT) devices 21, supporting: daily management of the owner's requests, the vehicle's self-management of its own needs, its communications with other machines, robots 26, servers 61, UAVs 22, interchangeable aero-land configurations... etc., expressing a feeling, having a sense of humor, remembering and reminding, updating and recommending, handling and receiving, picking up and dropping, waiting and suggesting, fixing appointments and waiting, carrying out indoor (inside-vehicle) automated tasks for a busy owner, and intelligent self-task management (ISTM) 136, wherein each part is run via a microprocessor inside the (DM-ECU)
  • My-Chevy-1: Sir, after dropping you at your office, I went to the washing station. I was a little bit delayed; the waste washing liquid tester 138 showed extra mud was sticking to my bottom side, since you insisted yesterday on driving off-road through a muddy road.
  • My-Chevy-1: No sir! They said that, as goodwill, they will either send it to your home via aerial delivery this evening at 5:30, or else their delivery UAV 22 can hand it to me in the parking. So now I am going to the restaurant to pick up the order from their window 75 delivery manipulators 140, then to the school to pick up the kids:..., then back to you.
  • My-Chevy-1: Today I will receive and download a program from the neighborhood server 61 related to being acknowledged to leave a place under fire for a specified recommended safe place.
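The (DM-ECU) 135 behaviour in the dialog above can be sketched as keyword-based routing of owner utterances to task microprocessors. This is a hedged illustration only; the keyword table and handler names are entirely hypothetical and not part of the specification.

```python
# Illustrative routing for the dialog management ECU: each owner
# utterance is matched against stored practical-dialog keywords and
# dispatched to the responsible task handler.

ROUTES = {
    "pick up": "pickup_task",
    "deliver": "delivery_task",
    "remind": "reminder_task",
    "wash": "self_maintenance_task",
}

def route_utterance(utterance: str) -> str:
    """Return the task handler for the first matching keyword,
    falling back to general small-talk dialog management."""
    text = utterance.lower()
    for keyword, handler in ROUTES.items():
        if keyword in text:
            return handler
    return "smalltalk"
```

In the specification's terms, each returned handler corresponds to a microprocessor inside the (DM-ECU) 135 running one part of the intelligent self-task management (ISTM) 136.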
  • Figure 23 is a flow chart summarizing the collaborations of all devices, control units and autonomous machines.
  • a UAV 141 with a compact shape, like a closed book cover, composed of a foldable screen 142, e.g. of two folded parts, provided on the outer side with a foldable screen, while the inner side is provided with foldable-arm rotor blades 143, a mike 144, a hook for carrying small and mini-parcels or hanging orders from nearby restaurants, a camera...; so it is a book-shaped UAV with an outer smart screen and inner accessories (Fig. 24).
  • Aero-land vehicles are based on vehicles currently available in the market, with modifications and installations of ducted propellers, hydraulic and motorized mechanisms for moving the side propellers, gates, control systems... etc., made from available tools, parts and mechanisms, with applicable modifications.
  • 2- The autonomous vehicles', UAVs' and AALVs' intelligent artificial telepathy (IAT) device is made by collecting positioning devices and measurement meters of motion and direction into a compact united device, uploading the geometrical topography shape and dimension data which are available for each machine before its manufacturing.
  • 4- Computer vision panoramic cameras are made by increasing the number of vision lenses on each currently available computer vision camera, and simply using these from different positions for collective data construction of a realistic 3D shape with actual dimensions and a position for each part of the vision.
  AALV Autonomous aero-land vehicle
  50- Distance sensor
  • IAT Intelligent artificial telepathy
  UAVs Unmanned aerial vehicles. 52- GPS unit.
  • TILS Telepathic intelligent learning system
  • TTC Tiny telepathic chip
  • TIC Telepathically inspecting chips
  NVHS Noise-Vibration-Harshness system
  • D-ECU Dialog management electronic control unit
  EP 3 253 084 A1, 16 Dec. 2017, Quan et al.

Abstract

An autonomous city transportation means based on interchangeable autonomous aero-land vehicles (AALV) 20, UAVs 22, and autonomous vehicles 66, interactive and collaborative among themselves and with other machines, robots, buildings, structures, and humans via intelligent artificial telepathy (IAT) devices 21. The autonomous aero-land vehicles (AALV) 20 are of compact and easily interchangeable shapes, with storable side propellers and stowable top propellers. The intelligent artificial telepathy (IAT) devices 21 are based on vastly modified RFID (Radio-Frequency Identification) chips 53, adding to the uploaded data parameters related to GPS spatial location and motion parameters a geometry parameter related to the gridded actual shape, geometry, topography, texture, tilt angles and dimensions of each constituent of that shape, to assist in perfectly locating each occupied part of the space for each physical dynamic or static body, so that bodies can move, transport, communicate, predict, engage, synchronize, fit, or cooperate with each other in a perfect manner with the least number of devices, depending mainly on the (IAT).

Description

AUTONOMOUS CITY TRANSPORTATION MEANS WITH ARTIFICIAL
TELEPATHY
Description of the Invention
Technical Field of Invention
This invention relates to an autonomous city with its transportation means, such as unmanned aerial and land vehicles, capable of telecommunicating via artificial telepathy with each other or with other machines, robots, structures...
Background Art
Currently, autonomous cars have still not achieved full self-driving without the interference or monitoring of a human driver; this target level may be achieved between 2025 and 2030. Meanwhile, unmanned aerial vehicles are still not put to wide use beyond taking photos or recording videos: they are not carrying passengers, not carrying small containers of 100 kg to 1 ton, and not carrying out civil services for a city.
Neither type of these machines (autonomous vehicles and unmanned aerial vehicles) is unified into an interchangeable one, and neither has raised the telecommunication between them to the level of intelligent artificial telepathy (IAT) communication, which may be a new concept in the art. (IAT) communication should not be limited to communication between them only, but should extend efficiently to all other structures, buildings, robots, traffic lights, aerial delivery facilities, machines, etc. with which both types of machines should communicate and carry out physical tasks.
Even Int. Patent application WO2017178899 A2, disclosing a Multiple Task Aerocarrier attempting to provide a unified shape for a flying car, still has a ducted-propeller carrier at the bottom that is too thick and high to get inside easily. The same applies to the DARPA TX program, whose VTOL, called the Aerial Reconfigurable Embedded System (ARES), and the following models of remotely controlled aircraft are not compact at all.
The following patent publications or patents: US2018/0183873, WO2015068501, US9,947,145 B2, US20120083960, US20100256835, US20140136414, and WO2017079229 mainly tried to use a huge number of devices and sensors to provide autonomous collaboration between limitedly autonomous vehicles sharing data about their spatial envelope dimensions, motion parameters, mapping, and navigation, but without communication with the other non-autonomous vehicles, machines, robots... inside the city.
It is one aim of this invention to provide multiple embodiments of unified versions of autonomous vehicles, interchangeable into unmanned aerial vehicles, in a compact model called the Autonomous Aero-Land Vehicle, which can be fitted (installed) on the normal smooth vehicle shapes available in the art.
It is another aim of this invention to provide each of these autonomous aero-land vehicles, UAVs and the interfering structures, flying robots, machines, humans... with a single device or chip to pass and share data between them, rather than using many bulky devices in each one to observe, navigate, and locate the others. Here all data is provided via a straightforward solution, wherein each vehicle creates its fusion algorithm data, including shape, geometry, topography, texture and spatial dimensions with dynamic location parameters, to produce active and easy communication between them, supporting self-recognition and synchronization between their parts with regard to dynamic/static timed intentions and programmed plans, to provide an optimized artificial interaction and collaboration between them as if they were parts of one body in the city.
Via (IAT), these separate machines are no longer blind toward each other: they communicate their shapes and geometry to identify, recognize, see, watch and feel, and provide telepathic specifications to hear, listen, understand, analyse, navigate, predict, update, learn, teach, message, command, monitor, warn, avoid, remember, remind, behave, respond, argue, agree, arrange, deal, connect, meet, manage, decide, handle, cooperate, guess intentions, and interact without sensors, lidars, radars, sonars, laser imagers or even humans.
This may make the City Autonomous Airport disclosed in App. WO2018122821 A2 realistic, and let the whole future city's structures, autonomous flying and land vehicles, aerial delivery machines, tools... operate and cooperate via communication channels in a three-stage bridging process, wherein each stage may need 5-7 years to be accomplished, which expands the internet of things widely via the use of ambient intelligence telepathy.
Disclosure of Invention
Brief Description
An autonomous city transportation means based on UAVs, autonomous vehicles and an interchangeable autonomous aero-land vehicle (AALV), interactive and collaborative among themselves and with other machines, robots, buildings, structures, and humans via intelligent artificial telepathy (IAT) devices.
The autonomous aero-land vehicles (AALV) are of compact and easily interchangeable shapes, made by building and installing over a smooth-shaped vehicle's roof a smooth aerodynamic body containing two side ducted propellers or small jet propulsion engines, while at the middle an engine or electric motor is installed to drive a stowable top propeller located over the smooth body. Meanwhile, side propellers are built in vertically or horizontally inside the vehicle, pushed out vertically or horizontally when put in use, and rotated conventionally to take a horizontal or vertical configuration while fully extended out of the vehicle body. In another embodiment the (AALV) is a modified new shape of the Volkswagen Sedric, wherein the horizontal side ducted propellers are built in inside the bottom part of the vehicle and conventionally pushed out sidewards.
These machines, as well as all machines, facilities, and humans cooperating with them, are provided with intelligent artificial telepathy (IAT) devices, which are based on vastly modified RFID (Radio-Frequency Identification) chips. In addition to the uploaded data parameters related to GPS spatial location coordinates, altitude, latitude, longitude, direction, and a roughly estimated occupied space envelope, and in addition to the motion parameters (speed, acceleration, deceleration) measured using many devices, an additional parameter is added related to the gridded actual shape, geometry, topography, texture, tilt angles and dimensions of each constituent of that shape, to assist in perfectly locating each occupied space and each part of it, and to engage, synchronize, fit with it, or cooperate in a perfect manner with the least number of devices, which are consolidated into one, in addition to providing it with other telepathic applications and technical features to simulate actual telepathy.
Brief Description of the Drawings:
• FIG. 1 (A- F): Illustrates multiple 3-D views for a first embodiment of the interchangeable Autonomous Aero-Land Vehicle (AALV).
• FIG. 2 (A-E): Illustrates multiple 3-D views for a second embodiment of the interchangeable Autonomous Aero-Land Vehicle (AALV) with multiple configurations for ducted propellers or armed foldable blades.
  • FIG. 3 (A-F): Illustrates multiple 3-D views for an (AALV) modified smooth body carrying a baseball-shape body, capsule, or container for civil service use.
• FIG. 4: Illustrates multiple 3-D views for (AALV) modified smooth bodies in two parallel aero-convoy configuration.
• FIG. 5: Illustrates multiple 3-D views for the second embodiment of the interchangeable Autonomous Aero-Land Vehicle (AALV) in aero-convoy configuration.
• FIG. 6 (A-H): Illustrates multiple 3-D views for (AALV) modified smooth body picking up an aero-land vehicle with its propellers in.
  • FIG. 7: Illustrates dynamic bodies' movement and engagement in relation to static natural and urban bodies, by observing the track via pre-updated data about geometrical topography positioning and occupied space borders.
• Fig.8: Illustrates (PMGW) microprocessor and the input data parameters.
  • FIG. 9: Illustrates a flow chart for the (PMGW) and artificial intelligence working principle.
  • FIG. 10 (A-C): Illustrates 3D views for the aerial facility room.
• Fig. 11 : Illustrates a full city view for positioning and artificial telepathy collaboration lines in-between transportation means, structures, machines, natural obstacles... with a neighborhood server and (CATS).
  • Fig. 12: Illustrates a list for a method of three stages for making an autonomous city transportation means with artificial telepathy.
  • Figure 13: Illustrates a rough distribution map of the firefighting tools and vehicles spread over part of a forest.
• Figure 14: Illustrates an aerial delivery of a capsule to a vehicle under the control of (IAT) devices collaboration.
  • Figure 15: Illustrates a manipulator picking up an item and filling an autonomous trolley under the control of (IAT) devices collaboration.
  • Figure 16: Illustrates a motorized robotic manipulator pushing a package out of an autonomous trolley into an autonomous vehicle under the control of (IAT) devices collaboration.
  • Figure 17: Illustrates an autonomous trolley getting inside an autonomous vehicle via the tilted smooth stair (tilted gate).
• Figure 18: Illustrates a manipulator picking up a parcel from in-between shelves and handling it to a UAV.
• Figure 19 (A-C): Illustrates 3D views for window aerial delivery mechanisms.
• Figure 20 (A-B): Illustrates a multiple-stage spring-damper unit handling rough textures.
• Figure 21 : Illustrates NVH data collectors, detectors and meters distributed in a vehicle.
  • Figure 22: Illustrates autonomous vehicle customized inner equipment.
• Figure 23: Illustrates a flow chart briefing all devices, control units, microprocessors, data manipulators, servers, and autonomous machines collaborations.
  • Figure 24: Illustrates a multiple-task foldable flexible screen UAV.
Detailed description for carrying out the Invention:
Best Mode for Carrying out the Invention:
Various embodiments and aspects of the invention are described with reference to the details discussed below and the accompanying drawings illustrating the various embodiments. The following description and drawings are illustrative of the invention and are not to be construed as limiting it. Numerous specific details are described to provide a thorough understanding of various embodiments of the present invention. However, in certain instances, well-known or conventional details are not described in order to provide a concise discussion of embodiments of the present invention.
In order to make it easy to carry out the invention, a detailed description of the parts of the invention, supported with figures, is provided here. The main parts are arranged sequentially according to the importance of the part, and the text is made easy to read by referring to each feature with a number included both in the parts description text and in the parts numbering list. The numbering of part features starts sequentially from number 20; wherever a part feature appears in the text, it is directly assigned its serial number. As an example, in FIG. 1 the parts' features are arranged sequentially from number 20, 21, 22...
Multiple embodiments of autonomous aero-land vehicles (AALV) 20 are provided to carry out multiple civil services, including aerial carrying and delivery machines, wherein these machines are provided with intelligent artificial telepathy (IAT) devices 21 to cooperate with unmanned aerial vehicles (UAVs) 22 and similar machines of both kinds in the space, and on or in land structures 23, buildings 24, machines 25, robots 26, and humans 27, making identity recognition, route tracking, load handling, and engaging to carry out a task easily optimized and clearly approachable, without using plenty of navigating devices, without probabilities of hitting obstacles (hills, trees, posts, protruding parts...) 28, and without failing to achieve the assigned tasks. The autonomous aero-land vehicles (AALV) 20 are of compact and easily interchangeable shapes, aero-vehicle into land vehicle and vice versa. In a first embodiment, a normal smooth vehicle 29 available in the market, e.g. a Toyota Previa 29, is to be modified indoors before sale (Fig. 1) to be a vertical take-off and landing vehicle (VTOL), by building and installing over its roof 30 a smooth aerodynamic body 31 containing two side ducted propellers 32 or small jet propulsion engines 33, with their ducting 34 extending from the front to the rear exhaust nozzles 35, while at the middle an engine or electric motor 36 is installed to drive a stowable propeller 37 located over the smooth body 31. Meanwhile, side propellers 38 are built in vertically or horizontally inside the vehicle 29, pushed out vertically or horizontally when put in use, and rotated conventionally to take a horizontal or vertical configuration while fully extended out of the vehicle body 29.
For pushing out or retracting the side propellers, traditional mechanisms can be used, such as a hydraulic system, or motorized rack-and-pinion gearing with motorized rotatable rods, shafts and gears. Side gates 39 are fully parallel with the vehicle body in a fully sealed manner, and opened up to give the side propellers 38 access to be pushed out; these side gates 39 can either be returned to a closed position while the side propellers 38 are out, or other inner hidden gates (not shown) close the propellers' exit passage. These gates 39 too are driven via conventional motorized or hydraulic mechanisms available in the art.
The tail 40 is made of two parts; each part in the top side contains a small propeller 41 to counter the reverse rotation of the main top propeller 37. In another embodiment the (AALV) 20 is a modified shape of a Volkswagen Sedric 42 or any similar electric mini-bus, wherein three configurations are provided: a- the horizontal side ducted propellers are built in inside the bottom part of the vehicle and conventionally pushed out sidewards when put in use, Fig. 2 (A-C); or b- side rotor blades are built underneath the vehicle and pushed out via telescopic arms, Fig. 2-D; or c- expanded bigger right- and left-side armed rotor blades are installed in a foldable mode into grooves (not shown) in the roof sides when not in use, while the front and rear rotor blades are stored in the bottom. In this combined configuration, the bottom rotor blades' telescopic mechanism pushes them out, and the right/left side rotor blade arms (or wing-shaped ones) are unfolded downward when put in use, Fig. 2-E. The rotor blades too can be rotated from a horizontal to a vertical configuration. It is obvious here that many configurations can be drawn from such embodiments and configurations.
As shown in Figure 3, these (AALV) 20 smooth aerodynamic body 31 flying mechanisms can be used and modified by adding two foldable wings and two vertical side ducted propellers (Fig. 3 A-B), each ending with one or two ducted branches for carrying baseball-shape containers, capsules, or normal containers 44 (Fig. 3 C-F), or tanks, or taking small drone-like shapes to carry parcels, wherein the containers 44, tanks, capsules, etc. can be used in sequential firefighting, fueling other machines, irrigation... (Fig. 3). These unmanned aero-carriers can be collected in parallel series of convoys carrying out effective firefighting, wherein their vessels are either filled by dipping them inside a nearby lake, or the filled ones are carried from a nearby station or provided over trucks; they then carry out firefighting or cooling of the ash carried by the wind, return back, drop the empty vessels for refilling, and carry filled ones... and so on in a continuous cycle (Fig. 4). AALVs 20 can be attached or connected conventionally to fly in aerial convoys 45, as can UAVs (Fig. 5); such a flight configuration provides safe, compact, well organized, semi-train-carriage flight with less air traffic, wherein if one engine or propeller becomes idle, all the other AALV 20 units will support a safe flight for the defective unit until the next destination or a safe emergency landing, or until a modified smooth flying body 46 makes an aerial (recovery) pick-up of the idle unit from a convoy (FIG. 6) with its propellers in, other parts not being shown for clarity. It is obvious that the AALV 20 second embodiment can be equipped with a parachute over its roof for emergencies.
So, aero-convoy configurations are not limited to passenger aero-land vehicles: aero-land vehicle convoys loaded with parcels, luggage or fluids can be created too, while aero-carriers (with top aerodynamic bodies) 31 loaded with parcels, luggage or fluids (firefighting powder, water, fuel) can be arranged in aero-convoys, wherein aero-carrier firefighting convoys can carry firefighting containers, capsules, or baseball-like vessels to carry out sequential firefighting of fire frontiers, return back to unload the empty used ones, and pick up newly filled vessels to go back and extinguish the fire, forming an aero-convoy firefighting series and cycle.
As a result, and over three stages, these (AALVs) 20 and (UAVs) 22, aircraft, and the other types of vehicles available in the art which are currently in use, in addition to all city facilities that take a physical static or dynamic shape, are to be provided during a three-stage timed incubation with a single device substituting the multiple navigation devices 47, radars 48, lidars 49, distance sensors 50, cameras 51, etc. which are currently used to make such machines recognize each other. In this invention, each of these machines or physical bodies will have preset programmed data about its shape, size, dimensions and dynamic GPS 52 location inside a city 3D map, informing the neighborhood machines about its status without the others needing devices to observe or recognize it.
Telepathy device principle of operation
As illustrated in (Fig. 7), different bodies are distributed throughout a specific cubic part of a city and the nearby mountains' space, each part referring to an identified physical body. The geometries used simulate these physical bodies and are numbered in sequence for reference: body 1 is assumed to be located and moving at road level, body 2 is a mountain range, body 3 is a building compound with body 4 located over it, wherein body 4 is a parcel, while body 5 is flying in the space in between them all.
At this instant, to recognize the location of each body, the conventional location data parameters can be obtained depending on the prior art and available technologies using the following devices: a global positioning system (GPS) unit 52, an RFID (Radio Frequency Identification) unit 53, a remote sensing unit 54, and altitude meters 55 to know their heights; their conventional motion data parameters can also be measured via speedometers 56 and accelerometers 57. Their tilt angles are measured via tilt meters 58, and their orientations and angular velocities are measured via gyroscopes 59. So, the defined instant motion, position (location), tilt and orientation parameters can be measured via available devices, and these devices are to be compacted into one device in this invention.
But if body 5 is assigned an autonomous task to engage with and pick up body 4, then to get inside body 1, which is moving to pass the narrow road between the building body 3 and the mountain body 2, then the mentioned compacted meters will not be enough to achieve this task successfully, even depending on the compact device data: controlled speed, acceleration, position, height, angles, directions and tilts. Why?
Here appears the next step in building the compact device, as it is not enough to assign an envelope geometry around a body, as in the prior art, to achieve this need and task. Again, why?
Because using cameras, lidars, radars, laser imaging, distance sensors, or TV monitoring consumes a great deal of weight, money and human time to achieve a task, when it may be needed inside a city maybe millions of times during one day for autonomous machines' daily operations. So, for body 5 to engage with body 4 without using the devices listed in the prior paragraph, and to pick up body 4 from the right place, it should know the following:
1- The full geometrical shape, topography, and dimensions of all natural, urban, industrial and weather obstacles between body 5 and body 4, and body 1 too, and the locations/coordinates of these geometrical shapes' boundaries, such that it will not follow an aerial route that crosses any pre-known point on a shape's boundary coordinates, because that would mean hitting a tree, mountain, building, etc.
2- The full geometrical shape of body 4 with its full dimensions to scale, including all of its protrusions, textures, and topography. The same applies to body 5 itself; that means body 5 too should know all the details related to its own geometric shape and topography.
As a result, each body should have an identified shape with full details about its dimensions to scale, plus all of its protrusions, texture, and topography. Current devices in the art do not provide such data in advance, so these data too should be identified and titled geometrical parameters.
This means that in addition to the position and motion parameters, a new data parameter is to be added identifying the geometrical shape, in addition to another important parameter related to weather conditions (wind speed, air temperature, fog, rain, humidity, dust...) that may affect the motion parameter. These weather conditions are not to be provided generally by TV news or a meteorology department; they are aviation meteorological data which should be provided separately as an instant private parameter, provided and shared by each body sharing in the process. As a result, these four parameters, Position, Motion, Geometry, and Weather, are to be titled the (PMGW) data (Fig. 8). The part of these data related to positioning the geometry and topography borders occupying a particular part of the space is the most important part of this invention; their data microprocessor unit can be titled the (GTPS) 60: Geometry Topography Positioning System. A huge range of technical applications can be harvested from this unit, as will appear throughout this invention. For example: if the physical bodies locate through this unit the actual physical space they occupy, other machines will not cross their imaginary position geometries and topography. Then, in an extended application, the regions over which flights are prohibited can be specified by initiating an imaginary expanded geometrical topography extending up and sideways around their actual physical geometrical topography borders (boundaries), wherein any flying body will be warned autonomously to move away from these imaginary borders (boundaries), and if there is no response, control will be taken over that body immediately.
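The four parameter groups above can be pictured as one shareable record. Below is a minimal Python sketch; the class names, field names and units are our assumptions for illustration, not part of the disclosure:

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical sketch of the four (PMGW) parameter groups.
# All names and units below are illustrative assumptions.

@dataclass
class PositionData:
    latitude: float                       # degrees
    longitude: float                      # degrees
    altitude_m: float                     # metres above sea level
    heading_deg: float                    # direction of travel
    tilt_deg: Tuple[float, float, float]  # tilt angles of the body

@dataclass
class MotionData:
    speed_mps: float
    acceleration_mps2: float              # negative = deceleration/braking

@dataclass
class GeometryData:
    # The gridded shape: indices of the tiny spatial cubes (voxels)
    # the body occupies, expressed in its own local frame.
    voxel_size_m: float
    occupied_voxels: List[Tuple[int, int, int]]

@dataclass
class WeatherData:
    wind_speed_mps: float
    air_temp_c: float
    visibility_m: float                   # fog/rain/dust reduce this

@dataclass
class PMGW:
    """One shareable (PMGW) record: Position, Motion, Geometry, Weather."""
    body_id: str
    position: PositionData
    motion: MotionData
    geometry: GeometryData
    weather: WeatherData
```

A body would broadcast such a record, and receivers would reconstruct its occupied space from the geometry fields without any observation devices of their own.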
These (PMGW) data are necessary to know where to engage with the other body without hitting any protruding part of it, or hitting any body around it. But how can a machine blindly know the 3D dimensions, topography, texture, shape and geometry of another one, without a variety of measurement tools, lidar or radar devices, cameras, sensors, remote sensing units, human remote control, and other such bulky, expensive devices?
The innovative solution behind the telepathy device 21 working principle is based on providing the compact (PMGW) device 60 with preset downloaded data related to its geometry parameters, wherein its whole geometry, including all of its parts, is gridded into tiny spatial cubes, so that each big, small, very small or tiny 3D part of the geometry also has identified geometrical data to help in recognizing where each small part is located inside this 3D geometry. In other words, each physical body or prohibited flying zone will by itself provide its occupied space data, saying: Hello, I am here! Occupying these particular space boundaries!
Instead of other bodies being provided with a huge number of devices to search for, navigate, and observe it, each body will locate even tiny parts of itself in the occupied space.
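One way to picture the gridded occupied-space idea is as a shared set of voxel indices that a route must not cross. The sketch below is a simplified assumption (a uniform world-frame grid and straight-line sampling), not the patented method itself:

```python
# Sketch (assumed details) of the "occupied space boundaries" idea: each
# body grids its shape into tiny cubes (voxels) and shares which world-
# frame voxels it occupies; another body tests a candidate straight route
# segment against the union of those voxels before flying it.

VOXEL = 0.5  # assumed grid resolution in metres

def to_voxel(p):
    """Map a world point (x, y, z) in metres to its voxel index."""
    return tuple(int(c // VOXEL) for c in p)

def route_is_clear(start, end, occupied, steps=200):
    """Sample the straight segment start->end; reject it if any sample
    falls inside a voxel some body has declared as occupied."""
    for i in range(steps + 1):
        t = i / steps
        p = tuple(s + t * (e - s) for s, e in zip(start, end))
        if to_voxel(p) in occupied:
            return False
    return True

# Example: a "building" declares a block of occupied voxels
# (voxel indices 10..13 correspond to 5..7 m on each horizontal axis).
occupied = {(x, y, z) for x in range(10, 14)
                      for y in range(10, 14)
                      for z in range(0, 20)}

print(route_is_clear((0, 0, 5), (3, 3, 5), occupied))  # True: misses block
print(route_is_clear((0, 0, 5), (6, 6, 5), occupied))  # False: ends inside
```

A real system would also inflate the occupied set by the safety margin ("imaginary expanded topography") before testing routes.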
As a result, the (PMGW) device 60 will not be randomly installed inside a physical body; it will be installed such that its focal point is synchronized with the focal point of the mother body, in addition to mating its position direction, tilting, etc. with those of the mother body. After installing the (PMGW) device 60 in a specific preselected and calculated safe position inside the physical body, the digital geometrical shape, including its inner constituents, dimensions, texture, topography and even coloring, provides (PMGW) device 60 data that is 100.0% mating and synchronized with the real body's (PMGW) data. That is to say, the (PMGW) device 60 provides to other similar devices its mother body's digital (PMGW) data, simulating perfectly its mother body's real (PMGW) data. So, for a building, the (PMGW) device 60 can provide position data, altitude, and tilt angle data, not only for the general shape of the building, but even for a specific window: its altitude, direction, tilt angles, shape, dimensions, frame dimensions, coloring; and the same applies to its lifts, lights, protrusions, etc., all of which are already available via the 3D architectural designs and their amendments for the same building.
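The focal-point synchronization described above amounts to a fixed frame calibration: points of the stored digital shape are transformed from the device frame to the body frame, then to the world frame using the body's live position and heading. A minimal sketch, with all numbers and function names assumed:

```python
import math

# Sketch (assumed math) of "synchronizing the device focal point with the
# mother body": the digital shape is stored relative to the device's own
# origin; knowing the device's fixed mounting offset inside the body,
# every stored point can be expressed in the body frame, then in the
# world frame using the body's live (PMGW) position and heading.

def rotate_z(p, yaw_rad):
    """Rotate a point about the vertical axis by the body's heading."""
    x, y, z = p
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return (c * x - s * y, s * x + c * y, z)

def device_point_to_world(p_device, mount_offset, body_pos, body_yaw_rad):
    # 1. device frame -> body frame (fixed mounting calibration)
    p_body = tuple(a + b for a, b in zip(p_device, mount_offset))
    # 2. body frame -> world frame (live position and heading)
    p_rot = rotate_z(p_body, body_yaw_rad)
    return tuple(a + b for a, b in zip(p_rot, body_pos))

# A corner of the digital model, 2 m ahead of the device origin, on a
# body at (100, 50, 0) heading 90 degrees:
corner = device_point_to_world((2.0, 0.0, 0.0),
                               mount_offset=(0.5, 0.0, 1.2),
                               body_pos=(100.0, 50.0, 0.0),
                               body_yaw_rad=math.pi / 2)
print(corner)
```

Tilt angles would add two further rotations, but the principle is the same: one calibrated transform makes the digital shape track the real body exactly.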
For a vehicle, the (PMGW) device 60 can provide position data, motion data, altitude and tilt angles, in addition to its shape, dimensions, coloring, and topography, etc., and similar details about each part of it. So, when a (PMGW) device 60 is installed inside a UAV 22, and its data is shared and sent while flying via a data emitter to a remote data receiver, the data receiver, without using radars scanning the space, will know not only the global position, altitude, speed, acceleration and direction of the UAV 22, but also its actual shape, with the actual dimensions, topography, texture and location of each part of it, big, small, or tiny.
As a result, the flying body 5 which is assigned to pick up body 4 will inform the server 61 about its task. The server will create an imaginary spatial cube in the space which encloses all cooperating bodies, and will share their (PMGW) data between them, especially the moving bodies inside the imaginary cube, such that each body shares its (PMGW) device 60 data with body 4 and vice versa. Then, according to the distance between body 5 and body 4, and considering the obstacles, which are known not via lidar, radar, cameras, distance sensors, or TV monitoring, but according to the data provided by the (PMGW) devices 60 of the bodies located in between or near the route from body 5 to body 4, the (PMGW) device 60 of the flying body 5 will receive data over the server 61 from the mountain body 2 (PMGW) device 60 and the building body 3 (PMGW) device 60. Accordingly, the flying body 5 should create a flight route toward body 4 without hitting the mountain body 2 (including its trees) or the building body 3, which occupy space inside the imaginary cube between it and body 4; it should also variably adjust its speed, acceleration, altitude, coordinates... until approaching body 4. It will not use cameras to assist a remote human controller in making trials to engage the clamps (hooks) 62 of body 5 with the handles 63 protruding over body 4, nor use a laser emitter or distance sensors to adjust itself so that its clamps engage the handles 63 of body 4; instead, its clamps' (Sub-PMGW) data are manipulated and compared with the handles' 63 (Sub-PMGW) data, collected from both devices (refer to Fig.
7 circled parts), while the propellers 64 are instructed via artificial intelligence calculations, comparing the (X, Y, Z) and, if needed, timed distances between both parts (clamps (hooks) 62 and handles 63), to adjust body 5's real physical body so as to occupy specifically the space (spatial coordinates) that allows its clamps 62 (Sub-PMGW) to be synchronized with the theoretical spatial coordinates, occupying a space that lets them fit to the static handles 63 of body 4, according to a preset program identifying pick-up tools and procedures. Once the target spatial position is occupied by body 5, its clamps 62 close, which results blindly in engaging with body 4 through its handles 63. The flying body 5 does not need to use specific devices to inspect whether the engagement was done successfully; all it should do is follow its route towards body 4 according to its body's (PMGW) device 60 data, considering not to go in a straight line, as it will not be allowed to cross the imaginary geometrical shape boundaries of the building body 3, which is recognized by it without observation tools, but according to the shared (PMGW) data of body 3.
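The blind clamp-to-handle engagement can be sketched as an iterative comparison of the two (Sub-PMGW) coordinate sets, closing the remaining offset until the clamps occupy the target space. The proportional step below is an assumed control scheme for illustration only, not the disclosed algorithm:

```python
# Hedged sketch of the blind engagement step: body 5 compares its clamps'
# predicted world coordinates (from its own Sub-PMGW data) with the static
# handles' coordinates shared by body 4, and keeps moving by a fraction of
# the remaining offset until the two coincide within a tolerance, at which
# point the clamps close. Names and the control scheme are assumptions.

def offset(a, b):
    """Remaining (dx, dy, dz) from clamp position a to handle position b."""
    return tuple(bb - aa for aa, bb in zip(a, b))

def engage(clamp_pos, handle_pos, step=0.5, tol=0.01, max_iters=100):
    """Move toward the handle by a fraction of the remaining offset each
    cycle; return the final position and whether engagement succeeded."""
    pos = clamp_pos
    for _ in range(max_iters):
        d = offset(pos, handle_pos)
        if max(abs(c) for c in d) < tol:
            return pos, True        # target space occupied: close clamps
        pos = tuple(p + step * c for p, c in zip(pos, d))
    return pos, False

final, ok = engage((12.0, 7.0, 30.0), (12.4, 7.3, 29.5))
print(ok)  # True: within 1 cm on every axis, so the clamps may close
```

No camera or distance sensor appears anywhere in the loop; the only inputs are the two shared coordinate sets, which is the point of the (Sub-PMGW) comparison.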
The next route, between body 5 and body 1, is manipulated in the same way. Once the flying body 5 approaches body 1, it will follow the preset order to get inside it via a back window 65. Again, it will not use observation tools to search for the window 65 of body 1, as body 1 will provide it with its window 65 (Sub-PMGW) data; and again, the flying body 5 will not move in a straight line towards the window 65, as it would hit body 1 according to body 1's (PMGW) data, which identifies the imaginary spatial body shape, dimensions and coordinates occupied by body 1, which no machine is allowed to cross randomly. So the flying body 5 will artificially calculate and manipulate its shortest safe route toward the window 65; meanwhile body 1, which is updated about body 5's (PMGW) data and recognizes that body 5 is now near, will instruct the window 65 to open to give access to body 5, and so body 5 will land inside with body 4 on the specified landing location, according to the landing location (Sub-PMGW) data provided by body 1 in reference to its main body geometry (PMGW) data. Then body 1, updated via body 5's (PMGW) data that body 5 is currently inside it, closes the window 65 and moves to pass between the building body 3 and the mountain body 2; it will move autonomously according to the (PMGW) data received from both the building and the mountains, navigating its route between them without crossing their occupied imaginary geometrical shape boundaries. As the (PMGW) data device is used here to share data used to know, approach, handle, communicate, cooperate, engage... in addition to the further technical additions, modifications and applications which will be demonstrated throughout this invention, it will be titled the intelligent artificial telepathy (IAT) device 21 instead of the (PMGW) device 60. (Fig.
9) illustrates a flow chart briefing the (IAT) device 21 including the (PMGW) microprocessor 60 data use for route and engagement collaboration management.
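The route logic just described, a flying body that must not cross another body's shared imaginary geometrical boundary known to it only through (PMGW) data, reduces to a geometric test. The following is an illustrative sketch only (the patent text does not specify an algorithm): it assumes each body publishes its occupied space as an axis-aligned bounding box and checks, via the standard slab method, whether a straight flight segment would cross that box.

```python
def segment_intersects_aabb(p0, p1, box_min, box_max):
    """Slab test: does the straight segment p0 -> p1 pass through the
    axis-aligned box [box_min, box_max]?  Points are (x, y, z) tuples."""
    tmin, tmax = 0.0, 1.0              # parameter range along the segment
    for axis in range(3):
        d = p1[axis] - p0[axis]
        if abs(d) < 1e-12:             # segment parallel to this slab
            if not (box_min[axis] <= p0[axis] <= box_max[axis]):
                return False
        else:
            t1 = (box_min[axis] - p0[axis]) / d
            t2 = (box_max[axis] - p0[axis]) / d
            tmin = max(tmin, min(t1, t2))
            tmax = min(tmax, max(t1, t2))
            if tmin > tmax:            # slab intervals no longer overlap: no hit
                return False
    return True

# Hypothetical building body 3 boundary shared as (PMGW) data; a direct path
# would cross it, so the flying body must plan a detour instead of flying straight.
building = ((0, 0, 0), (30, 20, 90))
direct_path_blocked = segment_intersects_aabb((-10, 10, 40), (40, 10, 40), *building)
```

When the test reports an intersection, the flying body would select intermediate waypoints outside the shared boundary, which is the "shortest safe route" manipulation the text describes.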
Please note that the weather parameters are not necessarily measured inside each (IAT) device, as their data are collected via external sensors or meters.
Intelligent artificial telepathy (IAT) device 21 data
Before discussing the development stages through which the (IAT) device 21 should go, and its expanded applications up to Human-Machine-Human telepathy with further technical features, it is necessary to demonstrate the further data, including the (PMGW) data, which should also be uploaded when building up the software of the device.
As an example, in a modern city, a conventional older vehicle or a new autonomous vehicle 66 will, via its (IAT) device 21, send by itself information and data to a cloud server over a network, to be shared with the current and expected neighborhood vehicles, covering navigation plans up to the next 24 hours. All the needed data for each autonomous vehicle, and most of it for non-autonomous vehicles, will include but is not limited to the following:
1- Vehicle's type, model, year and color, registration no...
2- Vehicle's current shape including accessories, protrusions.
3- Vehicle's occupied space dimensions according to its shape, including but not limited to height, length, width, or even weight.
4- Vehicle's GPS location and coordinates, precise direction, and the tilt angles of all sides.
5- Precise height of each wheel relative to sea level (above or below), in addition to wheel angles (directions), camber/caster/toe-in/toe-out and tire pressures.
6- The current dynamic parameters of the vehicle, speed, acceleration, deceleration, braking, parking, turning (steering angle sensor), changing path, passing vehicles, passing humps, and driving uphill/downhill.
7- The current short-term driving intentions for the next 30 seconds, according to the preset navigation map (from... to) and the interactive cooperation with the surroundings.
8- The current medium-term driving intentions for the next 5 minutes, according to the preset navigation map (from... to) and the interactive cooperation with the surroundings.
9- The current long-term driving intentions for the next hour, according to the preset navigation map (from... to) and the interactive cooperation with the surroundings.
10- The next-day driving intentions for the coming 24 hours, according to the preset navigation map (from... to) and the interactive cooperation with the surroundings.
11- The fuel tank level and expected fueling timing and location.
12- The noise, vibration and harshness (NVH) status of the vehicle suspension, body and other parts.
13- The dynamic stability and active stability control data related to the driving style and road dynamic conditions (including vehicles alongside), and road static conditions (road texture and surface conditions such as: wet, icy, muddy, sandy, rough, cracked... passable, passable with care, risk of slipping...).
14- The vehicle's customized driving style: slow, medium, or fast (comfort, sporty, or emergency).
15- The vehicle's dynamic capabilities: engine power, braking performance, sporty level, maneuvering performance...
16- The vehicle's repeated historical behavior data and analysis of its probabilities, such as attending the work location, visiting restaurants, visiting the cinema, or returning home, relative to the road map and timings.
17- The current location of the vehicle in relation to traffic lights and the traffic lights' status.
18- The vehicle's loads and weights: empty and full.
19- The presence of children, elderly persons, or animals inside the vehicle.
20- The category or status of the vehicle: ambulance, firefighter, police car, school bus, municipality maintenance, road construction, caravan, bank, fuel, presidential parade, wedding procession, official procession, condolence procession, car race, public (passengers/luggage), or private, salon, hatchback, sporty, 4WD, pickup, truck...
21- If a car is pulling another body behind it (a car, boat, bike... etc.), then its status, type, shape, geometry, spatial coordinates and dimensions should be provided.
22- The vehicle's driving performance and maintenance status, e.g.: low tire pressure, faulty devices, faulty sensors.
23- Any shared views of the surrounding area (traffic status, road condition, buildings, posts, opened or closed shops...) to be demonstrated on a display for the passengers inside another vehicle, upon their request about a selected location where the current vehicle is present.
24- Any emergencies currently happening in the close and nearby surroundings, such as road construction, repair, maintenance, cleaning, an accident and its assessment, closed lane or lanes, or trees and boxes falling on the road.
25- Any police drone messages received about the status of the road ahead (refer to prior art).
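The 25 data items above describe, in effect, a record that each (IAT) device 21 periodically publishes to the cloud server. The following is a sketch only, with hypothetical field names covering a subset of the listed items; the patent does not define a serialization format.

```python
import json
from dataclasses import dataclass, asdict, field

@dataclass
class IATVehicleRecord:
    # Items 1-4: identity, shape and pose (field names are illustrative)
    vehicle_type: str
    model_year: int
    registration_no: str
    dimensions_m: tuple          # (length, width, height)
    gps: tuple                   # (latitude, longitude, altitude)
    tilt_deg: tuple              # (roll, pitch, yaw)
    # Items 6-10: current dynamics and navigation intentions per horizon
    speed_kmh: float
    intentions: dict = field(default_factory=dict)

    def to_json(self) -> str:
        """Serialize the record for sharing over the network."""
        return json.dumps(asdict(self))

rec = IATVehicleRecord(
    "salon", 2018, "ABC-123",
    (4.6, 1.8, 1.5), (25.2, 55.3, 12.0), (0.0, 0.0, 90.0),
    60.0, {"30s": "keep lane", "5min": "exit 12", "1h": "park at office"})
payload = rec.to_json()
```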
While for an Autonomous Aero-Land Vehicle (AALV) 20 and UAVs 22, the following data should be sent and shared in addition to all of the above applicable data:
1- The imaginary aerial route number from point of departure to point of arrival, GPS location 3D map.
2- The GPS location 3D map for the imaginary aerial U-turns, aerial bridges, or air humps.
3- The route climate conditions: wind speed, rain sensor, snow, air temperature, humidity, thunderstorms, dust or fog levels, pollution levels, and accidentally falling idle aerial machines.
City urban areas topography:
City urban areas include, but are not limited to, all physical matter existing over the barren surface of the Earth's soil, rock and water. As a result, the following, as a whole and in part, should be scanned and mapped via 3D mapping + time and computer-vision cameras to locate them for the flying and land vehicles, such that the flying and driven machines (AALVs 20, UAVs 22...) do not need many tools and devices to recognize and observe these urban facilities and infrastructure: the city facilities tell and educate, while the (AALVs) 20 and (UAVs) 22 know and react. As a result, the city facilities, divided into geographical zones of tens or hundreds of meters, will be provided with a single built-in (IAT) device 21 and a set of distributed (Sub-IAT) devices 67 for minor static facilities, generally including but not limited to the following: 1- All types of infrastructure urban sites: 3D mapping and site plans.
2- Buildings, towers, warehouses, stores and factories: shapes above and underground, geometry, topography, dimensions, including accessories, occupied space, GPS location and coordinates, precise direction, levels, walls, ceilings, service rooms, tank and window dimensions, and the tilt angles of all sides, in addition to protrusions, posts, road signs, columns, and the swing volumes of windows and doors; that is to say, all (PMGW) data.
3- Constructions, structures, cranes and building materials: shapes, dimensions and all data related to their static and dynamic (PMGW).
4- The precise and accurate (PMGW) data for the autonomous aerial delivery machines 22: shapes, dimensions, including accessories, occupied space, GPS location and coordinates, precise direction and height. (Refer to prior cited art.)
5- The precise and accurate (PMGW) data for the autonomous aerial delivery elevator: shapes, dimensions, including accessories, occupied space, GPS location and coordinates, precise direction and height. (Refer to prior cited art.)
6- The precise and accurate (PMGW) data for the autonomous aerial delivery windows 75 of flats, offices, villas...: shapes, dimensions, including accessories, occupied space, GPS location and coordinates, precise direction and height. The address details of these windows 75 are to be provided when an online order is made, or when the address is mentioned as an in-home P.O. Box to receive aerial deliveries, wherein the window 75 (Sub-IAT) device 67 will provide its full location details to the aerial delivery UAV 22: street no. (IAT) device 21 location and data, block no. (IAT) device 21 location and data, building no. (IAT) device 21 location and data, level no. (PMGW) location and data, flat or office (PMGW) location and data, and window 75 no. and location, plus height and coordinates (Sub-IAT) device 67 location and data. (Note: (PMGW) data for any location can be received from either the main (IAT) device 21 or a sub device; that is to say, a (Sub-IAT) device provides high accuracy for specific parts of a main object.)
7- Roads, streets and bridges: 3D navigation maps, street boundaries, UAV 22 docking stations, platforms, posts, columns, cables, piping, trees, greenery... etc.
8- Road humps: full (PMGW) data. Actually, in the second and third stages, the artificial road humps should be removed, as the autonomous cars 66 should already know where to slow down near crowded places, potholes, pedestrian crossings...
9- Stadiums, gardens, parks and car parks: (IAT) device 21 data including 3D navigation maps, boundaries, platforms, posts, columns, cables, piping, trees, sunshades... etc.
10- River, port and airport machines, structures, constructions, jet skis, boats, ships, aircraft... etc.: (IAT) device 21 data including spatial coordinates and full shape and dimension data.
11- Aerial facility rooms 68: full (IAT) device 21 data including GPS location data and spatial coordinates. These smooth-shaped rooms are normally to be installed over high-rise buildings and crowded low-level buildings, wherein each room has three gates 69; each gate opens to a firefighter extinguisher drone set 70 located on shelves, or aerobatic facade cleaners 71, or a local pick-up drone set 72 which unloads parcels 73 from bigger aero-carriers 74 and delivers them to the target windows 75 (Fig. 10 A-C).
12- Traffic lights: (IAT) device 21 data including shapes, dimensions, and all data related to their GPS locations.
13- Municipality and public works authorities update all of the above public (IAT) 21 and (Sub-IAT) device 67 data via a cloud server over a network when major, minor, tiny or micro changes are made to the autonomous city's macro or micro parts.
14- Private entities and residents update all of the above private (IAT) 21 or (Sub-IAT) devices 67 via a cloud server over a network, through the authorities' revision, when major, minor, tiny or micro changes are made to their physical belongings or aerial delivery/reception macro or micro parts.
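The hierarchical address resolution described in item 6 above, from the street (IAT) down through block, building, level and flat to the window (Sub-IAT), amounts to a walk down a nested registry. The registry layout and key names below are illustrative assumptions, not defined by the patent.

```python
# Hypothetical nested registry: street -> block -> building -> level -> flat -> window.
# Each leaf holds the (Sub-IAT) device 67 data the delivery UAV 22 needs.
registry = {
    "street_14": {"block_3": {"building_7": {"level_5": {"flat_52": {
        "window_75": {"gps": (25.20, 55.27), "height_m": 17.4}}}}}}
}

def resolve_window(registry, *path):
    """Walk the main-IAT / Sub-IAT hierarchy down to the window coordinates;
    raises KeyError if any level of the address is unknown."""
    node = registry
    for key in path:
        node = node[key]
    return node

target = resolve_window(registry, "street_14", "block_3", "building_7",
                        "level_5", "flat_52", "window_75")
```

In this model, each intermediate key corresponds to one (IAT) device 21 providing coarse location data, and the leaf corresponds to the (Sub-IAT) device 67 providing the high-accuracy window position.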
In a final stage, human beings and even their livestock need to be provided with implanted artificial telepathy microchips (ATC: Artificial Telepathy Chips) 76: integrated transponder circuits or modified conventional RFID micro devices with the same prior parameters and for the same reasons. These chips 76 need to be attached to the body of the bearer, either in his watch or necklace, or implanted into two locations of the body, one at the center right of the body and the other at the back of the neck, such that their relative locations provide data about whether the human is standing, sitting, or lying on his back, side or face. The (ATC) 76 should not be removed, as it provides 24-hour direct updating about its location and even the bearer's heartbeat, to be shared not with other humans but, through a highly safe and secure wireless link, with a cloud server, to update the autonomous cars about them while they are crossing the roads...
Figure 11 illustrates a 3D view of samples of autonomous aero-land vehicles (AALV) 20, the flying Cedric vehicle 42, container aero-carriers 105, autonomous vehicles 66, UAVs 22, flying robots 26 (either winged, carried on a flyboard, or provided with a jetpack), manipulators 107, vehicle aero-carriers, UAV aero-carriers 74, a local pick-up drone 72, and an aerial facility room 68... occupying specific locations in the city land and air space, in addition to the land structures 23, buildings 24, machines 25, humans 27 crossing the road, natural or industrial obstacles 28, container 44, belt 101, autonomous trolley 113, autonomous shelves 114, the autonomous aerial delivery elevator, and the autonomous ground station interfacing aerial delivery, wherein the squares represent the (IAT) devices 21, shown separately out of the related bodies (near to them) for clarity; the lines connecting them to the neighborhood server 61 demonstrate the wireless communications.
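The wireless exchange shown in Figure 11, where each (IAT) device 21 publishes its data to the neighborhood server 61, which then shares it with the other bodies in the same zone, can be sketched as a minimal publish/subscribe store. The class and field names below are illustrative assumptions, not part of the patent text.

```python
from collections import defaultdict

class NeighborhoodServer:
    """Collects (PMGW) updates from IAT devices and makes them available
    to every other device registered in the same geographic zone."""
    def __init__(self):
        self.zones = defaultdict(dict)   # zone -> {device_id: latest PMGW data}

    def publish(self, zone, device_id, pmgw):
        """Store the latest shared data for one body."""
        self.zones[zone][device_id] = pmgw

    def neighbors_of(self, zone, device_id):
        """Return every other body's shared data in the same zone."""
        return {d: p for d, p in self.zones[zone].items() if d != device_id}

srv = NeighborhoodServer()
srv.publish("zone_A", "vehicle_66", {"gps": (25.1, 55.2), "speed_kmh": 40})
srv.publish("zone_A", "building_24", {"bbox": ((0, 0, 0), (30, 20, 90))})
view = srv.neighbors_of("zone_A", "vehicle_66")
```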
The server 61 and the City central artificial telepathy station (CATS) 112 collect instant data from all of these and communicate data back to each of them, updating each about the (PMGW) data of every part with which it needs to cooperate or near which it has a route, in addition to expanded artificial intelligence data manipulations to facilitate their movement, transportation and collaboration.
Three-stage timed strategy and methods for autonomous city transportation (Fig. 12):
- (IAT) device 21 installation.
- Use of autonomous vehicles 66, (AALVs) 20 and (UAVs) 22... The current tests and trials on autonomous vehicles 66 do not reflect the actual case where tens or hundreds of autonomous vehicles 66 of different types from different manufacturers are moving on part of the streets' lanes in between tens or hundreds of other normal self-driving vehicles 77 of different types, sizes and uses, with driving styles and manners that differ from country to country, from city to city, and from road to road. So the introduction of autonomous vehicles 66 into these differences and contradictions should not be bulky and random, without vision, because there will be contradictions in the collaboration of autonomous vehicles 66 coming from various manufacturers with different settings and specifications, and the autonomous vehicles also cannot be merged easily in between the normal self-driving vehicles 77, trucks, bikes... etc.
Succeeding in testing one individual autonomous vehicle 66 does not mean a collective success.
So there should be a bridge matching, adjusting or harmonizing these contradictions; otherwise, the autonomous vehicles 66 experiment may fail or face genuine problems when they are introduced in mass. Road users' HSE is a red line: accidents may delay the use of autonomous vehicles 66, or even push some countries or cities to cancel their introduction until they are sure that every aspect is solved via guaranteed safe and secure methods and proven long-time experience in other cities.
As a result, a clear vision proposing creative solutions ahead of such negative probabilities should not be limited to the prior art's collaboration between the autonomous vehicles 66 themselves, without including the self-driving vehicles 77 in the collaboration and modification. The real vision is achieved by creating the methods and devices that match these autonomous and normal machines in a synchronized manner, bridging the communication between them in an HSE systematic harmony. Here, the following methods will be followed to create technical solutions, methods, bridging and synchronization that let the autonomous vehicles 66 and normal vehicles 77 adapt to each other, in search of solving this issue as an urgent prior need, while many governmental authorities and transportation means manufacturers will have to hold many international summits, conferences, meetings, researches, inventions and regulations... to formalize a unified system for the (IAT) devices 21 and any other facilities, to let different transportation means from different manufacturers collaborate and adapt to any country's regulations.
All of that from one side; from another side, the following three-stage methods and strategy are to be adopted as a base for modifying and adapting a city's facilities and infrastructure to handle the new transportation means. It is obvious to the inventor that the following stages can be partially or wholly adopted depending on the country and city, that the stages or phases can overlap, and that the project timings can differ here and there. First stage (5 - 7 years):
1- Introducing and installing (IAT) devices 21 in all existing self-driving (non-autonomous) vehicles 77 which are using the roads, including: bikes, sporty vehicles, hatchback vehicles, salon vehicles, 4WD vehicles, trucks, lorries, caravans, trains, towed boats or containers... etc.
These compact (IAT) devices 21 should be programmed, set, and uploaded with all of the possible data already listed in this invention, in addition to (PMGW) data microprocessors 78, integrated circuits 79, micro-meters 80 and micro-devices 81. Besides being connected with the vehicle's instrument cluster 82 and navigation system 83, they will either be provided with an assistant (IAT) display 84 or, if available, use the existing vehicle display 84. Each should receive all data about the self-driving motion parameters: speed, acceleration, deceleration, direction, body tilt coordinates, steering angle, turn signal lights, reverse and tail lights, fog light, braking, and fault messages; it should also be connected with the (IAT) devices 21 of the semi-autonomous vehicles 85 and fully autonomous vehicles 66 via a cloud server receiving data from the neighborhood group of vehicles of all three types (autonomous 66, semi-autonomous 85 and non-autonomous 77).
The self-driving vehicle 77 (IAT) device 21 will issue instructions on the display screen 84 and announce them through speakers 86 to the driver, who can communicate with it through a voice recognition system 87. During this stage, the vehicle can also run simple-level dialogs with the driver or passengers, related fully to road issues and, in a simple way, to their personal needs. These messages include but are not limited to the following:
1- Warning the driver to provide a space in front of him for an autonomous vehicle 66 or semi-autonomous vehicle 85 which is, for example, on his front left side and trying to move right.
2- Instructing him to move right to open the lane for an autonomous vehicle 66 or non-autonomous vehicle 77 trying to pass him from behind.
3- Providing him with a real-time map of all vehicles existing on the road, whether autonomous 66, semi-autonomous 85 or non-autonomous 77, in their actual shapes, e.g.: a Ford Raptor pulling a boat of 7 meters length and 2.5 meters height... etc., with measurements to scale, in addition to shape and color (not meant to show dimensions on the display). This will assist the driver in knowing and assessing the sizes and even the types of the vehicles ahead of him.
4- Collecting data from the autonomous 66, semi-autonomous 85 or self-driving vehicles 77 ahead of a self-driving vehicle 77. These data include but are not limited to: road texture, road pothole locations, humps, wet, icy, flooded or foggy surfaces... and, for an accident, displaying the shape of the vehicle or vehicles located in the accident area, either via a real-time view or via the accident vehicles' direction and lane, the lane status, and which lane is recommended for continued driving.
5- Warning the driver about road constructions, road construction vehicles, their locations and occupied lanes, and municipality service vehicles, and opening the lane for a distant police vehicle or ambulance coming from behind.
6- The (IAT) devices 21 in the self-driving vehicles 77 are set to receive data entered by the drivers about their daily destinations, timings, driving method and speed mode. These data are used by the servers to carry out assessments of the short-term, mid-term and long-term traffic on each street and road, depending on data received from all types of vehicles equipped with (IAT) devices 21, not only the autonomous ones 66.
That is, to recommend to the driver in advance the best road to reach his destination; the driver should inform the system whether or not he accepts the recommended road, while the server will control the autonomous vehicles 66 to follow its road map instructions after offering them to the passengers. 7- The (IAT) devices 21 installed inside non-autonomous vehicles 77 will greatly assist the autonomous vehicles 66 in knowing which parking space is empty. The current art requires an autonomous vehicle 66 to use all of its devices to scan, observe and search for a parking space in between other non-autonomous vehicles 77; but when the non-autonomous vehicles share their exact locations, with shape spatial dimensions and coordinates, with the autonomous vehicles 66, then the autonomous vehicle 66 will know directly where the available parking space is and move directly towards it without circling and searching. This is also very helpful where two or more yards are completely full: the autonomous vehicle 66 does not need to waste time searching inside them for a parking space. In another way, it can know if a self-driving vehicle 77 has just started leaving its space, as the (IAT) device 21 inside an autonomous vehicle 66 will track the occupied space of all vehicle types and will be updated when any self-driving vehicle 77, or any of the other types, is leaving its occupied parking space; but if there are many autonomous vehicles 66 waiting, then the neighborhood server 61 will arrange them in sequence, giving priority to those who approached first. Note: the parking issue is solved for the self-driving vehicles 77 too, wherein these also get shared data from a local server about the empty parking spaces.
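The first-come-first-served parking sequence described above amounts to a plain FIFO pairing of waiting vehicles with freed spaces on the neighborhood server 61. A minimal sketch follows; class names and identifiers are hypothetical.

```python
from collections import deque

class ParkingAllocator:
    """First-come-first-served assignment of freed parking spaces to
    waiting autonomous vehicles, in their order of arrival."""
    def __init__(self):
        self.waiting = deque()       # vehicle ids, earliest arrival first
        self.free_spaces = deque()   # freed space ids, earliest freed first

    def vehicle_arrives(self, vehicle_id):
        self.waiting.append(vehicle_id)

    def space_freed(self, space_id):
        self.free_spaces.append(space_id)

    def assign(self):
        """Pair the earliest waiting vehicle with the earliest freed space."""
        pairs = []
        while self.waiting and self.free_spaces:
            pairs.append((self.waiting.popleft(), self.free_spaces.popleft()))
        return pairs
```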
Still, an option can be added wherein two vehicles, one autonomous 66 and one non-autonomous 77, with relatives or colleagues inside them, are looking for two nearby spaces in a car park; the autonomous vehicle 66 will be guided automatically while the non-autonomous one 77 is guided via instructions issued to the driver, such that both approach two parking spaces that are close to each other. 8- The collaboration between the autonomous vehicles 66 and the self-driving vehicles 77 is governed not only by the (IAT) devices 21, which should be created according to international standards governing all vehicle manufacturers to unify the collaboration methods, but also by telepathic intelligent learning systems (TILS) 88 related to bridging such collaborations, which should be installed in all (IAT) devices 21 and neighborhood servers 61 to learn, analyze, educate, guide and instruct either drivers or autonomous vehicles 66 in the same manner. b- Traffic lights 89: wherever a set of traffic lights 89 at a road crossing knows in advance how many vehicles are approaching from each side, it can control the lighting such that the highest number of vehicles pass in the least time, in a fully optimized manner. As a result, (IAT) devices 21 installed inside traffic lights 89 can calculate how many non-autonomous 77, semi-autonomous 85 and autonomous 66 vehicles are moving toward them, as all these vehicle types are tracked via their (IAT) devices 21, which share their own vehicles' location, motion and direction parameters... In such a way, autonomous 66, semi-autonomous 85 and non-autonomous 77 vehicles will be engaged in a semi-collaborative traffic for similar targets.
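One simple way the traffic lights 89 could use the per-approach vehicle counts reported by the (IAT) devices 21 is to split a fixed signal cycle in proportion to those counts, keeping a minimum green per approach. This proportional split is an illustrative assumption; the patent does not specify the optimization method.

```python
def green_times(approach_counts, cycle_s=120, min_green_s=10):
    """Split a fixed cycle among the approaches of a crossing in
    proportion to the number of vehicles heading toward each side,
    guaranteeing a minimum green time per approach."""
    n = len(approach_counts)
    spare = cycle_s - n * min_green_s     # time left after minimum greens
    total = sum(approach_counts)
    if total == 0:                        # empty crossing: split evenly
        return [cycle_s / n] * n
    return [min_green_s + spare * c / total for c in approach_counts]
```

For example, an approach carrying half the reported vehicles receives half the spare green time on top of its minimum.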
c- The installation of (IAT) devices 21 inside the existing non-autonomous vehicles 77 still provides vast applications and technical solutions. As far as we know from the prior art, the autonomous vehicles 66 can collaborate to move in convoys, but autonomous-vehicle 66 convoys will still create different types of restrictions for the non-autonomous vehicles 77, whether joining, passing or crossing them; meanwhile, the non-autonomous vehicle 77 drivers may disturb such convoys. The telepathy devices inside non-autonomous vehicles 77, however, can provide solutions including but not limited to the following: 1- Non-autonomous vehicles 77 are provided with options offered to the drivers through the speakers 86, while the driver can respond via the voice recognition system 87. For example, according to the preset road map in a non-autonomous vehicle 77, the driver can choose in advance to join all convoys on the road to his destination, or, while driving his non-autonomous vehicle 77, be asked, as would be shown on the display 84:
"a convoy available on the second right lane is moving at medium speed mode, do you like to join it?"
The driver can answer by saying:
"Yes" or "No"
If yes, the server 61 will assist him by providing a space in between the autonomous vehicles 66 once he turns on his turn signal lights 90. 2- Non-autonomous vehicle 77 drivers are also offered to join convoys of non-autonomous vehicles. 3- To cross any autonomous convoy, the non-autonomous vehicle 77 driver just needs to activate his turn signal light; the message will be passed through the local server 61 to the nearby autonomous vehicles 66 in the convoy to open a space for the non-autonomous vehicle. 4- Non-autonomous vehicles can pass their messages (signal lights, braking, high beam) to autonomous vehicles 66, not by being observed via the devices available in the art, which are installed on autonomous vehicles 66 to observe the physical lights, but via the local server 61, which shares these data received from the (IAT) devices 21 of the non-autonomous vehicles 77. Meanwhile, the autonomous vehicles 66 pass their messages to the non-autonomous vehicle 77 driver via lighting and, if there is no response within a preset time, via voice messages; otherwise, the driver may receive a fine for a traffic violation according to a point accumulation system, depending on the seriousness of the case. 5- Current non-autonomous governmental vehicles 91 (police, ambulance...) are to be defined and identified by both autonomous vehicles 66 and private or public non-autonomous vehicles 77 according to the type of the vehicle: ambulance, police car, presidential vehicle, road construction machinery, school bus... etc. For example, both private autonomous 66 and non-autonomous vehicles 77 are warned in advance about a police car 91 coming up from behind, wherein the non-autonomous vehicle 77 driver should take action while the autonomous vehicle 66 responds directly.
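The convoy-join voice dialog above (an offer shown on the display 84, a "Yes"/"No" answer via the voice recognition system 87, and a gap request to the server only on a "Yes") can be sketched as a tiny decision function. The return values and display handling are illustrative assumptions.

```python
def convoy_offer(display, spoken_answer):
    """Show the convoy offer on the display, interpret the driver's
    spoken answer, and decide whether to ask the server for a gap."""
    offer = ("A convoy on the second right lane is moving at medium "
             "speed. Do you like to join it?")
    display.append(offer)                       # shown on display 84
    if spoken_answer.strip().lower() == "yes":
        # Server 61 opens a space once the turn signal lights 90 are on.
        return "request_gap"
    return "keep_driving"

screen = []
action = convoy_offer(screen, "Yes")
```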
It is worth knowing how, during this stage, the most dangerous and horrible accidents are avoided through the (IAT) devices 21. Such accidents occur when a non-autonomous vehicle 77 driver is using a municipality/agriculture vehicle (truck) lane 91 and, after speeding up, suddenly meets a stopped truck irrigating the plants, where the traffic reflectors may be useless because they are just after a turn... and the right-side lane is loaded with traffic, which simply results in the speeding non-autonomous vehicle 77 being fully squeezed under the truck 91. But with the (IAT) device 21, the driver will be warned directly via the truck 91 (IAT) device 21 location shared with the local server 61, which shares it with the (IAT) device 21 of the speeding non-autonomous vehicle 77. Depending on the availability of the options, if the driver in the wrong lane does not respond, an action may be instructed by the (IAT) device 21 and carried out by a semi-autonomous 85 or non-autonomous vehicle 77 by automatically activating an engine cut-off via the digital motor electronics (DME) or engine control unit, or by activating the ABS, in addition to activating the signal lights or hazard lights to warn other vehicles of this situation, and instructing the driver to go back to his lane.
Care needs to be taken too when intending to activate such devices: municipality works on main roads sometimes close the main lanes and redirect traffic to the service lanes, and such changes on the roads should be updated on the server 61 in advance, with their activation timings, for both autonomous 66 and non-autonomous vehicles 77. d- The (IAT) devices 21 in autonomous vehicles 66 lead them directly to their pre-booked parking spaces, which are updated on the local server 61; the parking gates 92, too, are provided with (Sub-IAT) devices 67 to let them in. The same applies for non-autonomous vehicles 77, but the drivers should drive the cars to their parking spaces or activate the automatic parking option.
The (IAT) devices 21 in autonomous 66 and non-autonomous 77 vehicles not only provide data about spaces in car parks, but also about how crowded an area near a fast-food restaurant, a bank... or whatever else currently is, and how many persons are waiting in the queue. e- Looking ahead of or behind trucks: one of the most annoying issues for drivers is driving behind, or even ahead of, a vehicle blocking their view, like a salon car behind a 4WD vehicle or truck. The non-autonomous vehicle 77 drivers can get a nearly live, real-time view of the road ahead of them via the (IAT) devices 21 projecting the views ahead of them or onto the displays 84, wherein each vehicle, truck, or bike... will be shown in its actual three-dimensional shape and color, with dimensions to scale, either on the display screen 84 or via a 3D projected live (real-time) head-up display 93 with actual display coloring.
f- Street radars 94: installing the (IAT) devices 21 in the existing non-autonomous vehicles 77 will carry out monitoring of the vehicle speed. Exceeding the speed limits, crossing wrong lines, parking in wrong or non-allowed places, and racing between radars are all monitored; warning voice and written instructions are introduced to the drivers, and traffic fines should be issued instantly and proportionally, depending on the location and on whether violating the rule was momentary, mandatory and temporary due to a serious situation or without reason, considering too the time length of the violation, disturbing the autonomous vehicles 66, crossing a traffic light 89, or violating the instruction to open the lane in advance for police cars 91, ambulances... etc. g- High beam / low beam issues: high beams are very disturbing on two-way narrow roads, and some drivers do not use them properly; but via the (IAT) devices 21, especially for non-autonomous 77 or semi-autonomous vehicles 85, control will be taken automatically by switching the low beams on when the vehicles meet on a two-way road.
h- Adaptive headlight control: headlights in new vehicle models turn right or left automatically with the turning of the steering wheel. This may be helpful at very low speeds, but as the speed increases, the light turning will lag behind the steering wheel; that means the driver cannot see and observe the turn before turning the steering wheel. Through the (IAT) devices 21, specifically when the road map is selected and the street has one lane, or all lanes turning together in one direction, these lights can be turned right or left instantly, just before the steering wheel is turned, to navigate the road; or even by keeping one light consistent with the steering wheel and the other turning ahead of it, depending on the pre-selected road, the turn locations, and the speed.
i- Autonomous city airport: during the first stage, a multiple-phase project should be launched to establish an autonomous city airport, starting with a feasibility study, then selecting a site, then making urban planning for four terminals: an Autonomous Aero-Land Vehicles (AALV) 20 terminal, a UAVs 22 terminal, a logistics terminal, and an autonomous aerial reception and delivery terminal, up to starting construction of the infrastructure.
j- Buildings and UAVs: during this stage, the UAVs 22 civil service implementation will start. It will not cover every part of the city, but will be limited to the following, which are provided with (IAT) devices 21:
High-rise buildings, major hotels, major governmental departments, major courier offices, major restaurants, and multiple fulfillment centers related to online shopping. UAV 22 services can cover parcel 73 delivery, firefighting and facade cleaning. To carry out parcel 73 deliveries, the UAVs 22 should also be provided with (IAT) devices 21; the handling of parcels to the UAVs 22 can be manual, then under human control, then automated, while the delivery will be to autonomous ground stations (refer to prior cited art) provided with (IATs) and located inside the reception area for the residents' or employees' easy access, as easily installable units requiring no modifications to the buildings.
Meanwhile, to provide window 75 delivery during this first stage, the following will be carried out:
- New buildings are to be provided with autonomous aerial delivery conveyors, delivering parcels received at the building roof or ground floor, through an elevator, via a flat- or office-specific window 75 into a specific chamber (refer to prior art).
- Old building windows 75 are to be modified with motorized mechanisms that open to receive parcels.
Forest firefighting: during the year 2018 a dramatic increase in forest fires occurred, following the rising ratios of previous years. A solution for this issue, which creates destructive results no less than those of a real war, is to develop civil defense lines inside forests, whose absence is the weak point that makes fires so easy to spread. So, during the first stage, a site plan divides a forest into zones covered with different firefighting defense lines, spread as follows, to take control over fires immediately and to limit their drawbacks to the minimum:
- Firefighters carried by drones (UAVs) 22: here, every 2.0 km a firefighter drone, normally carrying a set of three fire extinguishers, is located at the right place in a forest where it can easily access a fire. Within around one minute after the first warning is received, four firefighter drone sets 70 can attend to carry out firefighting, and within 5 minutes around 15 firefighter drone sets 70 can attend. According to the distance distribution in figure 13, within the first hour hundreds of these will be attacking the fire and surrounding it from all sides; this arrangement will lower the chances of the fire extending to extreme conditions.
- Firefighter extinguisher drones 70 are to be distributed along highways every 10 km inside docking stations, where one or more standby sets of firefighter cylinders are kept and arranged correctly, such that the drones can return their empty cylinders there and pick up new full ones. That is, each firefighter docking station is provided with an extra set of filled extinguisher cylinders so the drones can return and replace their empty ones (depending, of course, on its own (IAT) device 21).
- Every 10-20 km there will be a firefighter vessel aero-carrier 105. These aero-carriers 105 are provided with vertical take-off and landing capability (Fig. 3 (A-C, F) and Fig. 4), and so, depending on their engines, either ducted propellers or jet propulsion or both with a top traditional propeller, they can approach the fires within less than ten minutes. If the weather is windy, they can handle another role by splashing fluids via pumps, or spreading powder through extending nozzles, over the flying ash to cool it before it falls and causes new fires nearby, as this flying ash is a main reason for fires spreading many kilometers, and sometimes tens of kilometers, from the first spot.
- Nearby points warehousing fire extinguisher cylinders for the firefighting drone sets 70 are to be provided every 10 km as static providers, where new refilling technologies may be applied for fast refilling of the empty cylinders; fire extinguishers are also to be carried by trucks as dynamic providers located every 50 km, wherein the firefighter drone (IAT) devices communicate with the truck (IAT) to know where to meet to hand over their empty fire extinguishers and receive new ones.
- Refilling centers for the empty vessels 44 of aero-carriers 105 are also to be provided every 50 km.
- A flying-drone arrangement installed over a civil defense truck is provided to carry hydrant hoses to fight fires near the roads or on high-rise buildings. These can be located every 50-100 km outside cities.
- During dry windy weather and periods of high fire expectation, firefighter drones can carry out routine patrols for watching, observing and monitoring any fire case.
Figure 13 illustrates a rough distribution map of the firefighting tools spread over part of a forest; these tools are not drawn to scale, and are not repeated on each part of the map, as the map is for illustration only. The symbols shown in Figure 13 have the following meanings:
- Firefighter drone set 70.
- Vessels, tanks aero-carrier 105.
- Firefighter cylinder warehouse, full vessels 44, or vessel refilling center.
- Dynamic firefighter extinguisher provider truck.
- Firefighter truck with a top drone set arrangement carrying a firefighting fluid hose.
In total, within hours these machines can be arranged autonomously into multiple convoys, strictly fighting and attacking fires on the ground and their ash in the air.
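The sizing of the drone defense lines above can be roughed out with a small calculation. The sketch below is illustrative only: it assumes drone sets on a square grid with 2.0 km spacing and a straight-line cruise speed of 60 km/h, neither of which is stated in the patent, and counts the stations whose flight time to a fire at a grid point is within a given number of minutes:

```python
# Rough arithmetic sketch of the defense-line sizing described above: drone
# sets spaced every 2.0 km on a square grid, each flying straight toward the
# fire at an assumed cruise speed. Spacing and speed here are illustrative.

def sets_within_reach(minutes: float, spacing_km: float = 2.0,
                      speed_kmh: float = 60.0) -> int:
    """Count drone-set stations whose straight-line flight time <= minutes."""
    radius_km = speed_kmh * minutes / 60.0
    count = 0
    n = int(radius_km / spacing_km) + 1
    for i in range(-n, n + 1):
        for j in range(-n, n + 1):
            dist = spacing_km * (i * i + j * j) ** 0.5
            if dist <= radius_km:
                count += 1
    return count
```

Each extra minute widens the response circle, so the number of attending drone sets grows roughly with the square of the elapsed time, matching the pattern of a few responders in the first minute and hundreds within the first hour.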
So, for the first stage, (IAT) devices 21 are a magical description for a wide range of solutions: bridging and linking the different vehicles, accustoming drivers of old vehicles over the long term to cooperate with them and be ready to use them, and even solving many other road safety and comfort issues, while inspiring inventors to find more applications for them in the near future.
Second stage (5-7 years):
During the second stage, extra autonomous 66 and semi-autonomous vehicles 85 are sold and put in use at the expense of non-autonomous vehicles 77 as their upgrading level develops further. This should be accompanied by upgrading the existing or newly produced semi-autonomous 85 and non-autonomous vehicles 77. Meanwhile, dramatic changes start on the roads and conclude within the second stage: interchangeable aero-land vehicles 20 are introduced, UAV 22 usage in multiple civil services is expanded, and the first phases of autonomous city airports are established and put in use to incubate and handle the UAVs 22 and the aero-land vehicles 20, wherein: a- Computer vision panorama cameras 95: the obstacle to using plain cameras in autonomous vehicles 66 is that, even though they are relatively capable, under some conditions, of reconstructing real-time 3D views, in the case of autonomous or non-autonomous vehicles 77 an absolute, or at least better, reconstruction of each 3D object in the views is a necessity. To overcome this issue, it is proposed here that these computer vision cameras 95 be created as panorama cameras installed on top of autonomous 66, semi-autonomous 85 and non-autonomous vehicles 77 inside a glass chamber 96 which is wiped by wipers 97, so that any dust or water splashed from the road is cleaned rather than sticking to the panorama camera 95 lenses. As a result, with panorama cameras 95 installed over each vehicle, autonomous 66, semi-autonomous 85 or non-autonomous 77, then if, for example, a person pushing a trolley on the road side later turns left to cross the road, the available view of him will not be only a side view appearing on the camera 95 of a vehicle moving forward in his direction; multiple views of him will be available from different sides, picked up by the panorama cameras 95 which collect views from all sides. As a result, the vehicles which passed him while he was on the right of the road picked up images of him from the
back side, left side, and front side, while when he turned left, the panorama cameras 95 of the vehicles which passed him took backward views, part of them of the right side of this person and his trolley; as a result, collecting such images from all sides adds up to create a perfect 3D view of him from all sides.
As artificial intelligence technologies have already started to identify the bodies pictured and viewed by computer vision cameras with their specific names and titles, as if they were viewed by a human eye and processed by a human identifying them by name, type, dimensions and distance, such that the (AI) can identify the constituents of a picture or video, the local server 61 should be upgraded with such technologies to process, assign, title and identify all the similar static and moving bodies appearing on a set of vehicle computer vision cameras, to know how to deal with them via commands sent to the vehicles.
And so, based on these views, the server 61 will identify the person and his trolley and assign them an identity (Unit (x): moving person + trolley), as if the person and his trolley were equipped with a recognized (Sub-IAT) device 67. This artificial 3D view reconstruction can also be supported by installing multiple panoramic computer vision cameras 95 at the most critical parts of a road, to pick up views which perfectly reconstruct actual 3D views from all sides with the actual dimensions, to be shared with the local (neighborhood) server 61. This whole system can be called collaborative reconstructive computer vision (CRCV) 98, wherein no more old normal vision cameras are used, while the new ones start to occupy new installations and replace old ones.
These cameras 95 can also be a better substitute during this stage for the bulky Lidars and other observatory radar, sonar, laser vision and distance sensor devices. This is not only because the cameras are compact, cheaper, need less service, have no motorized mechanisms, have less probability of being faulty, and show real-time live videos, but because computer vision technology will process the instant images sent from the live real-time camera 95 views to a microprocessor, which processes these images via computer vision algorithms recognizing shapes and objects within images, with their dimensions and even names and predicted habits and motion styles, depending on technologies available in the art. Every physical item on the road or road sides, moving or static, either vehicles, bikes, humans, livestock, cartons, trolleys, fluids, oils, rain, mud, bridges, roundabouts, traffic lights or street lights, will be recognized and identified as if it were a real object, not merely picked up by a normal camera but viewed by a human (driver's) eye which catches the view and recognizes its objects, dimensions and depth; then the (IAT) or (Sub-IAT) devices can decide according to experience how to handle and deal with these objects on the road.
So, it is not camera vision; it is computer vision replacing eye vision and simulating it. It is another tool to bridge the interaction between the autonomous vehicles 66, semi-autonomous vehicles 85 and non-autonomous vehicles 77, but their uses are handled in different ways: 1- In non-autonomous vehicles 77, they will assist in autonomous braking when a driver is about to hit an object. If the object has an (IAT) device 21, such as a vehicle identifying its occupied space, dimensions and distance, they will not enter the space of the observed vehicle plus a safety distance enveloping its 3D shape/geometry, e.g. 0.75 m. They will also apply brakes, create deceleration or even cut off the engine to prevent the non-autonomous vehicle 77 from hitting an object which is not equipped with an (IAT) device 21, such as trolleys, humans, livestock, cartons, boxes or tree parts, which are recognized via (CRCV) 98. The system can even observe and recognize oils, fluids, mud, sudden accidents and idle vehicles, not only to warn the driver or assist in deceleration, braking or even stopping, but also to share these data with other autonomous 66, semi-autonomous 85 and non-autonomous 77 vehicles so they are aware of such issues ahead and can decelerate or change lane, while the local server 61 may use these data to issue instructions for remote vehicles to change the whole route. It can also provide and share live real-time views ahead with non-autonomous vehicles 77 at the back, wherein a driver can select to view the road for the next 50 m, 250 m, 500 m or 1,000 m via live views shared from both autonomous 66 and non-autonomous 77 vehicle panorama cameras.
These views can be viewed on the screen (display) 84, projected on the head-up display 94, or, in a device further created in this invention, shown by installing a holographic dome 99 over the dashboard on the driver's side, to make the holographic real-time view inside the dome visible despite daylight or street light. The dome glass should be covered with a dark tinting film so that outer light does not get inside it; as a result it will be dark inside, while the inner projected light creating the holographic 3D real-time views will be observable.
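The safety-envelope braking described earlier, where a vehicle must never enter the space of an observed object plus a safety distance (e.g. 0.75 m) around its 3D geometry, can be sketched as a simple stopping-distance check. All names, the deceleration value and the one-dimensional simplification are illustrative assumptions:

```python
# Minimal sketch of the braking assist described above: a non-autonomous
# vehicle must never enter the space of an observed object plus a safety
# envelope (e.g. 0.75 m) around its 3D geometry. Names and the assumed
# deceleration are illustrative, not from the patent.

def needs_braking(gap_m: float, speed_mps: float, decel_mps2: float = 6.0,
                  envelope_m: float = 0.75) -> bool:
    """True if current speed cannot be shed before reaching the envelope.

    Stopping distance v^2 / (2a) is compared against the free gap, i.e. the
    measured distance minus the safety envelope wrapped around the object.
    """
    free_gap = gap_m - envelope_m
    if free_gap <= 0:
        return True                      # already inside the envelope
    stopping_distance = speed_mps ** 2 / (2.0 * decel_mps2)
    return stopping_distance >= free_gap
```

In practice the gap would come either from the object's own (IAT) device 21 or from the (CRCV) 98 reconstruction; the decision logic is the same in both cases.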
2- Autonomous vehicles 66: the computer vision panorama cameras 95 will take the major role in monitoring and controlling the vehicle's motion parameters, rather than data collected from Lidars, radars and distance sensors. During this stage the panorama computer vision cameras 95 can be installed over the Lidars 49, replacing them later as more autonomous vehicles 66 are equipped with panorama cameras 95 plus their (IAT) devices 21 locating them, either static or in motion. All of this is in addition to live real-time updates about the road situation, not only from the autonomous vehicles but also from the non-autonomous vehicles 77, and in addition to the data stored and updated every second, minute, day, week, month, year and beyond via the Telepathic Intelligent Learning System (TILS) 88 in each local server 61. The roads and their borders are daily scanned and viewed by the cameras 95 from all types of vehicles, which draws a fully detailed view of the road and each of its parts, even road cracks, each physical item on the road side, shops, barriers, etc., hanging tree branches, in addition to every user of a road, whether a vehicle, human, or even dust, with their daily timings, manners and methods. These data are passed to an Intelligent Data Educating System (IDES) 100 which uses artificial intelligence to educate, guide and instruct both the (IAT) devices 21 of autonomous vehicles 66 and the drivers of non-autonomous vehicles 77, wherein such a unified source of data and instructions will be another element bridging the understanding between autonomous 66 and non-autonomous vehicles 77.
Buildings and UAV 22 delivery: during this stage, implementation of UAV 22 civil services will continue. It will not cover every part of the city, but will be expanded to cover more building blocks, hotels, governmental departments, courier offices, restaurants, fulfillment centers and malls of medium size and importance for online shopping. UAV 22 services can cover parcel 73 delivery, firefighting and facade cleaning. To carry out parcel 73 delivery, the UAVs 22 should also be provided with (IAT) devices 21; these devices are also installed inside each pick-up and delivery facility (building) to locate items which should be picked up from belts 101, shelves 102 or windows 75, such that when an order is received online, it is sent electronically to an assigned UAV 22 with the pick-up line, shelf 102 and belt 101 details, which will be located via an (IAT) device 21 installed there, providing full data about the location's 3D spatial coordinates: location of the warehouse, shelf set, shelf line, and the exact location of the item on a shelf 102, belt 101 or window 75. The UAV 22 will fly to that specific part of the space following the preset route; it does not need to scan labels or bar codes on the item, nor to use laser communication with any device to confirm the right item, it just picks at the designated location in space where the item is located.
To make it clearer, the whole shape of the warehouse is located in space according to GPS spatial coordinates or (PMGW) parameters, including all of its dimensions, borders and surfaces. After the UAV 22 gets inside, the whole inner shape of the warehouse is provided, including shelves, lights, etc., exactly in relation to their actual shape, dimensions and location. Then, as the UAV 22 moves toward the specific shelves 102, the cell in the shelves 102, numbered with a reference number connected to its (PMGW) spatial location coordinates, is located.
So, the whole warehouse, including each physical part inside it, even tiny protrusions, is provided in its actual 3D shape connected to its spatial coordinates. The whole is divided into 3D grids of 1 mm length or less, and these grids' spatial coordinates are defined, like in a screen mesh search for spare parts, but here the search is through an actual 3D gridded system starting from bigger units to small, smaller, micro and tiny ones, such that when a UAV 22 flies toward a warehouse it navigates to it as a major grid; when it approaches, it gets inside via a gate which has a defined shape and dimensions, including its tiny 3D grid dimensions in space; then the UAV (drone) 22 crosses the defined safe route toward the specific shelf 102, and once it approaches the item (parcel)
73 to be picked up, it adjusts itself perfectly to occupy the whole set of spatial grids which fits its shape. Once it perfectly occupies this space, it is in the right place to make the pick-up, so it pushes its clamps 62 inward over whatever is under them. All of this is governed under the control, monitoring and follow-up of both (IAT) devices 21, the one in the UAV 22 and the other in the warehouse, which work together to synchronize the UAV 22 to the specific multi-grid-shaped space over the item (parcel) 73.
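The grid-occupancy condition above can be sketched as a set comparison: the pick-up is cleared only when the cells occupied by the UAV's geometry exactly match the target multi-grid space agreed between the two (IAT) devices. The cell size, the axis-aligned-box simplification and the set representation are all assumptions for illustration:

```python
# Illustrative sketch of the grid-occupancy check above: the warehouse volume
# is divided into small 3D grid cells; the UAV is cleared to pick up only when
# the set of cells its geometry occupies matches the target set computed by
# the two (IAT) devices. Cell size and box geometry are assumptions.

def cells_occupied(corner_min, corner_max, cell_mm: int = 10):
    """Return the set of grid-cell indices covered by an axis-aligned box (mm)."""
    (x0, y0, z0), (x1, y1, z1) = corner_min, corner_max
    return {(x, y, z)
            for x in range(x0 // cell_mm, x1 // cell_mm + 1)
            for y in range(y0 // cell_mm, y1 // cell_mm + 1)
            for z in range(z0 // cell_mm, z1 // cell_mm + 1)}

def cleared_for_pickup(uav_box, target_cells, cell_mm: int = 10) -> bool:
    """True when the UAV exactly occupies the target multi-grid space."""
    return cells_occupied(*uav_box, cell_mm) == set(target_cells)
```

A UAV shifted even one cell away from the agreed space fails the comparison, which is what makes scanning labels or bar codes unnecessary in this scheme.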
In the case of aerocarriers 74, which carry multiple parcels 73 for different orders from different units in one building or one compound, the aerocarrier 74 can be guided in the same manner near a belt 101 or robot 26 to fill it with parcels 73. Each parcel 73, identified by its previously occupied space in a shelf 102 cell, once picked up by a robot 26 or pushed toward a belt 101, has its location tracked by the (IAT) device 21 according to: 1- the time when it is pushed onto the belt 101, the speed of the belt 101 and thus the time when the UAV 22 approaches the belt 101, and then the parcel's 73 sequence number when it is pushed inside the aerocarrier
74 or UAV 22: is it the first, second, third, fourth..? and in which aerocarrier 74 tray, and on which side. These data are passed from the (IAT) device 21 of the warehouse to the (IAT) device of the aerocarrier 74, such that the aerocarrier 74 knows that the second parcel pushed inside it from the belt 101, or filled by the robot 26, at the spatial coordinates referring to its lower shelf 102 inner right side, refers to order no.:..., address:..., local pick-up drone no.:...
So, as can be seen, there is no need for sensors, scanners, labels, data recognition stickers, or laser emitters and receivers. The artificial telepathy data communicated between the (IAT) devices 21, tracking specific shapes from specific spatial coordinates into new spatial coordinates, with the distance crossed over a calculated time specifying their new locations, are enough to track, identify and locate a physical body.
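The belt tracking described above, where a parcel's location is inferred purely from its push time and the belt speed, reduces to simple kinematics. The function names, the tolerance and the list representation of pending parcels are assumptions for illustration:

```python
# Sketch of the sensor-free belt tracking described above: each parcel's
# position is inferred purely from the time it was pushed onto the belt and
# the belt speed, so the (IAT) devices can tell which parcel reaches the
# pick-up point and in what sequence. All parameters are illustrative.

def parcel_position(push_time_s: float, now_s: float,
                    belt_speed_mps: float) -> float:
    """Distance travelled along the belt since the parcel was pushed on."""
    return (now_s - push_time_s) * belt_speed_mps

def parcel_at_pickup(parcels, now_s, belt_speed_mps, pickup_at_m,
                     tolerance_m=0.05):
    """Return the order id of the parcel currently at the pick-up point.

    `parcels` is a list of (order_id, push_time_s) pairs as recorded by the
    warehouse (IAT) device when each parcel was pushed onto the belt.
    """
    for order_id, push_time_s in parcels:
        if abs(parcel_position(push_time_s, now_s, belt_speed_mps)
               - pickup_at_m) <= tolerance_m:
            return order_id
    return None
```

Because push times and belt speed are already known to the warehouse (IAT) device, no scanner at the pick-up point is required to associate a physical parcel with its order.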
Now the aerocarrier 74 or UAV 22 will leave the warehouse and engage a specific preset route toward its destination, where it should always adjust itself to stay on route. In case it might hit a bird or an idle falling UAV 22, it can be provided with a panoramic computer vision camera 95 to avoid such rare incidents and to readjust its track back to its original route. The aerocarrier 74 or UAV 22 loaded with parcels 73 should have its (IAT) device 21 in contact with both an (IAT) device 21 of the local pick-up drones 72 at the destination and an (IAT) device 21 in an aerial room 68 (for drones) or warehouse installed over a building, housing a local pick-up drone set 72, a firefighter drone set 70, and a facade cleaning drone set 71 (Fig. 10 (A-C)).
Like the (IAT) devices 21 of autonomous 66 and non-autonomous 77 vehicles, the (IAT) device 21 of the aerocarrier 74 or UAV 22 will be in contact with local servers 61 to share and receive spatial location and motion parameters, not only about the neighborhood UAVs 22 to avoid collisions, etc., but also to receive data about all the high-rise physical objects via their (IAT) devices 21: for example, a tower's whole (PMGW), including shape, dimensions, and coordinates of it as a whole and as partial grids in space.
Once the aerocarrier 74 approaches the building, it moves directly toward specified spatial coordinates according to the gridding system where the local pick-up drones 72 are located. Once its (IAT) device 21 finds it is close to the room 68, by measuring the distance between its body (geometry) spatial location and the geometrical spatial location of the UAV room 68, it slows down gradually according to the distance and adjusts its direction toward the direction and coordinates of the room's 68 gate 69 as received from the room's (IAT) device 21. It keeps adjusting its distance, speed and direction according to the spatial coordinates between them until it arrives, whereupon the gate 69 is opened. The pick-up drone 72 receives signals about the spatial coordinate location of the aerocarrier 74 or UAV 22, and so it adjusts itself toward the gridded coordinates of the parcel 73, received from the occupied gridded shape, dimensions, coordinates, height and direction of the aerocarrier 74 with its parcels 73. Next, the local pick-up drone set 72 receives data about each parcel 73 according to its spatial gridded coordinates, so when a local pick-up drone 72 adjusts itself to pick up something under it, it knows that this thing is a parcel 73 referring to order no.:..., address:..., local pick-up drone 72 no.:.... As a result, the pick-up drones 72 start one by one to unload the parcels 73, such that each one carries a parcel 73 and adjusts its coordinates to have its front direction pointing to the gate 69 exit; then, according to the delivery procedure, the local pick-up UAV (drone) 72 flies as follows:
1- Flat/office window 75 delivery: the drone 72 receives from the (IAT) device 21 in a building the exact gridded spatial coordinates of the specific aerial delivery window 75 and adjusts itself to follow a safe, obstacle-free route toward that window at an artificially calculated speed. There, the window 75 opens for dropping the parcel 73 on a table or shelf, as instructed by the signals sent from the (IAT) device 21 of the building or the (Sub-IAT) device 67 installed in the window 75. The building (IAT) device 21 calculates the distance, angles, coordinates and height of the pick-up drone 72 according to location data received from the pick-up drone 72 itself for the delivery address details; when it finds that the drone's geometry is approaching and close to the window 75, it either sends a signal to the window 75 motorized mechanism to open, or has the (Sub-IAT) device 67 of the window 75 do that. Then, according to calculations of (PMGW) between both (IATs), it is computed how far the local drone 72 should move, in which direction and tilt, adjusting itself according to the instructed speed, angles and directions, to drop the parcel 73 and move back; the window 75 closes, and the drone returns to the room 68 to pick up another parcel 73, and so on. 2- If the delivery is via a conveyor elevator (Fig. 11) specified for parcels 73, or via an autonomous ground station, the building telepathic device 21 will provide the spatial coordinates for the pick-up drone 72 to navigate toward the delivery station, conveyor elevator or belt, which may be provided with a (Sub-IAT) 67.
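The gradual approach described above, where speed shrinks as the shared coordinates converge, can be sketched as a proportional controller on the distance to the gate. The speed limits, the gain and the 3D-point representation are assumptions for illustration:

```python
# Sketch of the gradual approach described above: the aerocarrier scales its
# speed down in proportion to the remaining distance to the room's gate,
# computed from the two devices' shared spatial coordinates. The speed limits
# and the proportional gain are illustrative assumptions.

def approach_speed(pos, gate, cruise_mps=10.0, gain=0.5, dock_mps=0.2):
    """Speed command that shrinks with distance to the gate coordinates.

    `pos` and `gate` are 3D coordinate tuples shared between the (IAT) devices.
    """
    dist = sum((p - g) ** 2 for p, g in zip(pos, gate)) ** 0.5
    speed = gain * dist                     # proportional slowdown
    return max(dock_mps, min(cruise_mps, speed))
```

Far from the gate the command saturates at cruise speed; close in, it floors at a slow docking speed so the gate alignment can complete smoothly.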
In the case of fire, the smoke and fire detectors and sensors communicate instantly with the building (IAT) device 21, which knows from preset data and connections their locations: which floor, which flat and even which room. As a result, the spatial coordinates of an office/flat/floor under fire are directly and precisely located, and a message is sent to a firefighter drone 70 set, which follows a route toward the spatial coordinates of the fire. The firefighter drone 70 set can either follow the stairwell space for inner fires, or be guided to fight the fire from outside the building. The firefighting is carried out according to heat sensors 103 sensing the hottest spots to be extinguished, or by being monitored from the civil defense station. These firefighter drones 70 can be provided with long metallic rods with metallic ball ends that can break a window 75 glass and get inside if necessary.
The facade cleaners 71 can be preset for whole-building facade cleaning, and can additionally be booked for private cleaning of specific windows 75 of some offices. Given the whole building facade's spatial dimensions, including fully gridded details and spatial dimensions of the protrusions, the facade cleaner drone is guided in the same manner over the facade from top to bottom, but when it faces a protrusion it will not hit it; it neither needs distance sensors or cameras to distinguish it, nor needs to be remotely controlled. It just receives the order, timing and specific facade part, then moves there to connect to a retractable thin water hose and starts cleaning; wherever it meets a protrusion, as specified and located by the building (IAT) device 21 compared to its own (IAT) device 21 data, it simply passes it by moving its geometry boundaries a specific measured distance away from the geometrical boundaries of the space occupied by the protrusion at specific heights, and back to the facade, according to a comparison and computation of distances/angles between its instantly adjusted spatial location and the received spatial location of any protrusion, such that its body-border topography never crosses a protrusion's occupied spatial coordinates, while its brush distance is calculated by its own (IAT) device 21, which computes the distance from the spatial coordinates of the facade glass, also received from the building (IAT) device 21; as a result, the squeegee (brush) is positioned in perfect contact with the glass.
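The protrusion handling above can be sketched as a height-dependent standoff rule: at heights where the building (IAT) data lists a protrusion, the drone keeps an extra horizontal clearance equal to the protrusion depth plus a margin. The height-band dictionary, the base standoff and the margin are assumptions for illustration:

```python
# Sketch of the protrusion handling described above: while descending a
# facade, the cleaner drone checks the building (IAT) data for a protrusion
# at its current height and, if one is listed, offsets its standoff distance
# by the protrusion depth plus a clearance margin. Values are illustrative.

def standoff_at_height(height_m, protrusions, base_standoff_m=0.3,
                       clearance_m=0.15):
    """Horizontal distance to keep from the facade glass at a given height.

    `protrusions` maps (low, high) height bands to protrusion depth in m,
    as received from the building's (IAT) device.
    """
    for (low, high), depth in protrusions.items():
        if low <= height_m <= high:
            return base_standoff_m + depth + clearance_m
    return base_standoff_m
```

Since the protrusion data comes entirely from the shared (IAT) geometry, the drone needs neither distance sensors nor cameras to avoid the obstacle, matching the passage above.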
Note furthermore that the drones' warehouse (room) 68 is provided with outer top solar cells 104 for charging the inner drones' batteries (Fig. 10).
Building, public, and private car parking: during this stage, building car parks are also identified via (IAT) devices 21 specified and installed inside each parking yard, wherein all passages and individual parking locations are identified and located for each vehicle. As a result, each autonomous vehicle, according to its identification and (IAT) device 21, will be let in through the parking security gate without a security allowance to open the gate, cards or stickers; the vehicle will move, or be driven, directly to the booked parking spatial coordinates on the right floor.
c- Interchangeable aero-land vehicles 20: these too are provided with telepathy devices 21. In the air, these guide them and share data with the same procedure as for UAVs 22, while for landing at an autonomous city airport they book their landing field and spot in advance, communicating their arrival time to the main (IAT) device 21, which manages the whole traffic over the autonomous airport, and to the (AALV) 20 terminal (Sub-IAT) devices 67; these will provide spatial coordinates for them to carry out a safe landing. The same applies to take-off, either for individual (AALVs) 20 or for convoys. While moving on land roads, the (AALVs) 20 move exactly like land autonomous vehicles.
d- Autonomous city airport: the autonomous city airport will be ready at the start of the second stage to receive (AALVs) 20, while during the next phases, via a shipping terminal, it will handle aerial containers carried by container aero-carriers 105 (Fig. 3), wherein each terminal and its facilities, yards, and take-off and landing spots are provided with (IAT) devices 21 to control locating and moving the machines and containers autonomously.
e- Street posts, signs, radars, traffic lights, road reflectors, etc.: removal of these will start, as unnecessary road facilities, wherever they are found to be no longer useful; they can be recycled or sold to other old-system cities. This process, which starts during the second stage, will finish at the end of the third stage, by which time none of them will remain on the roads. A question may be raised: why remove signal lights? Answer: because (IAT) devices 21 will manage the vehicle traffic crossing a road junction. Another question: why remove road posts? Because postings will be announced inside autonomous vehicles on movie (TV) screens 106.
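One way the signal-free crossing management mentioned above could work is slot booking: the local (IAT) coordinator grants each arriving vehicle a non-overlapping time window for the junction box. The booking structure, the slot length and the greedy scheduling are hypothetical, sketched here only to make the idea concrete:

```python
# Hypothetical sketch of signal-free crossing management: the local (IAT)
# coordinator grants each arriving vehicle a non-overlapping time slot for
# the junction box, so no traffic lights are needed. The slot length and
# booking structure are assumptions, not from the patent.

def book_crossing(slots, arrival_s, occupy_s=2.0):
    """Return the granted entry time: the earliest time >= arrival that does
    not overlap any already-booked (start, end) slot; records the booking."""
    start = arrival_s
    for booked_start, booked_end in sorted(slots):
        if start + occupy_s <= booked_start:
            break                        # fits before this booking
        if start < booked_end:
            start = booked_end           # shift past the conflicting slot
    slots.append((start, start + occupy_s))
    return start
```

A vehicle arriving while the junction is free enters immediately; one arriving during another's slot is simply told, over the (IAT) link, to time its approach for the next free window.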
f- Voice recognition and dialogs: voice recognition in autonomous 66, semi-autonomous 85 and non-autonomous 77 vehicles will be upgraded from simple questions and dialogs into medium-level dialogs, wherein the vehicles can run a medium-level dialog with the driver or passengers related to their private requests, contact them while they are out to meet them for pick-up, or go and pick up some items.
Third stage (5-7 years):
This is the final stage, and its aim is to change the city to be fully autonomous: all physical dynamic items, plus the static items in contact with them, will be fully equipped with either (IAT) 21 or (Sub-IAT) 67 devices, while man-driven machines will be decreased to the minimum, such that: a- Non-autonomous vehicles 77 nearly disappear from all roads in modern cities, while agricultural machines in the farms harvesting, irrigating and spraying the plants, construction trucks and machines, oil field drilling machines, sea boats, ships, yachts, submarines, helicopters, aircraft, and maybe satellites and space stations, will either be fully autonomous or at least equipped with suitable (IAT) 21 or (Sub-IAT) 67 devices. Autonomous vehicles 66 become more self-dependent; they can do tasks without passengers on board, they just inform the owners or users, discuss with them, and arrange for substitute vehicles, either from the auto-dealer service, car rental, or other household vehicles which can assist and substitute for them when they are busy. Examples include but are not limited to: 1- Fixing an appointment with an auto dealer for periodic checkup, service, maintenance, repair, or updating of its electronic programs; they do not need to carry passengers, they can visit the auto dealers by themselves, wherein the gates autonomously open for them, and they know over which lifter they have to park. There, mostly robots 26 or manipulators 107 provided with (IAT) 21 or (Sub-IAT) 67 devices can move to the specific parts of the vehicle, where the vehicle's (IAT) device 21 locates and shares its spatial location, such that the robot can localize and approach specific plugs, data diagnostic connectors, defective parts, screws, batteries and wheel bolts according to the gridding system of the vehicle in space. At the end of service the autonomous vehicle 66 will allow the
auto-dealer online service to deduct the service charges from the owner's credit card, while during the whole service process the autonomous vehicle 66 can share with the owner a report about the work in progress and recommend approval of some or all repairs. When the service inside the workshop is done, the repaired autonomous vehicle 66 is joined by a robotic road tester 26 to carry out a road test; if any fault happens during the test, the robot 26 can take control and contact the autonomous recovery service 108, human interference being called for only in extreme conditions. After the vehicle is back and fixed right, it will move by itself to the washing bay, where it will be washed, cleaned and dried, and then it will leave by itself back to its specific parking near the owner's home or company, sending the message "I am back" to arrange for returning the replacement or rental car (autonomously) after confirming that the owner has no valuables inside it. In the same way, the self-dependence of the autonomous vehicle includes the following: 1- To move by itself to receive a wireless electric charge from a nearby charging facility. 2- To carry students to school and come back alone, then to bring them back on time. 3- No more valet parking, as the vehicle 66 will find its clear way toward a free (empty) parking space without searching, and will come back once called, so no hired or employed drivers are required. 4- On the highways, at the end of the third stage, single autonomous vehicle 66 convoys can include hundreds of vehicles, wherein traffic smoothness will replace heavy traffic. 5- Autonomous ambulances 109 can communicate with UAVs 22 carrying an ambulance capsule 110, loading and evacuating an injured (horizontal capsule 110) or stuck person (vertical capsule 110) from a nearby narrow place between buildings, mountains, a nearby island, deep valleys, or an
unapproachable location which is under fire, and to receive them in a specific agreed location (Fig. 14). 5- Autonomous vehicles 66 can go to the markets by themselves to receive online orders from autonomous supermarket/restaurant, wherein the goods, parcels, luggage, packages, or items are filled and handled limitedly but not specifically like any of the following methods:
- Separate motorized manipulators 107 near specific shelves 102 pick up the items and fill them inside specific motorized autonomous trolleys 113 referring to specific orders; the trolleys navigate their ways to specific shelves 102 according to their (IAT) device 21, which tracks the ordered items according to the locations specified by the shelves 102 (IAT) device 21. These trolleys 113 circulate in-between specific shelves 102 according to arrival priority; once they approach a destination where their ordered item is available, the manipulator 107 is activated to pick and fill the item according to a 3-point localization method of (IAT) devices 21 communication, wherein point 1: trolley basket location, point 2: manipulator location, point 3: item location on shelf (Fig. 15). - The autonomous trolley 113 moves to face a specific part of an autonomous shelf 114 where a specific ordered item is located according to (IAT) devices 21, then the autonomous trolley 113 adjusts itself up or down to receive the item, wherein the item is pushed to be dropped inside the trolley conventionally, for example as in vending machines, or via a piston-rod-plate set or any similar mechanism which pushes the selected item to drop inside an adjustable-height trolley.
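As an illustrative sketch only (not part of the disclosure itself), the 3-point localization above reduces to simple vector arithmetic once the three (IAT) devices share coordinates in a common grid; the function name and the example coordinates below are hypothetical:

```python
import math

def pick_fill_vectors(manipulator, item, basket):
    """From the three IAT-reported points (point 2: manipulator,
    point 3: item on shelf, point 1: trolley basket), compute the
    reach vector (manipulator -> item) and the drop vector
    (item -> basket), plus their straight-line distances."""
    reach = tuple(i - m for i, m in zip(item, manipulator))
    drop = tuple(b - i for b, i in zip(basket, item))
    dist = lambda v: math.sqrt(sum(c * c for c in v))
    return {"reach": reach, "reach_dist": dist(reach),
            "drop": drop, "drop_dist": dist(drop)}

# Hypothetical coordinates in metres within the store's grid.
result = pick_fill_vectors(manipulator=(1.0, 2.0, 1.0),
                           item=(1.0, 2.0, 1.5),
                           basket=(1.4, 2.0, 0.8))
```

In such a scheme the manipulator would first travel along the reach vector, then along the drop vector, with no vision or barcode step in-between.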
Or the autonomous trolley 113 itself is provided with robotic arms (manipulators) 107 to pick up specific items, carrying out autonomous shopping on behalf of remote persons making online orders. Once the trolley is full, it can tie/close/seal a major bag/carton inside it which houses the whole order.
- The autonomous trolley 113 then moves toward the back door of an autonomous vehicle 66 waiting to receive the order. Via (IAT) device 21 communication in-between the autonomous vehicle 66 and the autonomous trolley 113, the vehicle opens the back door; then, after the trolley adjusts its height, receiving the order can be carried out in multiple manners: either its front and rear vertical sides open and a motorized drive manipulator 107 or a hydraulic-robotic piston-rod-plate faces the trolley 113 from the back, wherein the piston pushes the rod, which pushes the plate, which pushes the sealed order inside the autonomous vehicle 66; or a motorized robotic manipulator pushes the order inside the vehicle (Fig. 16); or robotic hands located there pick up the order from the trolley and drop it inside the autonomous vehicle; or facilities provided with (IAT) devices 21... deliver the orders from specific windows 75 with delivery manipulators 140, belts... etc., which hand the pre-orders in sequence to the autonomous vehicles 66 facing them. Then the autonomous vehicles 66 bring the order to the person who ordered it, of course depending on artificial telepathy communication. - Or the autonomous vehicle 66 itself is provided with its own autonomous trolley 113, wherein it drops it in a specific location near a shopping center by providing access for it over a tilted smooth stair (tilt gate); the autonomous trolley 113 can do the shopping as explained, then come back, wherein it gets in via the tilted smooth stair (tilting gate) (Fig. 17); both go back home, wherein the secured autonomous trolley 113 is dropped in front of the building gate to move to the lift, then to the specific owner's flat, get access, and deliver the order, while the autonomous vehicle 66 leaves to its specific parking.
Note: this method of trolley in-out of autonomous vehicle 66 can be applied too for wheelchairs carrying persons with special needs, wherein autonomous vehicles will go far toward meeting their needs.
- Or the delivery of the order is done inside a shopping center parking via a method similar to the capsule delivery in Fig. 14, wherein a motorized driven manipulator picks up from in-between normal shelves 102 an order which is manually or autonomously prepared and hands it to a UAV 22, which flies out and hands the order to an autonomous vehicle 66, semi-autonomous vehicle 85, or non-autonomous vehicle 77, for example via a specified opening in its roof (Fig. 18).
Autonomous city airports: these are expanded to be a business hub for the whole city, controlling the transportation of humans and goods, wherein extra terminals are opened with fulfillment centers receiving dividable containers from ports, industrial cities... either by air via aero-carriers 74 (carrying containers 44) or conventionally via autonomous trucks 111 by land, such that the multi-level fulfillment centers divide the containers 44 into smaller dividable ones, and distribute their items via autonomously driven belts through multiple elevators, levels, and belts, to be handled to the aero-carriers 74 carrying the parcels 73. This process is not carried out via scanning labels or bar codes but by arranging the items in a theoretically numbered sequence, assigning numbers for each item order according to its location, and tracking its location via the movement sequence it follows. As an example, a parcel located in the fourth row, third column inside a container will already be assigned the number 43; then a robot 26 which is unloading the container items, once it picks up this item, will know that this item, which occupies these spatial coordinates in the container 44, is item 43 referring to a specific order..., and when the robot 26 locates it on a part of the belt 101 which will drive it to the aerial handling outlet to be picked by a drone 22, the (IAT) device 21 of the belt will be acknowledged by the (IAT) device of the robot 26 that this item (parcel), which is put at this instant for example on cell 346, is parcel 43 referring to a specific order...; when the drone 22 receives it at the exit, it will be updated by the telepathy device of the belt 101 that this is parcel 43 referring to order details... and so on. The location in the spatial coordinates identifies each item... rather than stickers, labels, barcodes..., even though these parcels 73 carry order details. 
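The label-free numbering scheme above (row 4, column 3 → 43; belt cell 346 → parcel 43 → order) can be sketched as a minimal registry. The `row * 10 + col` encoding and the order reference string are assumptions drawn only from the "43" example in the text:

```python
def parcel_id(row, col):
    """Assign an identifier from the parcel's position in the
    container: row 4, column 3 -> 43 (no label or barcode needed).
    The row*10+col encoding is an assumption from the example."""
    return row * 10 + col

class BeltTracker:
    """Minimal registry: the robot's IAT device tells the belt's IAT
    device which parcel sits on which belt cell; the drone later asks
    the belt which order a cell's parcel belongs to."""
    def __init__(self):
        self.cell_to_parcel = {}
        self.parcel_to_order = {}

    def load_container_item(self, row, col, order_ref):
        pid = parcel_id(row, col)
        self.parcel_to_order[pid] = order_ref
        return pid

    def place_on_belt(self, cell, pid):
        self.cell_to_parcel[cell] = pid

    def order_at_cell(self, cell):
        return self.parcel_to_order[self.cell_to_parcel[cell]]

belt = BeltTracker()
pid = belt.load_container_item(4, 3, "order-A17")  # hypothetical order ref
belt.place_on_belt(346, pid)
```

Each hand-off (robot → belt → drone) would then pass only the cell or parcel number, with identity recovered from the registry rather than from a physical label.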
So, the autonomous airports in the third stage will be handling parcels 73 to UAVs 22 to manage city retail commerce, in addition to restaurants sending meals via UAVs 22, from one terminal inside the city airport, under one authority running all the facility departments and machines via artificial telepathy communications, rather than vast numbers of restaurants, shops, malls, and couriers scattered randomly around a city with unorganized aerial delivery routes. Meanwhile the city autonomous airport authority (Server 61 and CATS 112) will run the operations and routes of thousands of AALVs 20, UAVs 22, and aerial delivery rooms 68 in the air over a city via communicating with their (IAT) devices 21, and will provide recovery service.
Robots 26: robots 26 can approach their destinations easily depending on artificial telepathy, wherein they can recognize each tiny part of anything in front of them equipped with an (IAT) device 21. For example, a robot sitting in front of a computer, depending on the geometrical positioning of its fingers in relation to the geometrical positioning of each key on the keyboard, can move and locate its two hands and fingers over the keyboard and start typing. In the same manner it can serve, diagnose, service, repair, and maintain either dynamic or static objects which have well-known spatial coordinates (geometrical topography) provided by their (IAT) devices 21, such that a robot 26 can carry out the job of a surgeon, a technician, or even a pilot.
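A minimal sketch of the geometric keyboard targeting described above, assuming the keyboard's (IAT) device reports its origin in the room grid and the per-key offsets are known from the layout; the 19 mm key pitch and all coordinates are hypothetical:

```python
def key_world_position(keyboard_origin, key_offset):
    """The keyboard's IAT device shares its origin in the room grid;
    each key's offset within the keyboard layout is known, so the
    robot derives the key's absolute coordinates without vision."""
    return tuple(o + d for o, d in zip(keyboard_origin, key_offset))

# Hypothetical layout: 19 mm key pitch, key at column 1, row 1.
PITCH = 0.019

def key_offset(col, row):
    return (col * PITCH, row * PITCH, 0.0)

target = key_world_position((0.50, 0.30, 0.75), key_offset(1, 1))
```

The same origin-plus-offset arithmetic would apply to any serviced object whose internal topography is published by its (IAT) device.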
e- City facilities: during this stage, everything scanned, pictured, and located by all kinds of vehicles and UAVs during stages 1 and 2, in addition to their movements through every point inside the city, these data, in addition to any uncovered ones, will be used to establish a city central artificial telepathy station (CATS) 112, or city minds, locating each physical dynamic or static part in the city, such that UAV 22 and robot 26 services can be extended to be comprehensive, covering every corner in the city without a danger of hitting anything. During this stage other machines too become autonomous; examples include but are not limited to: For people living in compounds over shopping centers or hypermarkets, they can make their orders online, wherein their ordered items will be filled via an autonomous robot 26 inside an autonomous trolley 113 from autonomous shelves 114; once the trolley 113 is filled, it will be motorized/driven towards the specific flat, wherein an assigned password will open it for the customer to receive his order. Another example: communications in-between production lines, wherein the machines can communicate data to each other such that: a- if a machine faces a problem, the others in the production line will not continue working blindly such that the fault may approach them or even damage them; b- the machines will know the current situation of a product and its specific location before it approaches them or is received from the preceding ones in a faulty case.
f- Humans and livestock: Even though the prior art shows examples of humans implanted with RFID chips (transponders) 53, in the third stage all humans have to be equipped with highly secured (IAT) devices 21, to be added to the recognized dynamic physical bodies which become observable and recognized by other machines and structures. g- Police and traffic fines: Policemen will disappear from the roads, and traffic fines will disappear too; the role of the police will be to watch the security of these (IAT) 21 and (Sub-IAT) 67 devices, and to be notified of any failure of them or any physical attack, hacking, or cyber-attack. h- (IAT) 21 and (Sub-IAT) 67 device tiny sizes: During this stage the telepathy devices should become very tiny, as nano-technology, quantum physics, and modified artificial intelligence are expected to be used vastly to modify and produce them. As a result, twin tiny telepathic chips (TTCs) 115 can be firmly attached or implanted in or on a human body surface, as well as on livestock, to support existing technical applications in the prior art in addition to new ones such as: 1- A vehicle can leave its parking and come to its owner according to his shared spatial location via his (TTCs) 115. 2- Any person can know in advance if a restaurant, bank, park... is crowded; he can even know how many people are waiting in a queue at selected facilities, so it is not only vehicle traffic levels that can be known, but human ones too... 3- The (TTCs) 115 can be a passport, identity card, ticket, gate pass, multiple-device pass, and credit card updated with data, to be used to identify and charge its bearer just by being identified and communicated with via a telepathic device in any facility computer, gate, robot, cashier machine... etc. 4- The hostess in aircraft, trains... can recognize easily on any display that the right number of passengers, and the correct ones, are seated in their own specific seats. 
6- A person filling an autonomous trolley 113 or basket with items can be informed by the trolley of the total value; he can just press accept, and he will be charged directly, the value being deducted from his credit without being counted by a cashier. 7- The (TTC) 115 can be loaded with many applications related to health, job, and security gate passes...
At the end, the artificial telepathy which is applied on land and in the air will be applied on and in seas and rivers, and even in-between remote space station tools, devices, and machines, for carrying out tasks and operations perfectly without unsafe, risky, inaccurate human interference. i- Human to machine telepathy: the developments are extended during this stage to include many areas of human to machine telepathy, such that: 1- the currently and future developed tiny electronic chips, which are swallowed to move inside a human body performing the job of a tiny lab, will transfer their data to the (TTCs) 115, which too may be connected to the human nervous system, watching his pains and heartbeat, to be passed to his doctor, or even to the nearest hospital's (IAT) devices 21 in emergencies (during 2010, scientists in Canada managed to connect neuro-chips to many nerve cells in the brain). 3- The human (TTCs) 115 can read thoughts directly or indirectly using other tiny devices; (TTCs) 115 may receive verbal voice or telepathic orders from humans, and pass these orders to an autonomous vehicle 66 to come to address:... at time:..., or to order a robot 26 to do a task by receiving verbal voice or telepathic thoughts and orders via its (IAT) device 21, or even to order some viewed items online via specific gestures, and to select the method of delivery via specific gestures too.
j- Human to human telepathy: even though it looks complicated, (TTCs) 115 connected wired or wirelessly to the nervous system can assist humans provided with synchronized, mating, or identified telepathic chips, with a downloaded application on their smart mobiles, watches, lenses, virtual reality lenses... etc., with an option to make a telepathic phone call, wherein via such a call humans can share their viewed visual scenery, heard acoustic sounds, thoughts, discussions, and emotions, and even share relatively adjusted pains with doctors, or full pain with artificial robotic doctors, to know, assess, and diagnose a disease, or even to train the patients. A human's (TTCs) 115 can receive data from a nano-lab chip circulating inside his body about the ratios and concentrations of microbes, germs, bacteria cells, cancer cells, biochemical constituents... and recommend that he adjust his food, suggest a food list or diet, or report these remotely to a doctor or to an artificially intelligent robotic doctor... The sceneries and sounds can be shared with the visual nerves in the brain of a blind person walking near a normal one to navigate his road clearly. Recognition of the spatial coordinates of dead humans' locations or living humans' heartbeats wherever they are trapped under buildings or structures collapsed due to earthquakes, volcanoes, and floods, or even trapped inside crashed aircraft. Humans can pick up photos and scenery video, and record sounds and voices, without cameras or sound recorders, by passing these from their eyes' visual nerves, ears... to the memory of their (TTCs) 115. The voice of a preacher speaking to crowds is passed without mikes or speakers 86. Selected channel news at a specific timing is received via (TTCs) 115 without watching or listening to any device. A coach can instruct his team or any player in the field via his and their (TTCs) 115. 
A football game video assistant referee (VAR) can locate artificially whether a player got into offside by installing such (TTCs) 115 in the players' shoes, informing the main referee autonomously and instantly. An arrangement of these (TTCs) 115 can be installed inside a football to decide instantly if the ball crossed the goal line or other field lines. A reporter drone provided with a TV screen, mike, and speakers 86 can navigate through the crowds to approach a specific corner depending on (IAT) devices 21. A teacher can pass his telepathic lectures to his receiving students anywhere, wherein virtual reality boards or glasses can be used as the background for eyes and senses projected views. A person can know if his flat or villa (IAT) device 21 discovered an undefined person with an unidentified (TTCs) 115 crossing its identified shape boundaries or dimensions, such as a thief attempting a theft. A facility outdoor built-in drone can be launched to track anybody if any mass shooting, attack, or theft attempt is observed or heard by the building (IAT) device 21 when an unidentified person passes its barriers or boundaries. Personal weapons, pistols, guns... can be provided with (TTCs) 115 that may inform the police instantly if shooting is made. A person anywhere getting injured, having a heart attack, or being killed can be approached according to messages received by the nearest medical center about his heartbeat... or other health safety parameters; a message can be passed to the police if the artificially analyzed parameters are related to an attack. Streets, homes, or other facility lights can be operated according to the availability of (IAT) devices 21 or (TTCs) 115, according to their types and quantities in specific locations, without the need for sensors sensing them. Telepathically inspecting chips (nano-labs) (TIC) 116 can analyze and gather data from undersea or inside an oil well and pass it to nearby ones up to the deck on the top.
So, the third stage of the comprehensive autonomous city, or the artificially telepathic city, will not end by being limited to removing non-autonomous vehicles 77, posts, columns, traffic lights, signboards, and reflectors, and even removing many lights or switching them off while roads are not in use, as they are no more required for drivers to view the roads; or to fully solving vehicles' high traffic on the roads, humans' random crowds in the markets and malls, long queues, or time wasted on driving or circulating between shelves, shops, authorities, farms... etc.; or to using fewer cameras, less use of smart screen devices, no pollution, no ID cards, passports, bank cards, keys, or even wallets; but it also brings comprehensive revolutionary high-end HSE, training, education, security, energy saving, organized commerce, organized low aero-space, aerial, and land traffic infrastructure, and a better, faster understanding governing human, robot, and machine communications.
And so, such a piece of chip can be a peace of mind for humanity's happiness. Based on its conceptual features and tremendous benefits, research can be launched into other new branches of telepathy, including but not limited to: biological and microbiological telepathy, cellular telepathy, chemical telepathy, biochemical telepathy, medical telepathy... using newly created chemical and biological types of software, up to atomic, quantum, and space-time telepathy.
Obstacles can be put against human-machine-human artificial telepathy, but the current art has already provided the industry with successful trials passing neural signals via artificial devices from one part of the human body (brain or spinal cord) to the legs. Another major obstacle is that crossing thoughts in human minds, if transferred to the other party, may disturb the whole communication. But nowadays there are millions of different wireless telecommunications and radio and TV stations... loaded over electromagnetic waves without interference, as these data are carried on different wavelengths and frequencies, and filters are used in devices to filter out interference; the same principle can be applied to crossing human thoughts after further investigation, wherein only specific subject thoughts are communicated in-between human telepathic devices.
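The radio-channel analogy above can be illustrated by separating two mixed sinusoidal "streams" by frequency correlation. This is a generic signal-processing sketch with hypothetical frequencies, not a claim about neural signals:

```python
import math

def channel_power(signal, freq, rate):
    """Correlate a sampled mixture with a reference frequency to
    measure that 'channel', illustrating how co-existing streams on
    different frequencies can be read out without interfering."""
    re = sum(s * math.cos(2 * math.pi * freq * i / rate)
             for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * i / rate)
             for i, s in enumerate(signal))
    return (re * re + im * im) / len(signal)

# One second of a mixture of two 'streams' at 50 Hz and 120 Hz.
rate = 1000
mix = [math.sin(2 * math.pi * 50 * i / rate) +
       math.sin(2 * math.pi * 120 * i / rate)
       for i in range(rate)]
```

Probing `mix` at 50 Hz yields a large power while probing at an unused frequency such as 80 Hz yields nearly zero, which is the separation-by-frequency principle the paragraph invokes.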
Devices supporting (IATs) performance: During all stages, some services and scenarios are supported with (IATs) 21 and other devices, tools, and sensors which provide the artificial telepathy device with the data needed to be shared with others:
Examples for the artificial telepathy devices interactive collaboration:
1- A modified flat, office, villa, or warehouse window 75 with aerial reception capabilities, which is equipped with an (IAT) device 21, will follow up the shipment or parcel 73 route carried by the UAV 22 and can update the buyer...; once the UAV 22 arrives, the aerial delivery window 75, already recognizing the visitor (delivery drone 22 or local pick-up drone 72), will automatically open to receive the shipment 73 without communication via scanning tools or laser beams; the artificial telepathy recognition is enough. According to its type, as demonstrated in Fig. 19 (A-C), the window 75 will open in one of the following manners:
a- Linear (sliding) reception mechanism 117: The window 75 glass is opened upwards, and a receiving box 118 with an open top slides on rails or is pushed conventionally outwards to receive the shipment 73; then its weight or load proportioning sensor 119 measures the weight or senses the load of the shipment 73, then the box 118 is pulled conventionally inward and the window 75 is closed (Fig. 19-A).
b- Swinging reception mechanism 120: The window 75 glass is opened upwards, and a reception basket or box 118 with an open outer side is swung conventionally outwards to receive the shipment 73; then its weight or load proportioning sensor 119 measures the weight or senses the load of the shipment 73, then the basket 118 is swung conventionally inward and the window 75 is closed (Fig. 19-B).
c- Rotary reception mechanism 122: The window 75 glass is opened upwards, and a receiving box 118 with an open top is rotated conventionally from the window 75 sides outwards to receive the shipment 73; then its weight or load proportioning sensor 119 measures the weight or senses the load of the shipment 73, then the box 118 is rotated conventionally inward and the window 75 is closed.
The motorized mechanism of the window 75 can be used too to pull the box or basket out via ropes, strings, or belts in any suitable conventional arrangement.
d- Ducted reception: To avoid opening and closing the windows 75 to the outer atmosphere, an aerial delivery transparent smoothly curved duct 121 is installed vertically along a building side, facing a fixed glass of a building facade, to hand parcels 73 to flats on different floors, such that an opening is made in each flat's glass open to the duct. The duct 121 receives parcels via a top room over the building and moves them down on an elevator-like principle: when a parcel over a horizontal plate moves down to face a glass opening, a vertical plate is located behind the parcel from the transparent duct side (it can be transparent glass too), such that via a retractable motorized string or rope pulled inwards over a pulley on the front and back edges of the horizontal plate carrying the parcel 73, the vertical plate is pulled forward, sliding on rails or through grooves from the bottom; the forward pull pushes the parcel forward inside a flat receiving box 118, while a motor (located under the plate and connected from both sides to the belt/rope/string) retracts the vertical plate after the task is finished. This method of plate delivery can be titled a linear (sliding) reception mechanism 117 too (Fig. 19-C). - An (IAT) device 21 in a traffic light 89 at crossed roads receives the data from the (IATs) inside the vehicles and from the humans' (TTCs) 115; through algorithms and matrices, each traffic light 89 can evaluate the timing and size of vehicles or humans who will be coming from one way or one side, and as a result it will manage a precisely organized flow and traffic in a neat and systematic manner. Actually, at the end of the third stage, an autonomous city does not need traffic lights 89, as (IATs) 21 and (TTCs) 115 will cooperate and organize themselves where to give the priority for traffic flow, and where to stop to let the walking people pass, via modified lighting or pointer systems.
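As one hedged illustration of the traffic-light evaluation described above, green time could be split among approaches in proportion to the counts reported by (IAT)/(TTC) devices. The cycle length, minimum green, and approach names are hypothetical parameters, not values from the disclosure:

```python
def green_split(counts, cycle=60, min_green=5):
    """Divide a fixed signal cycle among approaches in proportion to
    the vehicles/pedestrians reported by their IAT/TTC devices,
    guaranteeing a minimum green time per approach."""
    total = sum(counts.values())
    if total == 0:
        share = cycle / len(counts)
        return {a: share for a in counts}
    spare = cycle - min_green * len(counts)
    return {a: min_green + spare * n / total for a, n in counts.items()}

# Hypothetical counts reported telepathically by approaching devices.
split = green_split({"north": 30, "east": 10})
```

A busier approach simply receives a proportionally longer green, which is the "precisely organized flow" behaviour the paragraph attributes to the device-reported counts.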
- Accident scenario: (IAT) devices 21 are installed to prevent accidents to an extreme degree, and if any accident happens, to decrease its drawbacks to an extreme degree. Whereas human drivers understand the intentions of other drivers depending on lights (brake lights, turn signal lights) or acceleration and deceleration, or try to read or guess their intentions, or even assess the capabilities of the other cars based on their types, the (IAT) devices 21 are far more intelligent than watching or observing to predict or guess, as each one can know in advance the following: a- The other vehicle's selected speed mode: low, medium, or high, wherein those with a similar driving speed mode follow the same lanes. b- Via the server 61, assessment of the numbers and ratios of each group of vehicles with similar driving speed modes is done to carry out a unique distribution of the vehicles over the road lanes, and even to adjust the different speed modes autonomously to converge when the number of lanes is not enough to distribute the vehicles with different speed modes over them.
c- To share with a cloud server 61 over a network with the neighborhood vehicles all other parameters related to the programmed/selected trip plan, such that the cloud server 61 already knows the short-term, medium-term, and long-term intentions and distributes the routes in-between the vehicles, and knows in advance when a vehicle needs to move from lane to lane leading to its destination, to create a space for it in-between other existing cars in that lane. In the third stage the vehicles will be grouped as convoys, wherein the best practice for passing a convoy from one lane to the other is planned in advance: shifting the group from left to right to take a right turn at the next crossing is done by merging with the existing vehicles of the right-side lane according to the number of existing vehicles in that lane (distributing and merging according to ratios and priority), depending first on the availability of spaces; if enough spaces are available, the merging will be in bulk, maybe without slowing down or speeding up the existing vehicles in the target lane; if not, the merging will depend on ratios, by making one vehicle space in-between each two, or three, or five... cars of the extreme right lane to receive vehicles from the middle lane. The speeding up or down of vehicles existing in a lane, to make spaces to receive other cars, is done based on lane traffic levels, road condition, and next crossing vacancy availability and priority.
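The ratio-based merging rule above (one space in-between each two, three, five... cars) can be sketched as follows; the function name, vehicle labels, and the `gap_every` parameter are illustrative assumptions:

```python
def merge_plan(target_lane, merging, gap_every):
    """Plan where merging vehicles slot into the target lane:
    open one space after every `gap_every` existing vehicles
    (the 'one in-between each two, three, five...' ratio rule)."""
    result, pending = [], list(merging)
    for i, car in enumerate(target_lane, start=1):
        result.append(car)
        if pending and i % gap_every == 0:
            result.append(pending.pop(0))
    result.extend(pending)  # leftovers join at the tail of the lane
    return result

# Hypothetical convoy: R* already in the right lane, M* merging in.
plan = merge_plan(["R1", "R2", "R3", "R4"], ["M1", "M2"], gap_every=2)
```

A server could pick `gap_every` from the lane's traffic level, falling back to bulk merging (large gaps already available) when the target lane is nearly empty.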
The data for the next 24 hours of planned trips and planned handling of the autonomous vehicles 66 via a server 61 will be collected from the following resources: 1- It will be highly recommended that private autonomous vehicle 66 riders set in advance their editable driving plans for the next 24 hours. 2- To set in all of their vehicles their home location, job location, working hours, and timings. 3- The public autonomous vehicles 66 too should be provided with such data from their companies, schools, institutions... operators. 4- All road service authorities should feed their next 24 hours of planned trips into their vehicles, in addition to any specific locations for road maintenance. 5- The traffic departments receive data about the existing road physical condition from the autonomous vehicles 66 which used the road in the past hours, in addition to data provided from the meteorological department about the expected weather conditions: rain, dust, ice, fog, snow, floods... etc. 7- According to all the data in points 1-6, the official regional artificial intelligence learning systems at the service providers carry out an intelligent assessment of the expected traffic levels and timings, and make pre-set programs for how to distribute the autonomous vehicles over the city roads for the next 24 hours in a fully optimized manner, with a factor of safety and considering the other non-planned trips and road users, where instant updating of the program is carried out, while updating all autonomous vehicles 66 in addition to messaging their passengers about the timings at which they should be available inside their autonomous vehicles 66 to arrive at their target destination smoothly, on time.
e- Foggy weather, dark roads, and autonomous vehicles 66: when the roads are fully occupied with vehicles using (IAT) devices 21, the autonomous vehicles 66 will know their routes without stopping driving or even dramatically decreasing their speeds, as these vehicles will already know the location of each vehicle in relation to a group of neighborhood vehicles.
f- Accident scenario: based on the above, it is clear how difficult it is for autonomous vehicles 66 to get into an accident, especially knowing that they should already be provided with fail-safe programs to handle emergencies. But let us imagine an accident happened due to matter accidentally fallen from one of the vehicles (cartons, oils... etc.), or from the trees, or rock falls, or even something fallen from the sky; such issues are unavoidable. Once an autonomous, semi-autonomous 85, or non-autonomous 77 vehicle faces such issues, the situation should be assessed and then managed autonomously, depending mainly on the data gathered by the panoramic computer vision cameras 95 and (IAT) devices 21 of the vehicles observing the incident, not only from one but from all of them, according to the following: 1- Any sudden abnormal noise, vibration, bouncing... observed via the NVHS (Noise-Vibration-Harshness system) 123. 2- Any accompanying change in tire pressure. 3- Any observed uncontrolled change in speed, acceleration, deceleration, direction, steering, vehicle stability control, vehicle body turnover, ABS activation, sliding, wheel angle change, change in lane... 4- Any observed or sensed physical matter (not having telepathy devices) sensed ahead of the vehicle by the panoramic computer vision cameras 95. 5- Any messages received from an (IAT) device 21 in one of the autonomous vehicles 66 in front about a change in a truck box luggage weight (reflecting falling items). 6- Any messages received from an (IAT) device 21 in one of the autonomous vehicles 66 in front about a change in a fluid level (reflecting sudden fluid leaks). 7- Any serious or hazardous warning messages appearing on the instrument cluster of the front vehicles related to malfunctions affecting safe driving, such as: engine, ABS, tires, airbags, dynamic stability control, electronic suspension... 
All of these data should be handled via four methods: a- to slow down the speed of the autonomous vehicle 66, keeping the vehicle as much as possible inside its target and planned lane for the next seconds; b- to stop the vehicles if necessary; c- in parallel and connected to points a and b, to pass instant messages to the server 61 and all (IAT) devices 21 in the neighborhood, to be updated about the emergency, the recorded situation, and the current autonomous handling, so that all of the vehicles behind and around instantly change their lanes autonomously and intelligently away from the accident area while decreasing their speed depending on the situation, stopping directly if the state is related to a turnover, more than one vehicle is involved in the accident, or fluids are observed on the road; d- human interference should be carried out instantly when, in a control center, it is seen via the vehicles' panoramic computer vision cameras 95 that the instant assessment of the state requires instant handling; as a result, a controller may initiate a specific system level to be activated by one click to handle the traffic autonomously. Such systems can activate: 1- one of the following driving modes: a- drive carefully, b- slow down, c- fully stop driving, d- change the road; 2- sending a call message for all teams and autonomous aero-land vehicles 20, ambulance 109, police car, and recovery service, in addition to UAVs 22, a police drone for assisting and assessing the accident, a firefighter drone 70, and a flying robotic road cleaner to carry out any urgent tasks.
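A minimal sketch of mapping the observed indicators (points 1-7) onto the handling methods (a-d); the signal names and the two-tier severity rule are hypothetical simplifications of the scenario described, not the disclosed logic:

```python
def assess_incident(signals):
    """Map observed indicator flags to a handling decision:
    severe states force a stop plus broadcast; any other indicator
    triggers slowing down plus broadcast; otherwise continue."""
    severe = {"rollover", "multi_vehicle", "fluid_on_road"}
    if signals & severe:
        return "stop_and_broadcast"
    moderate = {"abnormal_nvh", "tire_pressure_change",
                "uncontrolled_motion", "object_ahead",
                "front_cargo_weight_change", "front_fluid_level_change",
                "front_malfunction_warning"}
    if signals & moderate:
        return "slow_down_and_broadcast"
    return "continue"

# Hypothetical example: a camera-sensed object ahead of the vehicle.
action = assess_incident({"object_ahead"})
```

In a real system each flag would come from the corresponding sensor or neighbouring (IAT) message, and the broadcast step would carry the recorded situation to the server 61 and nearby devices.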
NVH system: triple springs + triple shock absorbers.
In fully autonomous vehicles, as the driver will mainly no longer be busy with the road, he will be a passenger rather than a driver; the vehicle compartment therefore needs to be reshaped, redesigned, and redeveloped for other activities. But because the current vehicle vibrations and noises may be unsuitable for most drivers' or passengers' free activities inside an autonomous vehicle 66, smooth and comfortable vehicle motion should be developed. As a result, the suspension comfort style should be developed further, from at least the following aspects:
1- The current conventional spring-damper (spring-shock absorber) assembly is a single unit designed to handle different types of road textures: humps, potholes, rough surfaces, and cracked surfaces of all types, sizes and dimensions.
The current humps in urban areas are mainly made to slow down vehicle speeds, but for autonomous vehicles 66 this is not required, as the vehicles will be programmed, or will already know depending on (IAT) device 21 estimates, to slow down where needed, e.g. controlling the speed depending on the received specifications of an existing hump, or on the size of a crowded place...
Mainly, the majority of road textures in a modern city suffer from variations in height of between 0 and 25.0 mm. Such variations create unsmooth driving for those who will carry out new activities inside autonomous cars 66, because the cars' conventional spring-damper 124 units handle all road variations using one single unit type at each corner for the different types of road variations.
The part of the suspension system which needs to be modified is the spring-shock absorber (spring-damper) system, which should handle road textures in three stages depending on the texture variation: Stage-1: road texture variations of 0-5.0 mm are filtered, tuned and absorbed as a slight roughness in the texture. Stage-2: road texture variations of 5.0-25.0 mm are filtered, tuned and absorbed as a normal roughness in the texture. Stage-3: road texture variations over 25.0 mm, in addition to other humps, are absorbed as a high roughness in the texture, etc.
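The stage thresholds above can be expressed as a small selection rule; this sketch only assumes the 5.0 mm and 25.0 mm boundaries stated for Stage-1 through Stage-3, and the function name is an illustrative assumption:

```python
# Illustrative sketch only: maps a measured road-texture variation (mm) to the
# suspension stage that filters it, using the thresholds stated above
# (0-5.0 mm, 5.0-25.0 mm, over 25.0 mm).

def texture_stage(variation_mm: float) -> int:
    """Return which spring-damper stage (1, 2, or 3) absorbs this roughness."""
    if variation_mm <= 5.0:
        return 1   # slight roughness: small, soft stage-1 spring-damper 125
    if variation_mm <= 25.0:
        return 2   # normal roughness: medium stage-2 spring-damper 126
    return 3       # high roughness, humps, potholes: stage-3 spring-damper 127
```

In the physical unit the stages engage mechanically by stiffness rather than by software, but the mapping of roughness to stage is the same.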
So, to handle the three stages, three spring-damper units should be used. For demonstration they are shown in Figure 20 separately, while they can be reproduced in a compact single unit (multiple-stage spring damper unit 124) with a varying number of coils per unit of distance and varying stiffness along its length, while the damper is provided with three inner pistons instead of one, each with different damping; such a modification can provide unique handling of different road textures via one unit with different damping levels. So, for demonstration, the three spring-shock absorbers will operate as follows:
a- Stage-1, first-part spring-damper 125: a small-diameter, small-width spring with extra coils and less stiffness, coiled around a small-diameter shock absorber; it is the only one that starts to handle the 0.0-5.0 mm rough texture.
Figure (20-A) demonstrates how this part handles a small texture roughness in a road alone, while Figure (20-B) is closer to an actual demonstration, showing how this part of a compact unified spring-damper unit takes its role to filter/absorb a small texture roughness in a road. In Figure (20-B), the overlapping multiple-mode frequency represents a part of road with multiple modes of roughness handled at once via the multiple-mode spring-damper system.
b- Stage-2 spring-damper 126: a medium-diameter, medium-width spring with a medium number of coils and medium stiffness, coiled around a medium-diameter shock absorber; it starts its operation by handling the 5.0-25.0 mm rough texture. Figure (20-A) demonstrates how this part handles a medium texture roughness in a road alone, while Figure (20-B) is closer to an actual demonstration, showing how this part of a compact unified spring-damper unit takes its role to filter/absorb a medium texture roughness in a road.
c- Stage-3 spring-damper 127: a conventional normal-diameter, normal-width spring with the normal number of coils and stiffness, coiled around a normal-diameter conventional shock absorber; it starts its operation by handling the rough textures of 25.0 mm and above, humps, potholes...
Figure (20-A) demonstrates how this part handles a high texture roughness in a road alone, while Figure (20-B) is closer to an actual demonstration, showing how this part of a compact unified spring-damper unit takes its role to filter/absorb a large texture roughness in a road.
The connectivity of this multiple-stage unified spring-damper system can be done conventionally in-between the upper and lower control arms, or between the wheel knuckle and the cross member.
The most important point to note is that in the multiple-stage unified spring-damper system there is a symmetry between the top and lower parts. That is to say, the first stage is not simply at the top with the last stage at the bottom; rather, the first stage is the same on both the top and bottom sides, while the third stage is centered in the middle and the second stage is located in between. So, from the bottom toward the center and from the top toward the center, the stiffness and diameter of the unified spring-damper increase toward the middle. In other words, a rough texture of 1.0 mm and up comprises a triple type of slight, medium and humpy roughness, and these are handled by three parallel and consistent stages of tuning and filtration, which will create optimum driving comfort. Creative solutions can be invented in the same way to develop and modify other parts of the suspension system.
Semi-pneumatic tire 128 (refer to prior art): Conventional run-flat tires will not be a suitable choice for autonomous vehicles, because once they are punctured, their side-wall reinforcements will not support comfortable driving, nor high-speed or long-distance driving. A better solution is to use semi-pneumatic tires 128: these tires can provide better load and heat distribution while driving, which minimizes vibrations and prolongs tire life, and they will lose only a very little of their pressure and can still be driven safely and comfortably until the tire is changed or repaired.
Noise, vibration and smoke: As the drivers will be busy with other activities inside the autonomous vehicles 66, including sleeping, watching movies, working on machines, etc., they may not hear the noises resulting from a faulty part in the vehicle, especially suspension parts and other motors. Unlike electrical parts, which can be diagnosed by reading the fault memories stored in their related control units, diagnosing mechanical parts is mostly based on noise/vibration symptoms and visual checks. As a need, a net of acoustic sound collectors 129 (sound/noise level meters) distributed all around the vehicle is installed with filtering tools; these aim to pick up and filter the abnormal sounds or noises related to faulty parts, separating them from road noise, audio, and other optional machines used inside the vehicle compartment. These sounds and noises are compared with pre-programmed sounds and noises, such that an abnormal-sound microprocessor estimates, diagnoses, and assesses the seriousness of the fault, upon which the passengers are warned and the speed and lane are adjusted. The (IAT) device 21 should share the diagnosis results with the (IAT) devices 21 of nearby autonomous cars, to make them aware that a serious fault in such a car may happen. Meanwhile, the vehicle's (IAT) device 21 can contact an (IAT) device 21 in the closest autonomous recovery truck 108 to agree where to meet, while the same vehicle's (IAT) device 21 arranges with another rented autonomous vehicle 66, via that vehicle's (IAT) device 21, to arrive ahead of the autonomous recovery truck 108 to shift the passengers in-between. Not only this, but through the vehicle's (IAT) device 21, a communication is initiated with the auto-dealer robots 26 to book an appointment to receive this faulty autonomous vehicle 66 (Figure 21).
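A loose sketch of comparing picked-up noises against pre-programmed fault sounds might look like the following; the signature format (dominant frequency, level) and the tolerance value are illustrative assumptions, not the disclosed diagnosis method:

```python
# Hedged sketch of the noise-matching idea: sounds from the acoustic
# collectors 129 are compared with pre-programmed fault signatures.
# A signature here is a simple (dominant frequency in Hz, level in dB)
# pair, an assumption made for illustration; real systems compare spectra.

FAULT_SIGNATURES = {
    "worn_wheel_bearing": (120.0, 75.0),    # illustrative values
    "loose_suspension_link": (40.0, 68.0),
}

def diagnose(measured: tuple, tolerance: float = 10.0):
    """Return the name of the closest matching fault signature, or None."""
    freq, level = measured
    best, best_dist = None, tolerance
    for name, (f, l) in FAULT_SIGNATURES.items():
        dist = max(abs(freq - f), abs(level - l))  # crude distance measure
        if dist < best_dist:
            best, best_dist = name, dist
    return best
```

A match would then be graded for seriousness and shared with nearby (IAT) devices 21 as described above.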
It may be important to pair the noise level meters with vibration meters 130, as noises can sometimes be differentiated according to vibration levels; this will be very helpful for the autonomous diagnosing tools or robots in locating the faulty part within a short period. Such an arrangement will also be beneficial for an electronic management system monitoring a hydraulic suspension system built to handle multiple modes of road-surface roughness, depending on independent parts of the same suspension unit handling such texture differences.
In another example, faulty electronic devices or wirings which are burnt and creating smoke can be diagnosed via smoke detectors 131. Faulty mechanical and electronic part data are passed to the (IAT) device 21 to handle serious emergency cases in the same way; in the case of smoke or burnt devices, the telepathy device sends a message to a telepathy device in a firefighter drone station 132 to send one or more firefighter drones 70 to handle the case if things go bad or out of control.
The net of acoustic sound collectors 129 and the vibration meters can also be used for recording road noise, to know the road type, texture and conditions, to adjust the vehicle's speed, and to share these data with a data server 61 over a network, which shares them with other vehicles.
As a result, the autonomous vehicles 66 will be diagnosing road conditions from all aspects using a set of visual and sensing devices.
As steering wheels, collapsible steering columns, and steering racks and pinions are no longer required, the wheel turning can be controlled via two side hydraulic piston-cylinder units 133, so no tie rods are required. In such a case it becomes easier to install wheel-turn units on the rear wheels too, such that four-wheel-turning (steering) (4WT or 4WS) 134 electric commercial and passenger autonomous vehicles 66 are manufactured, which can be driven autonomously either forward or backward without noticing any difference; this is very helpful in accessing or exiting narrow locations.
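For illustration, a generic four-wheel-steering rule of the kind commonly used in the art (counter-phase at low speed for narrow locations, in-phase at higher speed for stability) can be sketched as below; the gains and the speed threshold are assumptions, not values from this disclosure:

```python
# A generic four-wheel-steering sketch (an assumption, not the patent's
# control law): at low speed the rear wheels steer opposite to the front
# wheels to tighten turns in narrow locations; at higher speed they steer
# in the same direction for smoother lane changes.

def rear_steer_angle(front_angle_deg: float, speed_kmh: float) -> float:
    """Rear-wheel angle commanded to the hydraulic piston-cylinder units 133."""
    if speed_kmh < 40.0:                 # threshold chosen for illustration
        return -0.5 * front_angle_deg    # counter-phase: tighter turning circle
    return 0.2 * front_angle_deg         # in-phase: stability at speed
```

Driving backward would simply swap which axle's piston-cylinder units act as the "front" pair, consistent with the symmetric forward/backward operation described above.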
Vehicle's indoor/outdoor environment (IOE): At the final stages, the city will be fully autonomous, and the transportation means will not only be 100.0% autonomously driven, but also: a- The normal driving styles, with sudden acceleration or braking, turns, changes of mind, and the different styles and modes of driving habits (crazy, sleepy, drunk... styles), will have no place on the road.
b- No more humps.
c- The NVH system with its modifications will create a fully comfortable and smooth driving environment.
d- A convoy or train-carriage driving style, which is neat and smooth like a military march, will be lived daily. And so, the passengers will not only enjoy sitting on seats without driving, or be free to read or watch the scenery, but the outdoor comfort will be reflected in a whole indoor comfort zone created inside the transport means, producing an inner static feeling rather than the dynamic driving, with some vibrations and tilting, of normal cars. This comfortable, static-like environment on road or in air will be encouraging enough to change the whole current inner equipment and upholstery inside current vehicles to be more attractive, selective and optional.
For example, a Volkswagen Sedric type or electric mini-bus is offered with inner facilities and an equipment style creating a stylish comfort, like: office style; restaurant or coffee-shop style; trip style; aerial and land city cruise style; entertainment movie style (cinema style); comfort with bedroom (foldable sofa); diabetic comfort with toilet; business-center style; sporty style with exercise facilities; tourist style; prayer style (church, mosque, synagogue... or any other temple-like selected option, or even yoga style); ambulance facilitated with robotic surgeons; delivery truck equipped with sorted autonomous shelving and delivery drones, or having a Jacuzzi; truck workshop style equipped with 3D printers or any other tools; kitchen fast-food style; mini-showroom style; artist style; dentist clinic; technician on-site service; or even equipped with task-force UAVs 22 or robots 26 for special services (Fig. 22). So, an athlete's car is unlike an engineer's car, a doctor's car is unlike an IT professional's car, and a businessman's car is unlike a student's car; each one is customized from the factory with built-in or optional facilities meeting the owner's desires, hobbies, profession, or specific situations and selections.
Similar to aircraft and sea ships, it is worth noticing here that seat belts may not be mandatory during normal driving, but alarms to remain seated, or to return to seats and fasten seat belts, will be issued when necessary.
The airbag system will be customized for each autonomous vehicle 66 depending on its area of use.
Autonomous vehicle's dialog with a driver: Currently, an artificial intelligence robotic psychologist updated with all psychiatry data has succeeded in curing patients better than professional psychologists, depending on dialogs with the patients; the robotic psychologist's data is huge compared to the knowledge of one, ten, or hundreds of psychologists. In the same way, an autonomous vehicle 66 dialog management electronic control unit (DM-ECU) 135 will be updated with all human dialogs in each specific language, in addition to huge yearly edited texts referring to the specific practical dialogs which should be communicated in-between the autonomous vehicle 66 and the owner. The following is an example of a dialog in-between an autonomous vehicle 66 and its owner, not only reflecting the dialog management capabilities of an autonomous vehicle 66, but also the technical features and applications of the new autonomous vehicles 66 provided with (IAT) devices 21 supporting: daily management of the owner's requests; the vehicle's self-management of its own needs; its communications with other machines, robots 26, servers 61, UAVs 22, interchangeable aero-land configurations, etc.; expressing a feeling; having a sense of humor; remembering and reminding; updating and recommending; handling and receiving; picking up and dropping; waiting and suggesting; fixing appointments; carrying out indoor (inside vehicle) automated tasks for a busy owner; and intelligent self-task management (ISTM) 136, wherein each part is run via a microprocessor inside the (DM-ECU) 135. The following is an example of a dialog in-between an autonomous vehicle 66 and an owner with an engineering profession:
Owner: Hello My-Chevy-1 !
My-Chevy-1 : Hello Sir!
Owner: Where are you? What are you doing now? My-Chevy-1: Sir: Could I share with you my location and today's task report via your smart screen 137?
Owner: No, my smart screen is away; tell me verbally, please.
My-Chevy-1: Sir: After dropping you at your office, I went to the washing station. I was a little bit delayed; the waste washing liquid tester 138 showed extra mud sticking to my bottom side, since you insisted yesterday on driving off-road through a muddy road.
Owner: Oh, I am sorry, but maybe it was the fate of that cat stuck in the mud; luckily we rescued it!
My-Chevy-1: Oh, poor cat! Anyway, as agreed, if I received a message from the washing machine service workshop between 9:00 and 12:00, I should go there to bring the washing machine. I received the message and went there; unfortunately the delivery robot 26 informed me to wait a little bit more, and offered to let me receive an electric charge while waiting, which I did.
Owner: Alright, but why is the delivery delayed? My-Chevy-1: As per the robot, a human accidentally poured his coffee on our machine; they required extra time to clean its body.
Owner: Then? My-Chevy-1: Do not worry, the robot offered me a 15.0% extra discount as goodwill, then I paid the bill; I saved it on my desktop under home appliance bills. By the way, I recognized near me an autonomous vehicle 66 called Cadi236 from our same parking, waiting there to receive a quick service for her boss's robotic floor cleaner. I introduced myself and she did the same; she told me she is used to carrying her boss and kids to a nice park in the city, and she recognized they are always happy to do that. As she is an aero-land autonomous vehicle, she shared with me top-view photos of the park from when she was flying over there last month. I will share them with you, in addition to an address for an autonomous kitchen vehicle 139 there providing service via robots 26 (vehicle to vehicle); it may be better than going to your muddy roads!
Owner: Ha Ha Ha, are you kidding? By the way, it looks like, unlike our expectations twenty years ago, when we expected robots would serve humans, they are serving autonomous vehicles 66 instead. My-Chevy-1: Sorry boss! I am just kidding; yes, that is true, and we autonomous vehicles 66 have the honor to serve you instead. My neighbor autonomous vehicle 66 also mentioned to me the general job of her boss, but without further details due to security reasons. I told her you have the same job, and I shared your company brochures with her; she promised me to show them on her movie screen 106 while taking her boss back home.
Owner: Very great, what about my shoes replacement? Did you pick them up?
My-Chevy-1: No sir! They said, as goodwill, they will either send it to your home via aerial delivery today evening at 5:30, or else their delivery UAV 22 can hand it to me in the parking. So, now I am going to the restaurant to pick up the order from their window 75 delivery manipulators 140, then to the school to pick up the kids..., then back to you.
Owner: Ooh, today I am leaving 15 minutes earlier; please come early. My-Chevy-1: Alright, I will adjust my speed mode to high speed. Furthermore, please note that I will not be able to pick you up from your company's main gate, as I am updated about a minor road repair there until 18:00 today; I will pick you up from the back gate.
Owner: Thank you for reminding me; anything else to remind me of? My-Chevy-1: Ah, my 3D printer will soon finish the models which you shared with me; I will share their photos with you. Furthermore, tonight you have a guest on my deck; you asked me to remind you to bring some coffee capsules from your office for my coffee machine!
Also, you may join your friend to the park if you like the photos! Owner: Oh! Thank you My-Chevy-1 ! Anything else?
My-Chevy-1: Today I will receive and download a program from the neighborhood server 61 related to knowing when to leave a place under fire for a specified recommended safe place.
Owner: Great. I like you, good-bye! My-Chevy-1 : Me too, but without mud! Just kidding! See you...
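Very loosely, the routing of owner utterances to task handlers inside the (DM-ECU) 135 suggested by the dialog above could be sketched as follows; the keyword-matching scheme, intent names and phrases are illustrative assumptions only, not the disclosed dialog management method:

```python
# Hypothetical sketch of utterance-to-intent routing for the DM-ECU 135.
# Intent names and keyword lists are illustrative assumptions.

INTENT_KEYWORDS = {
    "report_status": ["where are you", "what are you doing"],
    "schedule_pickup": ["come early", "pick me up", "pick up"],
    "reminder": ["remind me"],
}

def route(utterance: str) -> str:
    """Return the intent name for an owner's utterance, or 'small_talk'."""
    text = utterance.lower()
    for intent, phrases in INTENT_KEYWORDS.items():
        if any(p in text for p in phrases):
            return intent
    return "small_talk"           # default: conversational reply, humor, etc.
```

Each matched intent would then be handed to the corresponding ISTM 136 microprocessor for execution (route planning, reminders, communication with other (IAT) devices 21, and so on).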
From a sensor / control unit / actuator point of view, Figure 23 is a flow chart summarizing the collaborations of all devices, control units and autonomous machines.
It is obvious and understood, based on such a dialog, how other autonomous vehicles 66 too will have their own similar automated technical applications meeting their owners' professions and daily demands, especially outdoors, more than is currently expected from robots.
Many manufacturers are currently developing UAVs provided attached to vehicles. It is worth adding a multiple-task UAV 141 with a compact shape, like a closed book cover, composed of two folded parts: the outer side is provided with a foldable screen 142, while the inner side is provided with foldable-arm rotor blades 143, a mike 144, a hook for carrying small and mini parcels or hanging orders from nearby restaurants, a camera, etc. So it is a book-shaped UAV with an outer smart screen and inner accessories (Fig. 24).
Industrial applicability:
1- Autonomous Interchangeable Aero-Land vehicles are based on vehicles currently available in the market, with modifications and the installation of ducted propellers, hydraulic and motorized mechanisms for moving the side propellers, gates, control systems, etc., made from available tools, parts and mechanisms with applicable modifications.
2- The autonomous vehicles', UAVs' and AALVs' intelligent artificial telepathic (IAT) device is made by collecting positioning devices and measurement meters of motion and direction into a compact united device, with uploading of the geometrical topography shape and dimension data which are available for each machine before its manufacturing.
3- City urban-planning structures', constructions' and facilities' intelligent artificial telepathic (IAT) devices are made by collecting positioning devices and measurement meters of motion and direction into a compact united device, with the addition of the geometrical topography shape and dimension data which are available for each urban part before its construction or modification.
4- Computer vision panoramic cameras are made by increasing the vision lenses of each of the currently available computer vision cameras, and simply using these from different positions for the collective construction of a realistic 3D shape with actual dimensions and a position for each part of the vision.
5- Communicating data to a local (neighborhood) server over a network for machine automation and collaboration uses techniques and methods currently available in the art, which are to be expanded to handle collaborative data in-between intelligent artificial telepathic (IAT) devices.
6- Using a digitalized shape, geometry and texture to identify a machine or facility, with its static or dynamic shape data occupying a space of identified boundaries, is just data replacing expensive high-tech devices such as Lidars, radars, laser vision devices, distance sensors...
7- Using intelligent artificial telepathic (IAT) devices not only makes roads and streets perfectly clear for autonomous vehicles, and low-space flight routes visualized as actual, but also gives every machine facing another machine, for collaborative handling or engagement, a clear vision of each small or tiny part of the other to engage with.
8- Transportation vehicle noise, vibration, harshness, or smoke can be monitored using conventional technologies available in the art, by modifying suspension parts and installing sound/noise level collectors, smoke detectors, vibration meters...
9- Modifications of the inner upholstery, the whole inner furniture, the airbags, and the equipment of the vehicle, replacing them with a suitable environment to work, relax and enjoy instead of driving, are easily done, especially on high-roof vehicles, which may be introduced widely to the art.
10- As the machines are provided with a clear vision of how to handle, transport, navigate, search, observe, think, decide, care, watch, plan, communicate, obey... carry out assigned tasks and initiate dialogs with owners... these will save the owners' time to concentrate on their jobs rather than, for example, joining a vehicle and getting busy with driving.
11- Dramatic city changes, starting from removing traffic lights, posts, signboards and reflectors, controlling the night-light timings of road columns, and organizing clean routes, up to organizing commerce, goods and parcel handling, retail sales, aerial delivery stations and window rooms... up to an autonomous city airport acting as a city-center hub of daily commerce.
12- Expanded modifications of (IATs) and their conceptual applications, starting physically by providing (Sub-IATs) for fewer tasks, to Nano-IAT chips, up to human use for initiating human-human or human-self telepathy.
13- Minor accidents, no traffic violations or traffic fines, more HSE, and observable trials of thefts or crimes with early warning systems.
14- Organizing the traffic and distributing the transportation means in-between the roads and the lower-aerospace multiple-layer routes, online sales, telepathy, work... etc., mean much less expense on new conventional transportation infrastructure, with the saved budget used for establishing a substitute modern autonomous aero-land transportation.
Parts Drawing Index:
20 Autonomous aero-land vehicles (AAVL). 50- Distance sensor.
21 Intelligent artificial telepathy (IAT) device. 51- Camera.
22 Unmanned aerial vehicles (UAVS). 52- GPS unit.
23 Land structures. 53- RFID unit.
24 Building. 54- Remote sensing unit.
25 Machines. 55- Altitude meter.
26 Flying or normal robot. 56- Speedometer.
27 Human. 57- Accelerometer.
28 Natural or industrial obstacles. 58- Tilt meter.
29 Smooth vehicle. 59- Gyroscope.
30 Vehicle roof. 60- Compact (PMGW) device
31 Smooth aerodynamic body. 61- Server.
32 Side ducted propellers. 62- Clamps (hooks).
33 Jet propulsion engines. 63- Handles.
34 Ducting. 64- Propellers.
35 Exhaust nozzles. 65- Window.
36 Engine or Electric motor. 66- Autonomous vehicle.
37 Stowable propellers. 67- (Sub-IAT) device.
38 Side propellers. 68- Aerial facility room.
39 Gate. 69- Gate.
40 Tail. 70- Fire Extinguisher drone.
41 Small propellers. 71- Aerobatic cleaners.
42 Volkswagen Sedric. 72- Local Pick-up drone.
43 Side propellers. 73- Parcel.
44 Container, tank, capsule. 74- Aero-carrier.
45 Aerial convoy. 75- Aerial delivery window.
46 Modified aerodynamic flying body. 76- Artificial telepathy chips.
47 Navigation device. 77- Self-driving vehicle.
48 Radar. 78- (PMGW) Microprocessor
49 Lidar. 79- Integrated circuits. 80- Micro measurement meters.
81- Micro-device.
82- Instrument cluster.
83- Navigation system.
84- Assistant (IAT) display.
85- Semi-Autonomous vehicle.
86- Speaker.
87- Voice recognition system.
88- Telepathic intelligent learning system (TILS).
89- Traffic lights.
90- Turn signal lights.
91- Non-autonomous governmental vehicles.
92- Parking gate.
93- 3D Live head-up display.
94- Street radars.
95- Computer Vision Panorama cameras.
96- Glass chamber.
97- Wipers.
98- Collaborative reconstructive computer vision (CRCV).
99- Holographic tinted dome.
100- Intelligent data educating system (IDES).
101 - Belts.
102- Shelves.
103- Heat sensors.
104- Solar cells.
105- Container Aero-carriers.
106- Movie (TV) screen.
107- Manipulator.
108- Autonomous recovery service.
109- Autonomous ambulance.
110- Ambulance capsule.
111- Autonomous trucks.
112- City central artificial telepathy station (CATS).
113- Autonomous trolley.
114- Autonomous shelves.
115- Tiny telepathic chip (TTC).
116- Telepathically inspecting chips (TIC).
117- Linear reception mechanism.
118- Receiving box.
119- Load proportioning sensor.
120- Swinging reception mechanism.
121 - Transparent rounded duct.
122- Rotary reception mechanism.
123- Noise-Vibration-Harshness system (NVHS).
124- Multiple-stage spring damper unit.
125- Stage-1 spring-shock absorber.
126- Stage-2 spring-shock absorber.
127- Stage-3 spring-shock absorber.
128- Semi-pneumatic tire.
129- Net of acoustic sound collectors.
130- Vibration meter.
131- Smoke detectors.
132- Firefighter drone station.
133- Hydraulic piston-cylinder units.
134- Hydraulic four wheel drive steering (4WDS).
135- Dialog management electronic control unit (DM-ECU).
136- Intelligent self-task management (ISTM).
137- Smart screen.
138- Waste washing liquid tester.
139- Autonomous kitchen vehicle.
140- Window delivery manipulators.
141- Multiple task UAV.
142- Foldable flexible smart screen.
143- Rotor blades.
144- Hook.
Patent Application Cited documents:
Patent Application Publication No. / Publication date / Inventors:
WO2018122821 A2, 05.Jul, 2018, ALSHDAIFAT et al
US 2018/0183873, 28.Jun, 2018, Wang et al
US 9,947,145 B2, 17.Apr, 2018, Wand et al
WO2017178899 A2, 19.Oct, 2017, ALSHDAIFAT et al
WO2017178898 A2, 19.Oct, 2017, ALSHDAIFAT et al
WO2017079229, 11.May, 2017, Levinson et al
US 2017/0123422 A1, 04.May, 2017, Kentley et al
US 2017/0088143 A1, 30.Mar, 2017, Shenhar et al
US 9,573,684 B2, 21.Feb, 2017, Kimchia et al
EP 3 253 084 A1, 16.Dec, 2017, Quan et al
GB 2548977 A, 04.Oct, 2017, Filev et al
WO2016203322 A2, 22.Dec, 2016, ALSHDAIFAT et al
US 9,523,984 B1, 20.Dec, 2016, Herbach et al
US 2016/0311522, 27.Oct, 2016, Wiegand et al
WO2016130719 A2, 18.Aug, 2016, Amnon et al
CN105700553 A, 22.Jun, 2016, Zhiqiang et al
US 9,373,149 B2, 21.Jun, 2016, Abhyanker et al
US 2016/0139594 A1, 19.May, 2016, Okumura et al
US 2016/0023754, 28.Jan, 2016, Wiegand et al
US 2016/0018822 A1, 21.Jan, 2016, Nevdahs et al
US 9,229,453 B1, 05.Jan, 2016, Jin-Woo Lee et al
US 2015/0336502 A1, 26.Nov, 2015, Hillis et al
CN105015545 A, 04.Nov, 2015, Yunfei et al
US 2015/0248131 A1, 03.Sep, 2015, Fairfield et al
WO2015068501, 14.May, 2015, Takuji et al
WO2014080390 A2, 30.May, 2014, ALSHDAIFAT et al
WO2014080389 A2, 30.May, 2014, ALSHDAIFAT et al
WO2014080388 A2, 30.May, 2014, ALSHDAIFAT et al
WO2014080386 A2, 30.May, 2014, ALSHDAIFAT et al
WO2014080385 A2, 30.May, 2014, ALSHDAIFAT et al
US 2014/0136414, 15.May, 2014, Abhyanker et al
CN103402179, 20.Nov, 2013, Junliang et al
WO2013076711 A2, 30.May, 2013, ALSHDAIFAT et al
CN102806912, 05.Dec, 2012, Priyantha et al
US 2012/0083960, 05.Apr, 2012, Zhu et al
WO2011104579, 01.Sep, 2011, ALSHDAIFAT et al
US 2011/0106339 A1, 05.May, 2011, Phillips et al
US 2010/0292871 A1, 18.Nov, 2010, Schultz et al
US 2010/0256835, 07.Oct, 2010, Mudalige et al
US 2010/0106356 A1, 29.Apr, 2010, Trepagnier et al
US 2003/0005030 A1, 02.Jan, 2003, Sutton et al
US 6,151,539, 21.Nov, 2000, Bergholz et al

Claims

1- A city autonomous transportation means with artificial telepathy comprising:
an autonomous aero-land vehicle (AAVL) (20);
an intelligent artificial telepathy (IAT) device (21);
unmanned aerial vehicles (UAVs) (22);
a flying or land robot (26);
a smooth shape vehicle (29);
a smooth aerodynamic body (31);
side ducted propellers (32);
jet propulsion engines (33);
stowable propellers (37);
a modified aerodynamic flying body (46);
a compact (PMGW) device (60);
a server (61);
an autonomous vehicle (66);
a (Sub-IAT) device (67);
an aerial facility room (68);
a self-driving vehicle (77);
a fire extinguisher drone (70);
an aerial delivery window (75);
artificial telepathy chips (76);
a (PMGW) microprocessor (78);
micro measurement meters (80);
a non-autonomous vehicle (85);
an assistant (IAT) display (84);
a voice recognition system (87);
a telepathic intelligent learning system (TILS) (88);
a 3D live head-up display (93);
computer vision panorama cameras (95);
a collaborative reconstructive computer vision (CRCV) (98);
a holographic tinted dome (99);
an intelligent data educating system (IDES) (100);
container aero-carriers (105);
a manipulator (107);
an autonomous ambulance (109);
an ambulance capsule (110);
a city central artificial telepathy station (CATS) (112);
an autonomous trolley (113);
autonomous shelves (114);
a tiny telepathic chip (TTC) (115);
a transparent rounded duct (121);
a noise-vibration-harshness system (NVHS) (123);
a three-stage spring damper (124);
a net of acoustic sound collectors (129);
a dialog management electronic control unit (DM-ECU) (135);
an intelligent self-task management (ISTM) (136);
a multiple-task UAV (141).
2- The autonomous transportation means according to claim 1, wherein the autonomous aero-land vehicle (AAVL) (20) in a first embodiment is composed of a normal smooth vehicle (29) modified to be a vertical take-off and landing vehicle (VTOL) by building and installing over its roof (30) a smooth aerodynamic body (31) containing two side ducted propellers (32) or small jet propulsion engines (33), with their ducting (34) extending from the front to the rear exhaust nozzles (35), while at the middle an engine or electric motor (36) is installed to drive a stowable propeller (37) located over the smooth body (31); meanwhile, side ducted or un-ducted propellers (38) are built in vertically or horizontally inside the vehicle (29), gates (39) open up, and the propellers (37) are pushed out vertically or horizontally when put in use, and rotated conventionally to take a horizontal or vertical configuration while fully extended out of the vehicle body (29). The tail (40) is made of two parts; each part contains in its top side a small propeller (41) to counter the reverse rotation of the main top propeller (37).
3- The autonomous transportation means according to claim 1, wherein the autonomous aero-land vehicle (AAVL) (20) in a second embodiment is composed of a modified shape of a Volkswagen Sedric (42) or any similar electric mini-bus, wherein four horizontal side propellers (43) are built in inside the bottom part of the vehicle (42) and conventionally pushed out sideward in four directions when put in use.
4- The autonomous transportation means according to claim 1 , wherein the autonomous aero-land vehicle (AAVL) (20) smooth aerodynamic body (31) flying mechanisms are modified by adding two foldable side wings and two vertical side ducted propellers, each ending with one or two ducted branches to be a container aero-carrier (105).
5- The autonomous transportation means according to claim 1, wherein the AAVL (20) in its second embodiment can be attached, engaged or connected conventionally so as to fly in aerial convoy (45) configurations for safety reasons.
6- The autonomous transportation means with artificial telepathy according to claim 1, wherein the compact (PMGW) microprocessor (78) receives data from micro measurement meters (80) measuring position, motion, geometry, and weather parameters, such that a global positioning system (GPS) unit (52), an RFID (radio frequency identification) unit (53), a remote sensing unit (54), and altitude meters (55) provide the position parameter, while speedometers (56), accelerometers (57), tilt meters (58) and gyroscopes (59) provide the motion parameters. The geometrical shape, topography, textures, and dimensions of a body to scale, provided by a manufacturer, provide the geometry parameter. Weather conditions (wind speed, air temperature, fog, rain, humidity, dust... conditions) that may affect the motion parameter are measured with conventional devices to provide the weather parameter.
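The grouping of meter readings into the four claimed parameters can be sketched as follows; this is an illustrative model only, and the field names (`gps`, `speed`, etc.) and sample values are assumptions, not part of the claim:

```python
from dataclasses import dataclass

@dataclass
class PMGWRecord:
    """One Position-Motion-Geometry-Weather sample, grouped as in claim 6."""
    position: dict  # GPS, RFID, remote sensing, altitude meters
    motion: dict    # speedometer, accelerometer, tilt meters, gyroscopes
    geometry: dict  # manufacturer-supplied shape, topography, texture, dimensions
    weather: dict   # wind, temperature, humidity, fog, rain, dust

def build_pmgw_record(sensors: dict) -> PMGWRecord:
    """Group raw meter readings into the four claimed parameter sets."""
    return PMGWRecord(
        position={k: sensors[k] for k in ("gps", "rfid", "remote_sensing", "altitude")},
        motion={k: sensors[k] for k in ("speed", "acceleration", "tilt", "gyro")},
        geometry=sensors["geometry"],
        weather={k: sensors[k] for k in ("wind", "temperature", "humidity")},
    )

sample = build_pmgw_record({
    "gps": (25.2048, 55.2708, 12.0), "rfid": "AAVL-0042", "remote_sensing": {}, "altitude": 12.0,
    "speed": 14.2, "acceleration": 0.3, "tilt": 1.1, "gyro": (0.0, 0.1, 0.0),
    "geometry": {"length_m": 4.6, "width_m": 1.9, "height_m": 1.5},
    "wind": 3.4, "temperature": 29.0, "humidity": 0.41,
})
```

Such a record is what the (PMGW) device (60) could then emit to a local server or neighboring body.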
7- The autonomous transportation means with artificial telepathy according to claim 1, wherein the compact (PMGW) microprocessor (78) mainly provides (GTPS) data (Geometry, Topography, Positioning) locating the specific boundaries of a body occupying a specific shape, gridded 3D geometry, topography, and texture in space, and shares such data with a local server (61) or another neighboring body.
8- The autonomous transportation means with artificial telepathy according to claim 1, wherein the compact (PMGW) device (60), including a (PMGW) microprocessor (78), is installed inside a transportation means or any static or dynamic body such that its focal point is synchronized with the focal point of the mother body, in addition to matching its position, direction, tilting... with that of the mother body, such that after installing the (PMGW) device (60) into a specific preselected and calculated safe position inside the physical body, the digital geometrical shape, including its inner constituents, dimensions, texture, topography and even coloring, provides (PMGW) device (60) data that is 100.0% matching and synchronized with the real body's (PMGW) data; that is to say, the (PMGW) device (60) provides to other similar devices its mother body's digital (PMGW) data, perfectly simulating its mother body's real (PMGW) data.
9- The autonomous transportation means with artificial telepathy according to claim 1, wherein the compact (PMGW) device (60) inside a body shares and sends its data via a data emitter to a remote data receiver; the data receiver, without using radars scanning the space, receives (and sends in the same way) the global position, altitude, speed, acceleration and direction..., the actual shape with the actual dimensions, topography, texture, coloring, identification data and location of the whole body and of each part in it, big or tiny.
10- The autonomous transportation means with artificial telepathy according to claim 1, wherein the compact (PMGW) device (60) shared data are used for collaboration in-between bodies to know, approach, handle, communicate, cooperate, engage..., in addition to further technical additions and applications, such that it is titled an intelligent artificial telepathy (IAT) device (21).
11- The autonomous transportation means with artificial telepathy according to claim 1, wherein the autonomous aero-land vehicles (AAVL) (20) are provided with intelligent artificial telepathy (IAT) devices (21) to cooperate with unmanned aerial vehicles (UAVs) (22) and similar devices of both in the space, and on or in land structures (23), buildings (24), machines (25), robots (26), and humans, so as to make the recognition of identity, tracking of the route, handling of the load, and engagement to carry out a task easily optimized.
12- The autonomous transportation means with artificial telepathy according to claim 1, wherein the autonomous vehicles (66), self-driving vehicles (77), and semi-autonomous vehicles (85) are to be provided with (IAT) devices (21), including in addition to (PMGW) data: vehicle registration details; body accessories' (PMGW) data; planned driving short-term, medium-term and long-term parameters according to the preset navigation map; fuel tank level and expected fueling timing and location; noise-vibration-harshness (NVH) status of the vehicle suspension, body and other parts; dynamic stability and active stability control data related to the driving style and road dynamic conditions; road static conditions (road texture and surface conditions such as: wet, icy, muddy, sandy, rough, cracked..., passable, passable with care, risk of slipping); customized driving style: slow, medium, or fast (comfort, sporty, or emergency); dynamic capabilities: engine power, braking performance, sporty level, maneuvering performance...; repeated history manners data and analysis of their probabilities, such as attending a work location, visiting restaurants, visiting a cinema, or returning home, relative to the road map and timings; current location of the vehicle in relation to traffic lights and the traffic lights' status; loads and weights: empty or full; presence of kids, old persons, or animals inside the vehicle; category or status of the vehicle: ambulance, firefighter, police car, school bus, municipality maintenance, road construction, caravan, bank, fuel, presidential parade, wedding procession, official procession, parade of consolation, car race, public (passengers/luggage) or private, salon, hatchback, sporty, 4WD, pickup, truck...; if it is pulling another body behind it, either a car, boat, bike... etc., then the status, type, shape, geometry, spatial coordinates and dimensions should be provided; driving performance and maintenance status, e.g. low tire pressure, faulty devices, faulty sensors; and any
shared views of the surrounding area (traffic status, road condition, buildings, posts, opened or closed shops...) to be demonstrated via a display for the passengers inside another vehicle upon their request about a selected location where the current vehicle is available; any emergencies currently happening in the close and nearby surroundings, such as road construction repair, maintenance, cleaning, an accident and its assessment, a lane or lanes closed, trees and boxes fallen on the road, or any police drone messages received about the status of the road ahead (refer to prior art). For an autonomous aero-land vehicle (20) and UAVs (22), additional data may include: the imaginary aerial route number from point of departure to point of arrival; the GPS location 3D map; the GPS location 3D maps for the imaginary aerial U-turns, aerial bridges, or air humps; the route climate conditions: wind speed, rain sensor, snow, air temperature, humidity, thunderstorms, dust or fog levels, pollution levels; and accidentally falling idle aerial machines.
13- The autonomous transportation means with artificial telepathy according to claim 1, wherein (IAT) devices (21) are provided for city urban areas and (Sub-IAT) devices (67) for minor static facilities, in addition to these areas: all types of infrastructure, buildings, towers, warehouses, stores, factories, underground levels, walls, ceilings, service rooms, tanks, windows, protrusions, posts, road signs, columns, windows' and doors' swapping volumes, cranes, building materials, autonomous aerial delivery elevators, flats, offices, villas..., autonomous aerial delivery windows, roads, streets and bridges, UAV (22) docking stations, platforms, posts, columns, cables, piping, trees, greenery..., road humps, crowded places, pinholes, crowd crossings..., stadiums, gardens, parks and car parkings, river, port and airport machines, structures, constructions, jet skis, boats, ships, aircraft...; aerial facility rooms (68) with full (IAT) device (21) data, including GPS location data and spatial coordinates of these smooth-shaped rooms; firefighter extinguisher drone sets (70); aerobatic façade cleaners (71); local pick-up drone sets (72); traffic lights, ports, airports, bus/taxi stations; municipality and public works authorities updating the above public (IAT) (21) and (Sub-IAT) device (67) data via a server cloud over a network when major, minor, tiny or micro changes are done on the autonomous city's macro or micro parts; private entities and residents updating all of the above private (IAT) (21) or (Sub-IAT) device (67) data via a server cloud over a network through authorities' revision when major, minor, tiny or micro changes are done on their physical belongings or aerial delivery/reception macro or micro parts; and Artificial Telepathy Chip (ATC) (76) data.
14- The autonomous transportation means with artificial telepathy according to claim 13, wherein the (IAT) devices (21) provided for city urban areas and the (Sub-IAT) devices (67) pass and share their data to a city server (61) coupled with a city central artificial telepathy station (CATS) (112), which collects instant data from all of them and communicates data back to each of them, updating it with (PMGW) data about each part with which it needs to cooperate, or near which it has a route, in addition to expanded artificial intelligence data manipulations to facilitate their movement, transportation, and collaboration.
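The collect-and-relay role of the (CATS) (112) described above can be sketched as a minimal in-memory hub; the class, method and device names here are illustrative assumptions, not part of the claim:

```python
class CityTelepathyStation:
    """Minimal sketch of the CATS hub in claim 14: devices push (PMGW) updates,
    and the station relays each update to devices registered as its neighbours."""

    def __init__(self):
        self.devices = {}     # device_id -> latest PMGW payload
        self.neighbours = {}  # device_id -> set of device ids it listens to
        self.inbox = {}       # device_id -> list of relayed (sender, payload) updates

    def register(self, device_id, neighbour_ids=()):
        """An IAT or Sub-IAT device joins the station with its neighbourhood."""
        self.devices[device_id] = None
        self.neighbours[device_id] = set(neighbour_ids)
        self.inbox.setdefault(device_id, [])

    def push(self, device_id, pmgw):
        """Store the latest data and relay it to every interested neighbour."""
        self.devices[device_id] = pmgw
        for other, near in self.neighbours.items():
            if device_id in near:
                self.inbox[other].append((device_id, pmgw))

# Hypothetical usage: a vehicle listening to a nearby traffic light.
station = CityTelepathyStation()
station.register("traffic_light_89")
station.register("vehicle_66", neighbour_ids=("traffic_light_89",))
station.push("traffic_light_89", {"signal": "green", "gps": (25.2, 55.27)})
```

A production station would of course replace the in-memory dictionaries with the server (61) and a network transport.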
15- The autonomous transportation means with artificial telepathy according to claim 1, wherein the (IAT) devices (21) installed in self-driving vehicles (77) communicate and monitor data related to road lanes or parking yard space conditions and other traffic issues via the instrument cluster (82) and navigation system (83), or an assistant (IAT) display, issuing instructions on the display screen (84) and announcing them through speakers (86) to the driver, who can communicate with it through a voice recognition system (87), e.g. to join or cross a convoy, or to pass his message to an autonomous vehicle (66) over a server (61).
16- The autonomous transportation means with artificial telepathy according to claim 1, wherein the (IAT) devices (21) installed in self-driving vehicles (77) include telepathic intelligent learning systems (TILS) (88), also included in the server (61), which use artificial intelligence to analyze traffic issues and create solutions, or learn from the past how to manage bridging and collaborations in-between vehicles, so as to learn, analyze, educate, guide and instruct either drivers or autonomous vehicles (66) in the same way.
17- The autonomous transportation means with artificial telepathy according to claim 1, wherein the (IAT) devices (21) installed in traffic lights (89) track the nearby vehicles heading toward the traffic light (89) and assist in knowing in advance how many vehicles are approaching from each side, to control the lighting such that the highest number of vehicles passes in the least time in a fully optimized manner.
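The lighting-control rule of claim 17 (green time favouring the sides with the most approaching vehicles) can be sketched as a proportional allocation; the minimum-green and cycle-length figures below are assumed for illustration only:

```python
def allocate_green_times(approach_counts, cycle_s=60, min_green_s=5):
    """Split one signal cycle among the approaches in proportion to the number
    of vehicles the IAT devices report heading toward the light, while
    guaranteeing a minimum green time per side."""
    total = sum(approach_counts.values())
    if total == 0:
        # No tracked traffic: split the cycle evenly.
        even = cycle_s / len(approach_counts)
        return {side: even for side in approach_counts}
    spare = cycle_s - min_green_s * len(approach_counts)
    return {side: min_green_s + spare * n / total
            for side, n in approach_counts.items()}

# 12 of the 20 tracked vehicles approach from the north, so it gets the largest share.
greens = allocate_green_times({"north": 12, "south": 4, "east": 2, "west": 2})
```

Real signal controllers would add saturation-flow and pedestrian constraints; this only illustrates the count-proportional idea in the claim.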
18- The autonomous transportation means with artificial telepathy according to claim 1, wherein the (IAT) devices (21) manage collaboration in-between multiple sets of firefighting drones (70), firefighting drone patrols (70), vessel aero-carriers (105), and firefighting trucks distributed all around a forest, such that after a heat sensor (103) in a firefighter drone patrol (70) senses a fire, a nearby firefighting drone set (70) flies immediately toward the fire, followed by many sets depending on the fire; firefighting vessel aero-carriers (105) may follow, while replacing empty vessels and fire extinguishers is done via nearby warehouses distributed evenly around.
19- The autonomous transportation means with artificial telepathy according to claim 1, wherein the collaborative reconstructive computer vision (CRCV) (98) uses a set of computer vision panorama cameras (95) installed on top of autonomous (66), semi-autonomous (85) and self-driving (77) vehicles to collaborate in constructing a 3D image of each static or dynamic object in the captured views, while computer vision panorama cameras (95) are also installed on traffic lights and other crowded parts of a city for the same sake.
20- A method for locating items or parcels (73) which should be picked up by UAV (22) or robot (26) manipulators (107) from belts (101), shelves (102) or windows (75) depending on (IAT) devices (21), such that when an order is received online, it is sent electronically to an assigned UAV (22) with the pick-up line, shelf (102) and belt (101) (PMGW) details, including full data about the location's 3D spatial coordinates, the location of the warehouse, the shelf set, the shelf line, and the exact location of the item over a shelf (102), belt (101) or window (75), such that the UAV (22) flies to that specific part of the space following a preset, artificially calculated route to approach and get inside the warehouse via a gate which has a defined shape and dimensions, including its tiny grid 3D dimensions in the space; then the UAV (22) crosses the defined safe route towards the specific shelf (102); then, once it approaches the item (parcel) (73) to be picked up, it adjusts itself perfectly to occupy the whole of the spatial grid cells which fit its shape; once it is perfectly occupying this space, it is in the right place to make the pick-up, so it pushes its clamps (62) inward over whatever is under its clamps (62); all of this is governed under the control, monitoring and follow-up of both (IAT) devices (21), the one in the UAV (22) and the other in the warehouse, which work together to synchronize the UAV (22) to the specific multi-grid shaped space over the item (parcel) (73).
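The pick-up criterion of this method, that the UAV must occupy exactly the reserved multi-grid space before closing its clamps (62), can be sketched as a set comparison over 3D grid cells; the function name and cell coordinates are illustrative assumptions:

```python
def cleared_to_clamp(uav_cells, reserved_cells):
    """Clamp-release criterion sketched from claim 20: the UAV may close its
    clamps (62) only when the set of 3D grid cells it currently occupies
    exactly matches the multi-grid space reserved over the item (73).
    Cells are (x, y, z) grid indices in the warehouse's shared frame."""
    return set(uav_cells) == set(reserved_cells)

# Hypothetical reserved space over an item on shelf (102).
reserved = {(10, 4, 2), (10, 5, 2), (11, 4, 2), (11, 5, 2)}
```

Both (IAT) devices (21), in the UAV and in the warehouse, would evaluate this same condition to stay synchronized.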
21- The autonomous transportation means with artificial telepathy according to claim 1, wherein the autonomous aero-land vehicle (AAVL) (20) uses its (IAT) device (21) to book in advance its landing or take-off field and spot inside an autonomous airport, wherein it communicates its arrival or departure time and passes it to the main (IAT) device (21), which manages the whole air traffic over the autonomous airport and the land traffic inside it.
22- The autonomous transportation means with artificial telepathy according to claim 1, wherein fully autonomous vehicles (77) make use of their (IAT) devices (21) to communicate with auto dealers to schedule a service and visit alone, send and bring luggage alone, leave alone to bring students from a school or the owner from his job, carry autonomous trolleys, drop them near a shopping center, receive them full and return them back home, and communicate with UAVs (22) or manipulators (107) to receive deliveries.
23- The autonomous transportation means with artificial telepathy according to claim 1, wherein the motorized autonomous trolleys (113), referring to specific orders, navigate their way to specific shelves (102) according to their (IAT) devices (21), track the ordered items according to their locations specified by the shelves' (102) (IAT) device (21), and round in-between specific shelves (102) according to arrival priority, where a manipulator (107) is activated to pick and fill each item according to a 3-point localization method of (IAT) device (21) communication, wherein point 1 is the trolley basket location, point 2 is the manipulator (107) location, and point 3 is the item location on the shelf.
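The 3-point localization of claim 23 can be sketched as deriving the manipulator's (107) two moves from the three shared coordinates; the coordinate values and move names are illustrative assumptions:

```python
import math

def plan_pick(basket, manipulator, item):
    """Sketch of the 3-point localization in claim 23: from the three
    IAT-shared coordinates (point 1: trolley basket, point 2: manipulator,
    point 3: item on shelf), derive the two manipulator moves and their
    straight-line travel distances. Points are (x, y, z) in metres."""
    return [
        ("reach_item", math.dist(manipulator, item)),      # point 2 -> point 3
        ("place_in_basket", math.dist(item, basket)),      # point 3 -> point 1
    ]

# Hypothetical coordinates in the warehouse frame.
moves = plan_pick(basket=(3.0, 4.0, 0.5), manipulator=(0.0, 0.0, 1.0), item=(3.0, 4.0, 1.0))
```

An actual manipulator would plan collision-free trajectories between the points rather than straight lines; the sketch only shows how the three shared locations define the task.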
24- The autonomous transportation means with artificial telepathy according to claim 1, wherein the city central artificial telepathy station (CATS) (112), or city mind, includes the server (61), storing and manipulating comprehensive data related to city facilities, such that every outdoor macro or micro physical part of the city, either dynamic or static, viewed, scanned, pictured, or located, will be recognized and referenced by a specific identification to cooperate neatly and harmonically with all kinds of macro or micro autonomous machines, including but not limited to all kinds of UAVs (22), robots (26), (AAVLs) (20)..., such that their services and technical applications can be extended to be comprehensive, covering every corner of the city without danger of hitting anything.
25- The autonomous transportation means with artificial telepathy according to claim 1, wherein the (IAT) devices (21) are developed using quantum mechanics (quantum superposition and quantum entanglement) to process extra data inside nano-size microprocessors, such that a very tiny, highly secured (IAT) device is built, including RFID chips (transponders) to be implanted in more than one part or side of humans and livestock, so that they have a recognizable (PMGW) body and become observable and recognized by other machines and structures, while so-called twin tiny telepathic chips (TTCs) (115) inside their nano-(IAT) (21) support their human-to-machine telepathy, such as communications and orders... with their autonomous vehicles (66) or their autonomous machines..., or knowing in advance if a restaurant, bank, park... is crowded; serving as a passport, identity card, ticket, gate pass, or multiple devices to be used by hostesses in aircraft, trains... to recognize easily on any display that the right number of passengers, and the correct ones, are seated in their own specific seats...; sharing pain or health conditions with artificially intelligent robotic doctors; sharing medical data from tiny chips connected to the nervous system, or from swallowed tiny chips, with a medical center; ordering some viewed items online via specific gestures; and selecting the method of delivery via specific gestures too.
26- The autonomous transportation means with artificial telepathy according to claim 1, wherein the tiny telepathic chips (TTCs) (115) support human-to-human telepathy by reading thoughts directly or indirectly, receiving verbal voice or telepathic orders from humans, passing orders, downloading applications on human smart mobiles, watches, lenses, virtual reality lenses... virtual reality telepathy..., augmented reality telepathy, making telepathic phone calls, sharing viewed visual scenery, heard acoustic sounds, thoughts, discussions, emotions, and even sharing relatively adjusted pains with doctors, sharing sceneries and sounds with the visual nerves in the brain of a blind person walking near a sighted one to navigate his road clearly, tracking trapped, missing, injured or dead people..., and expanded applications generally but not limitedly including the following sectors: training, coaching, education, security, traffic, energy saving, organized commerce, and fast understanding in-between human, robot, and machine communities.
27- The autonomous transportation means with artificial telepathy according to claim 1, wherein the aerial delivery windows (75) use (IAT) devices (21) to communicate with UAVs (22) and to control a window mechanism to open the window and receive a parcel (73), generally but not limitedly via: an in-out linear (sliding) reception mechanism (117) of a receiving box (118) provided with a weight or load proportioning sensor (119), a swinging reception mechanism (120) of a receiving basket or box (118), or a rotary reception mechanism (122) rotating the box conventionally in and out from the window (75) sides.
28- The autonomous transportation means with artificial telepathy according to claim 1, wherein a transparent rounded duct (121) or pipe is installed vertically along a building side, facing a fixed glass pane of a building façade, to handle parcels (73) to flats on different floors, managing parcel reception while avoiding opening and closing the windows (75) to the outer atmosphere, such that an opening is made in each flat's glass, open to the duct and provided with a motorized glass gate, such that the duct (121) receives parcels via a top room over the building and moves them down on an elevator principle, wherein a parcel over a horizontal plate moves down to face an opened glass gate, wherein behind the parcel, on the transparent duct side, a vertical plate is located (it can be transparent glass too), such that, via a retractable motorized string or rope, pulling the string inwards over a pulley on the front and back edges of the horizontal plate carrying the parcel (73) pulls the vertical plate forward, sliding on rails or through grooves from the bottom; the forward pulling pushes the parcel forward inside a flat's receiving box (118), while a motor (located under the plate and connected from both sides to the belt/rope/string) retracts the vertical plate back after the task is finished.
29- The autonomous transportation means according to claim 1, wherein the three-stage spring damper (124) is composed of three parts unified vertically but with a gradual or multiple-stage increase in stiffness, diameter, and damping, to handle variable road textures and topography conditions with a suitable absorption and damping reaction, which results in a perfectly comfortable driving style.
30- The autonomous transportation means with artificial telepathy according to claim 1, wherein the noise-vibration-harshness system (NVHS) (123) is composed of: a net of acoustic sound collectors (129) and vibration meters for recording and assessing the vehicle suspension's and mechanical parts' noise-vibration status, as well as road-tire contact noise and vibration, to assess road type, texture and conditions and adjust the vehicle's speed; smoke detectors (131) to detect burnt faulty mechanical and electronic parts; and means to share all of these data with a data server (61) over a network, to share them with other vehicles for further actions if required.
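The speed-adjustment role of the (NVHS) (123) can be sketched as a simple thresholding rule; the decibel, vibration and speed figures are assumed for illustration and are not taken from the claim:

```python
def recommend_speed(noise_db, vibration_rms, base_kmh=100):
    """Illustrative NVHS rule: as road-tire contact noise and suspension
    vibration rise (suggesting a rough or degraded surface), lower the
    recommended speed before sharing the assessment with the server (61).
    Thresholds are hypothetical tuning values."""
    speed = base_kmh
    if noise_db > 75 or vibration_rms > 0.5:
        speed *= 0.8   # rough surface: back off by 20%
    if noise_db > 85 or vibration_rms > 1.0:
        speed *= 0.75  # severely degraded surface: back off further
    return round(speed)
```

For example, a quiet, smooth road keeps the base speed, while heavy noise and vibration together cut it substantially.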
31- The autonomous transportation means with artificial telepathy according to claim 1, wherein the autonomous vehicles (66) provided with the (NVHS) (123), comfortable suspensions with a three-stage spring-damper system (124), and semi-pneumatic tires (128), providing an inner static feeling rather than a dynamic driving one, are reshaped and re-equipped with inner facilities and an equipment style creating a stylish comfort suiting individual, personal professions and desires, such as: office style; restaurant or coffee shop style; trip style; aerial and land city cruise style; entertainment movie (cinema) style; comfort with bedroom (foldable sofa); diabetic comfort with toilet; business center style; sporty style with exercise facilities; tourist style; prayer style (church, mosque, synagogue... or any other temple-like selected option, or even yoga style); ambulance facilitated with robotic surgeons; delivery truck equipped with sorted autonomous shelving and delivery drones, or having a Jacuzzi; truck workshop style equipped with 3D printers or any other tools...; kitchen fast-food style; mini-showroom style; artist style...; dentist clinic; technician on-site workshop service; or equipped with task-force UAVs (22) or robots (26) for special services.
32- The autonomous transportation means with artificial telepathy according to claim 1, wherein the standard multiple-task UAV (141) is of compact shape, composed of a foldable flexible screen (142) like a book cover shape, while its inner side is provided with foldable-arm rotor blades (143), a mike (144), and a hook for carrying small and mini parcels or hanging orders from nearby restaurants, in addition to a camera... and any required accessories.
33- The autonomous transportation means with artificial telepathy according to claim 1, wherein the dialog management electronic control unit (DM-ECU) (135) in an autonomous vehicle (66) is updated with all human dialogs in each specific language, in addition to hugely yearly edited texts referring to specific practical dialogs which should be communicated in-between the autonomous vehicle (66) and the owner, to provide the dialog management capabilities of an autonomous vehicle (66), with technical features and applications of the future autonomous vehicles (66) supporting: daily management of the owner's requests; the vehicle's own needs self-management; its communications with other machines, robots (26), servers (61), and UAVs (22); expressing a feeling; having a sense of humor; remembering and reminding; updating and recommending; handling and receiving; picking up and dropping; waiting and suggesting; fixing appointments and waiting; up to installing an extra unit of intelligent self-task management (ISTM) (136) for carrying out indoor (inside-vehicle) automated tasks for a busy owner.
34- The autonomous transportation means with artificial telepathy according to claim 1, wherein the city central artificial telepathy station (CATS) (112), or city mind, including the server (61), shares with the neighborhood vehicles all other parameters related to the programmed/selected trip plan, intentions, predictions, and the distribution of vehicles on road lanes: creating space in-between other existing vehicles in a lane to pass a vehicle or shift it from lane to lane; grouping vehicles with similar selected drive speeds in convoys; merging evenly according to ratios, priority, and road conditions; receiving data from traffic departments, public works, municipality, meteorology... about planned or urgent issues related to road conditions, maintenance, closed lanes, or public traffic (police, ambulance...) using a lane, so as to distribute, organize, or change the lane, street, or bridge of a calculated ratio of autonomous vehicles (66) according to booked road priorities; and monitoring a street's lanes, ambulances, police cars, rescuing and firefighting when data are received from (IAT) devices (21) about an accident or a case which may cause an accident.
35- A method and strategy for constructing a fully autonomous city transportation means with artificial telepathy composed of:
Stage 1 (5-7 years): installing (IAT) devices (21) into all existing vehicles and traffic lights, connected to a data server (61) to manage convoys and traffic, in addition to starting construction of an autonomous city airport for (AAVLs) (20) and (UAVs) (22), and starting aerial delivery via (UAVs) (22) to autonomous ground stations;
Stage 2 (5-7 years): introducing collaborative reconstructive computer vision (CRCV) (98) and computer holographic real-time traffic views, upgrading (TILS) (88) and (IDES) (100), introducing aero-carriers, expanding the (UAV) delivery span, starting the removal of street traffic lights, posts, signal boards, radars..., and upgrading voice recognition and dialogs in-between drivers and their autonomous vehicles (66);
Stage 3 (5-7 years): all newly manufactured vehicles, in addition to all other machines serving in transportation, are autonomous, provided with (IAT) devices (21) and carrying out tasks alone; city autonomous airports are expanded to be full business hubs and centers for passenger vehicle transport, single-container handling, and UAV deliveries to every part of the city; human-machine and human-human telepathy depend on nano-telepathic chips (115); and vehicles are designed to meet customers' own professions, desires, hobbies...
PCT/IB2018/001355 2018-11-26 2018-11-26 Autonomous city transportation means with artificial telepathy WO2019025872A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2018/001355 WO2019025872A2 (en) 2018-11-26 2018-11-26 Autonomous city transportation means with artificial telepathy


Publications (2)

Publication Number Publication Date
WO2019025872A2 true WO2019025872A2 (en) 2019-02-07
WO2019025872A3 WO2019025872A3 (en) 2019-10-03

Family

ID=65019541


Country Status (1)

Country Link
WO (1) WO2019025872A2 (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111144247A (en) * 2019-12-16 2020-05-12 浙江大学 Escalator passenger reverse-running detection method based on deep learning
CN111861834A (en) * 2020-07-14 2020-10-30 自然资源部第一海洋研究所 Coastal zone land area space usage monitoring and surveying method based on water resource dependence degree
CN111861337A (en) * 2020-07-24 2020-10-30 江苏天长环保科技有限公司 Method, device, storage medium and system for monitoring transportation of hazardous waste
RU2745543C1 (en) * 2019-05-13 2021-03-26 Тойота Дзидося Кабусики Кайся Autonomous mobile body, autonomous mobile body control program, method of autonomous mobile body control and system server for managing autonomous mobile body from range
CN113581194A (en) * 2021-08-06 2021-11-02 武汉极目智能技术有限公司 Automatic early warning interaction system and method based on vehicle-mounted vision detection
US11180253B1 (en) * 2021-03-24 2021-11-23 Brien Aven Seeley System for fire suppression by autonomous air and ground vehicles
CN113706894A (en) * 2021-09-28 2021-11-26 长安大学 Method for driving-in vehicle fault of ultra-high-speed private land transportation system
US11198519B1 (en) * 2020-08-11 2021-12-14 Brien Aven Seeley Quiet urban air delivery system
CN114489029A (en) * 2020-10-26 2022-05-13 丰田自动车株式会社 Mobile service system and mobile service providing method
CN114578746A (en) * 2022-01-24 2022-06-03 江苏经贸职业技术学院 Computer information safety monitoring and early warning equipment
US11447269B2 (en) * 2020-08-11 2022-09-20 Brien Aven Seeley Quiet urban air delivery system
CN115116249A (en) * 2022-06-06 2022-09-27 苏州科技大学 Method for estimating different permeability and road traffic capacity of automatic driving vehicle
WO2022218219A1 (en) * 2021-04-14 2022-10-20 岳秀兰 Aircraft operation guarantee system consisting of remote driving, energy supply, and ground carrier
IT202100021896A1 (en) * 2021-08-13 2023-02-13 Zona Eng & Design S A S Di Zona Mauro & C Multicopter aircraft convertible into a motor vehicle
CN116614841A (en) * 2023-07-17 2023-08-18 中汽智联技术有限公司 Road side data quality assessment method and electronic equipment
CN116749866A (en) * 2023-08-22 2023-09-15 常州星宇车灯股份有限公司 Vertical take-off and landing lighting auxiliary system of aerocar and aerocar
US11767129B2 (en) 2020-01-31 2023-09-26 Southeastern Pennsylvania Unmanned Aircraft Systems, Llc Drone delivery system
CN116863205A (en) * 2023-06-15 2023-10-10 深圳市软筑信息技术有限公司 Container empty detection method and system for customs
US11816997B2 (en) 2021-04-29 2023-11-14 Ge Aviation Systems Llc Demand driven crowdsourcing for UAV sensor
CN117172423A (en) * 2023-10-26 2023-12-05 北京中联世建建设规划设计有限公司 Municipal engineering mapping method based on satellite remote sensing technology
CN117437564A (en) * 2023-12-20 2024-01-23 中铁三局集团广东建设工程有限公司 Unmanned aerial vehicle data processing method and device for bridge construction monitoring
CN117636270A (en) * 2024-01-23 2024-03-01 南京理工大学 Vehicle robbery event identification method and device based on monocular camera

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100256835A1 (en) 2009-04-06 2010-10-07 Gm Global Technology Operations, Inc. Fail-safe speed profiles for cooperative autonomous vehicles
US20120083960A1 (en) 2010-10-05 2012-04-05 Google Inc. System and method for predicting behaviors of detected objects
US20140136414A1 (en) 2006-03-17 2014-05-15 Raj Abhyanker Autonomous neighborhood vehicle commerce network and community
WO2015068501A1 (en) 2013-11-08 2015-05-14 本田技研工業株式会社 Convoy travel control device
WO2017079229A1 (en) 2015-11-04 2017-05-11 Zoox, Inc. Simulation system and methods for autonomous vehicles
WO2017178899A2 (en) 2017-07-27 2017-10-19 Wasfi Alshdaifat Multiple task aerocarrier
US9947145B2 (en) 2016-06-01 2018-04-17 Baidu Usa Llc System and method for providing inter-vehicle communications amongst autonomous vehicles
US20180183873A1 (en) 2016-07-21 2018-06-28 Baidu Usa Llc Efficient communications amongst computing nodes for operating autonomous vehicles
WO2018122821A2 (en) 2018-04-23 2018-07-05 Wasfi Alshdaifat City autonomous airport (caa)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9493235B2 (en) * 2002-10-01 2016-11-15 Dylan T X Zhou Amphibious vertical takeoff and landing unmanned device
US10074284B1 (en) * 2016-06-10 2018-09-11 ETAK Systems, LLC Emergency shutdown and landing for unmanned aerial vehicles with air traffic control systems

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2745543C1 (en) * 2019-05-13 2021-03-26 Тойота Дзидося Кабусики Кайся Autonomous mobile body, autonomous mobile body control program, autonomous mobile body control method, and system server for remotely managing the autonomous mobile body
CN111144247A (en) * 2019-12-16 2020-05-12 浙江大学 Escalator passenger reverse-running detection method based on deep learning
CN111144247B (en) * 2019-12-16 2023-10-13 浙江大学 Escalator passenger reverse-running detection method based on deep learning
US11767129B2 (en) 2020-01-31 2023-09-26 Southeastern Pennsylvania Unmanned Aircraft Systems, Llc Drone delivery system
CN111861834A (en) * 2020-07-14 2020-10-30 自然资源部第一海洋研究所 Coastal zone land area space usage monitoring and surveying method based on water resource dependence degree
CN111861337A (en) * 2020-07-24 2020-10-30 江苏天长环保科技有限公司 Method, device, storage medium and system for monitoring transportation of hazardous waste
CN111861337B (en) * 2020-07-24 2024-04-23 江苏天长环保科技有限公司 Method, device, storage medium and system for monitoring transportation of hazardous waste
US11198519B1 (en) * 2020-08-11 2021-12-14 Brien Aven Seeley Quiet urban air delivery system
US11447269B2 (en) * 2020-08-11 2022-09-20 Brien Aven Seeley Quiet urban air delivery system
CN114489029A (en) * 2020-10-26 2022-05-13 丰田自动车株式会社 Mobile service system and mobile service providing method
US11180253B1 (en) * 2021-03-24 2021-11-23 Brien Aven Seeley System for fire suppression by autonomous air and ground vehicles
WO2022218219A1 (en) * 2021-04-14 2022-10-20 岳秀兰 Aircraft operation support system composed of remote driving, energy supply, and ground carriers
US11816997B2 (en) 2021-04-29 2023-11-14 Ge Aviation Systems Llc Demand driven crowdsourcing for UAV sensor
CN113581194A (en) * 2021-08-06 2021-11-02 武汉极目智能技术有限公司 Automatic early warning interaction system and method based on vehicle-mounted vision detection
IT202100021896A1 (en) * 2021-08-13 2023-02-13 Zona Eng & Design S A S Di Zona Mauro & C Multicopter aircraft convertible into a motor vehicle
WO2023017407A1 (en) * 2021-08-13 2023-02-16 Zona Engineering & Design Sas Di Zona Mauro & C. Multicopter aircraft convertible into a motor vehicle
CN113706894A (en) * 2021-09-28 2021-11-26 长安大学 Method for driving a vehicle into an ultra-high-speed private land transportation system under fault conditions
CN113706894B (en) * 2021-09-28 2022-08-12 长安大学 Method for driving a vehicle into an ultra-high-speed private land transportation system under fault conditions
CN114578746A (en) * 2022-01-24 2022-06-03 江苏经贸职业技术学院 Computer information safety monitoring and early warning equipment
CN115116249B (en) * 2022-06-06 2023-08-01 苏州科技大学 Method for estimating road traffic capacity under different penetration rates of autonomous vehicles
CN115116249A (en) * 2022-06-06 2022-09-27 苏州科技大学 Method for estimating road traffic capacity under different penetration rates of autonomous vehicles
CN116863205A (en) * 2023-06-15 2023-10-10 深圳市软筑信息技术有限公司 Container empty detection method and system for customs
CN116614841A (en) * 2023-07-17 2023-08-18 中汽智联技术有限公司 Road side data quality assessment method and electronic equipment
CN116614841B (en) * 2023-07-17 2023-10-27 中汽智联技术有限公司 Road side data quality assessment method and electronic equipment
CN116749866A (en) * 2023-08-22 2023-09-15 常州星宇车灯股份有限公司 Vertical take-off and landing auxiliary lighting system for an aerocar, and aerocar
CN117172423A (en) * 2023-10-26 2023-12-05 北京中联世建建设规划设计有限公司 Municipal engineering mapping method based on satellite remote sensing technology
CN117172423B (en) * 2023-10-26 2024-01-23 北京中联世建建设规划设计有限公司 Municipal engineering mapping method based on satellite remote sensing technology
CN117437564A (en) * 2023-12-20 2024-01-23 中铁三局集团广东建设工程有限公司 Unmanned aerial vehicle data processing method and device for bridge construction monitoring
CN117437564B (en) * 2023-12-20 2024-03-12 中铁三局集团广东建设工程有限公司 Unmanned aerial vehicle data processing method and device for bridge construction monitoring
CN117636270A (en) * 2024-01-23 2024-03-01 南京理工大学 Vehicle robbery event identification method and device based on monocular camera
CN117636270B (en) * 2024-01-23 2024-04-09 南京理工大学 Vehicle robbery event identification method and device based on monocular camera

Also Published As

Publication number Publication date
WO2019025872A3 (en) 2019-10-03

Similar Documents

Publication Publication Date Title
WO2019025872A2 (en) Autonomous city transportation means with artificial telepathy
US11874663B2 (en) Systems and methods for computer-assisted shuttles, buses, robo-taxis, ride-sharing and on-demand vehicles with situational awareness
US11468395B2 (en) Modular delivery vehicle with access lockers
US20210397199A1 (en) Methods and Systems for Transportation to Destinations by a Self-Driving Vehicle
JP6772100B2 (en) Dynamic management device for drones, dynamic management method for drones, and dynamic management program for drones
JP2020529080A (en) Systems and methods for remote operation of robot vehicles
JP6278539B2 (en) Flight mode selection based on situation
CN104837705B (en) Engaging and disengaging autonomous driving
CN103370249B (en) System and method for predicting behaviors of detected objects
CN107918781A (en) Movable sensor platform
US20180322783A1 (en) Systems and methods for visualizing potential risks
CN104470781B (en) Inferring traffic signal state and other aspects of a vehicle's environment based on alternate data
JP2020527805A (en) Relocation of autonomous vehicles
CN109641538A (en) System and method for creating and updating maps using vehicles
CN107036600A (en) System and method for autonomous vehicle navigation
CN107077810A (en) System, method and apparatus for producing a bunching effect on a mobile advertising, media and communications platform
CN109416873A (en) Autonomous or partially autonomous motor vehicle with an automated risk control system, and related method
JP2019139692A (en) Mobile shop vehicle and mobile shop system
CN110140028A (en) Identifying parking sites for autonomous vehicles
Townsend, Ghost Road: Beyond the Driverless Car
EP4244099A2 (en) Uav enabled vehicle perimeters
JP2020205122A (en) Operation plan creation device, operation plan creation method, and operation plan creation program
US20230398932A1 (en) Methods and apparatus for safety support with unmanned vehicles
Zankl et al. Digibus 2017
KR102645700B1 (en) Digital twin-based charging station control system

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18833704

Country of ref document: EP

Kind code of ref document: A2

122 Ep: pct application non-entry in european phase

Ref document number: 18833704

Country of ref document: EP

Kind code of ref document: A2

32PN Ep: public notification in the ep bulletin as the address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 10/11/2021)
