WO2021076929A1 - Uni-body multimodal mobility chassis and autonomous vehicle system - Google Patents


Info

Publication number
WO2021076929A1
Authority
WO
WIPO (PCT)
Prior art keywords
leg
autonomous vehicle
sensor
legs
machine learning
Prior art date
Application number
PCT/US2020/056037
Other languages
French (fr)
Inventor
Jack Vice
Original Assignee
Zenosol, Corp.
Priority date
Filing date
Publication date
Application filed by Zenosol, Corp. filed Critical Zenosol, Corp.
Publication of WO2021076929A1 publication Critical patent/WO2021076929A1/en


Classifications

    • G05D1/0221: Control of position or course in two dimensions, specially adapted to land vehicles, with means for defining a desired trajectory involving a learning process
    • B60G1/00: Suspensions with rigid connection between axle and frame
    • B60K17/043: Transmission unit disposed in or near the vehicle wheel, or between the differential gear unit and the wheel
    • B60K7/0007: Disposition of motor in, or adjacent to, traction wheel, the motor being electric
    • B62D15/025: Active steering aids, e.g. helping the driver by actively influencing the steering system after environment evaluation
    • B62D63/02: Motor vehicles not otherwise provided for
    • B60G2300/50: Electric vehicles; hybrid vehicles
    • B60G2401/12: Strain gauge
    • B60G2401/142: Visual display camera, e.g. LCD
    • B60G2401/16: GPS track data
    • B60G2401/28: Gyroscopes
    • B60G2500/30: Height or ground clearance
    • B60G2600/1876: Artificial intelligence
    • B60G2800/215: Traction, slip, skid or slide control by applying a braking action on each wheel individually
    • B60K2007/0092: Disposition of motor in, or adjacent to, traction wheel, the motor axle being coaxial to the wheel axle

Definitions

  • the present aspect relates to an autonomous vehicle system and more particularly to a land vehicle operatively coupled to an artificial intelligence (AI) platform to support autonomous vehicle movement. More specifically, the AI platform supports and enables reinforcement learning directed at a risk-reward system utilizing vehicle sensor data to direct autonomous movement of the vehicle.
  • AI artificial intelligence
  • Autonomous vehicles have been adapted for increasingly varied purposes, for example, navigating a complex environment.
  • An autonomous vehicle is essentially an autonomous robot with mechanisms that allow it to navigate on the surface of the ground.
  • an autonomous vehicle in the real world has the ability to travel without human navigation assistance.
  • obstacles can get in the way of the autonomous vehicle, making navigation difficult.
  • the autonomous vehicle includes a flexible uni-body chassis with no distinguishable joints; a plurality of legs, each leg includes a corresponding wheel, a wheel motor, and a sensor, the legs configured to support and enable movement across a surface; a processor; and a memory.
  • the memory has instructions stored thereon, which when executed cause the system to receive a signal from the sensor; determine if an object is encountered based on the signal; determine, as an output of a machine learning network, a leg movement for each of the plurality of legs, wherein the signal is an input to the machine learning network; and control, at least one of wheel motor torque, wheel motor direction, or cam activation to move a leg of the plurality of legs based on the output of the machine learning network.
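The claimed sense, infer, actuate loop can be sketched as follows. This is a minimal illustration, not the patented implementation: the `LegCommand` fields, the `policy` callable, and the object-detection threshold are all hypothetical stand-ins for the machine learning network and its outputs.

```python
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class LegCommand:
    # Hypothetical per-leg actuation output of the machine learning network.
    wheel_torque: float      # signed wheel motor torque
    wheel_direction: int     # +1 forward, -1 reverse
    cam_active: bool         # True -> engage the cam to tension/lift the leg

def control_step(sensor_signal: Sequence[float],
                 policy: Callable[[Sequence[float]], list[LegCommand]],
                 object_threshold: float = 0.5) -> list[LegCommand]:
    """One iteration of the claimed loop: receive a sensor signal, decide
    whether an object is encountered, then map the signal (the network
    input) to a movement command for each leg."""
    object_encountered = max(sensor_signal) > object_threshold  # stand-in test
    commands = policy(sensor_signal)
    if not object_encountered:
        # On open ground, suppress leg lifting and just roll (skid-steer).
        commands = [LegCommand(c.wheel_torque, c.wheel_direction, False)
                    for c in commands]
    return commands
```

With a policy that always requests a lift, the lift is suppressed when no object is detected and passed through when one is.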
  • the instructions when executed by the processor may cause the system to determine if the movement of the autonomous vehicle was successful, generate at least one of a positive reward value or a negative reward value based on the determined success, and train the machine learning network using reinforcement training based on the generated reward.
  • the leg movement may include a quantity of lift, a degree of lift, and/or a sequence in which each of the plurality of legs are lifted.
  • each leg of the plurality of legs may further include a tension wire configured to bend or compress the respective leg, and a pincer configured to selectively engage movement of the corresponding wheel.
  • the wheel motor may be configured to selectively engage the tension wire.
  • the sensor may include a bend sensor, a wheel motor current sensor, an inertial measurement unit (IMU) sensor, a first camera, a GPS, a temperature sensor, and/or a humidity sensor.
  • the first camera may be an omnidirectional camera configured to provide a 360° view of the surroundings of the chassis.
  • the instructions when executed by the processor may cause the system to control, by the machine learning network, pincer activation based on the sensor.
  • input to the machine learning network further includes at least one of camera image data, IMU data, GPS location data, bend sensor data, or wheel motor current value.
  • the sensor may further include a second camera positioned proximal to a mirror.
  • the mirror may be positioned at 90° relative to the second camera.
  • the autonomous vehicle may further include a gripper configured for holding an object to be manipulated.
  • the autonomous vehicle system may further include a UVC light, configured to disinfect surfaces.
  • a computer-controlled method for autonomous vehicle movement includes receiving a signal from a sensor of an autonomous vehicle, the autonomous vehicle including a plurality of legs, each leg includes a corresponding wheel, a wheel motor, and a sensor, the legs configured to support and enable movement across a surface; determining if an object is encountered based on the signal; determining, as an output of a machine learning network, a leg movement for each of the plurality of legs, wherein the signal is an input to the machine learning network; and controlling, at least one of wheel motor torque, wheel motor direction, or cam activation to move a leg of the plurality of legs based on the output of the machine learning network.
  • the method may further include determining if the movement of the autonomous vehicle was successful, generating at least one of a positive reward value or a negative reward value based on the determined success, and training the machine learning network using reinforcement training based on the generated reward.
  • the leg movement may include at least one of a quantity of lift, a degree of lift, or a sequence in which each of the plurality of legs are lifted.
  • each leg of the plurality of legs further may include a tension wire configured to bend or compress the respective leg, and a pincer configured to selectively engage movement of the corresponding wheel.
  • the wheel motor may be configured to selectively engage the tension wire.
  • the sensor may include a bend sensor, a wheel motor current sensor, an inertial measurement unit (IMU) sensor, a first camera, a GPS, a temperature sensor, and/or a humidity sensor.
  • the first camera may be an omnidirectional camera configured to provide a 360° view of the surroundings of the chassis.
  • input to the machine learning network may further include at least one of camera image data, IMU data, GPS location data, bend sensor data, or wheel motor current value.
  • a non-transitory computer-readable medium storing instructions which, when executed by a processor, cause the processor to perform a method for autonomous vehicle movement, the method includes: receiving a signal from a sensor of an autonomous vehicle, the autonomous vehicle including a plurality of legs, each leg includes a corresponding wheel, a wheel motor, and a sensor, the legs configured to support and enable movement across a surface; determining if an object is encountered based on the signal; determining, as an output of a machine learning network, a leg movement for each of the plurality of legs, wherein the signal is an input to the machine learning network; and controlling, at least one of wheel motor torque, wheel motor direction, or cam activation to move a leg of the plurality of legs based on the output of the machine learning network.
  • FIG. 1 is a schematic diagram illustrating an exemplary uni-body multimodal mobility device.
  • FIG. 2 is a schematic diagram illustrating an embedded motor of a leg as shown in FIG. 1.
  • FIG. 3 is a schematic diagram illustrating a pincer as shown in FIG. 1.
  • FIG. 4 is a flow chart illustrating application of reinforcement learning to the vehicle structure and functionality shown in FIGS. 1-3.
  • FIG. 5 is a flow chart illustrating an example of reinforcement learning as applied to an illustrative embodiment of a land vehicle in accordance with the present disclosure.
  • FIG. 6 is a block diagram illustrating an example of a computer system/server in communication with a cloud-based support system, to implement the system and processes with respect to the embodiment of FIGS. 1-5.
  • FIG. 7 is a block diagram illustrating a cloud computer environment in accordance with an illustrative embodiment of the present disclosure.
  • FIG. 8 is a block diagram illustrating a set of functional abstraction model layers provided by the cloud computing environment shown in the embodiment of FIG. 7.
  • the present disclosure relates to an autonomous vehicle system, such as a land vehicle operatively coupled to an artificial intelligence (AI) platform to support autonomous vehicle movement. More specifically, the presently disclosed AI platform supports and enables reinforcement learning directed at a risk-reward system utilizing vehicle sensor data to direct autonomous movement of the vehicle.
  • AI artificial intelligence
  • a schematic diagram of an autonomous vehicle system includes a uni-body multimodal mobility device 100.
  • the uni-body multimodal mobility device 100 is a land vehicle and is configured with a uni-body construction.
  • Uni-body refers to a single molded unit forming both the bodywork and chassis of the vehicle. It is understood in the art that a land vehicle is directed at an apparatus adapted to travel on land or over a surface. In one aspect, and as described in detail below, the vehicle may carry a load.
  • the uni-body multimodal mobility device 100 is provided with a supporting frame, referred to herein as a chassis 102.
  • the chassis 102 is a flexible, uni-body chassis with no distinguishable joints.
  • the chassis 102 can be fabricated, for example, by injection molding, with the chassis 102 having variable density and variable compliancy to achieve optimal compression, rebounding, and dampening characteristics to improve mobility.
  • the chassis 102 combines legged mobility with wheeled mobility into the uni-body design as will now be described in more detail.
  • the uni-body multimodal mobility device 100 is configured with a plurality of legs and corresponding wheels or wheel system to support and enable movement across a surface.
  • two legs are shown, including leg 106a and 106b, with each leg having a corresponding wheel, shown herein as wheels 108a and 108b, operatively connected to a chassis 102, respectively.
  • the quantity of legs and wheels shown herein should not be considered limiting.
  • the chassis 102 has one or more additional legs with each additional leg having a proximally positioned wheel.
  • the wheels 108a and 108b may be a system having two or more proximally positioned wheels or revolving objects that support movement of the uni-body multimodal mobility device 100 across a surface.
  • Each wheel or system of wheels is in communication with a proximally positioned motor embedded into a distal end of each leg.
  • motor 110a is positioned proximal to wheel 108a
  • motor 110b is positioned proximal to wheel 108b.
  • the motor and motor components will be discussed in greater detail in conjunction with reference to FIG. 2.
  • the chassis 102 is configured to receive a computing device 112 with an embedded processor (not shown).
  • the computing device 112 in one illustrative embodiment is a smart phone or tablet configured with a corresponding operating system.
  • the computing device 112 is configured with a camera, such as high-resolution camera 118.
  • the computing device 112 is placed into or received by the chassis 102 with the camera 118 positioned without obstructions from the chassis 102.
  • the area 104 of the chassis 102 that receives the computing device 112 is raised to enable the camera 118 to capture views that extend over the legs 106a and 106b, e.g. the view from the camera is maximized so that any obstruction from the legs is minimized.
  • the camera 118 includes an omni-directional lens 114, shown positioned in communication with the computing device 112. The omni-directional lens 114 is operatively coupled to the computing device 112 to provide a 360° view of the surroundings of the chassis 102.
  • a second camera 136, also referred to herein as a rear camera of the computing device 112, is positioned proximal to a mirror 116, shown herein positioned at 90° relative to the second camera 136.
  • a corresponding reflection off of mirror 116 enhances the view and corresponding images captured by the second camera 136.
  • the chassis 102 is shown herein with a battery 120 operatively coupled to the computing device 112.
  • the battery 120 may be a replaceable battery or a rechargeable battery embedded in the chassis 102.
  • the battery 120 may be comprised of multiple replaceable or rechargeable batteries.
  • the batteries 120 may be embedded in each of the legs 106a and 106b.
  • Sensors, in communication with the computing device 112, include a plurality of inertial measurement unit (IMU) sensors 128, a global positioning system (GPS) (not shown), cameras 114 and 136, temperature sensors (not shown) and humidity sensors (not shown).
  • IMU inertial measurement unit
  • GPS global positioning system
  • each of legs 106a and 106b is a single, composite, compliant part with various other components fused into the uni-body design.
  • each leg includes a corresponding bend sensor, a tension wire housed in a tension wire conduit, and one or more IMUs and pincers.
  • as shown herein, leg A 106a includes bend sensor A 130a, tension wire A 122a, IMU A 128a, pincer A1 132a1, and pincer A2 132a2, and leg B 106b includes bend sensor B 130b, tension wire B 122b, IMU B 128b, pincer B1 132b1, and pincer B2 132b2.
  • FIG. 1 shows legs 106a and 106b having a semi-circular profile or shape, though in a physical aspect the legs would follow a three-dimensional curve when the tension wire is pulled by the wheel, wherein the leg would flex inward and upward.
  • When subject to movement across a planar or relatively planar surface, the wheels 108a and 108b perform in a skid-steer manner commonly found in small mobile robotics.
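Skid-steer motion of this kind follows standard differential-drive kinematics. As an illustration (the track width is an assumed parameter, not a value from the disclosure):

```python
def skid_steer_velocity(v_left: float, v_right: float, track_width: float):
    """Body velocities of a skid-steer platform from its two wheel speeds.

    v_left, v_right: ground speeds of the left and right wheels (m/s)
    track_width:     lateral distance between the wheel contact lines (m)
    Returns (forward velocity, yaw rate).
    """
    v = (v_left + v_right) / 2.0              # forward speed of the chassis
    omega = (v_right - v_left) / track_width  # positive = turn to the left
    return v, omega
```

Equal wheel speeds drive the chassis straight; equal and opposite speeds spin it in place.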
  • the computing device 112 engages circuits 138a and 138b operatively coupled to motors 110a and 110b, respectively.
  • Non-linear cams 126a and 126b are engaged by motors 110a and 110b, respectively, to engage tension wires 122a and 122b, located on each of legs 106a and 106b. The engagement effectively compresses and lifts a selected leg.
  • the tension wires 122a and 122b are operatively coupled to the circuits 138a and 138b, respectively, to engage the motors 110a and 110b at the end of legs 106a and 106b, respectively.
  • the tension wires 122a and 122b are respectively attached to anchor points 124a and 124b located at the base of legs 106a and 106b.
  • the non-linear cams 126a and 126b have a variable radius which increases mechanical advantage thus increasing the force being applied to pull the tension wires 122a and 122b during rotation.
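The effect of the variable radius follows from the basic torque relation T = F·r: at a fixed motor torque, a smaller instantaneous cam radius produces a larger pull force on the wire. A minimal sketch, in which the tapering radius profile and its end values are invented for illustration:

```python
import math

def wire_tension(motor_torque: float, cam_radius: float) -> float:
    """Pull force on the tension wire: F = T / r, so a shrinking radius
    raises the force as the cam rotates and the leg compresses."""
    return motor_torque / cam_radius

def cam_radius_profile(angle_rad: float, r_start: float = 0.03,
                       r_end: float = 0.01, sweep: float = math.pi) -> float:
    """Hypothetical linearly tapering radius over the cam's working sweep."""
    frac = min(max(angle_rad / sweep, 0.0), 1.0)
    return r_start + (r_end - r_start) * frac
```

With these assumed numbers, a constant 0.5 N·m of motor torque yields roughly 16.7 N of wire tension at the start of the sweep and 50 N at the end.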
  • pincers 132a1 and 132a2 clamp down on the tension wire 122a and hold the tension wire in place to maintain the desired bending or compression of the leg A 106a.
  • Leg B 106b is subject to similar bending and compression, including pincers 132b1 and 132b2 clamping down on tension wire 122b and holding the tension wire in place to maintain the desired bend or compression of leg B 106b.
  • the pincers are discussed in more detail in conjunction with FIG. 3.
  • the tension wires 122a and 122b may run down the front and the back of the legs 106a and 106b.
  • the tension wires 122a and 122b may be a chain which goes around a gear in wheel 208.
  • a gripper 134 is shown mounted to the underside of the chassis 102 for payload handling.
  • the gripper is a device which enables holding of an object to be manipulated.
  • the gripper 134 is configured to enable the vehicle to transport a payload from a source location to a destination location.
  • the gripper 134 has a one-degree of freedom, although this aspect should not be considered limiting.
  • the gripper 134 is operatively coupled to tension wires 122a and 122b so that the chassis 102 can be raised or lowered to facilitate and enable transport of the payload.
  • the chassis 102 may be lowered to enable the gripper 134 to pick-up or deliver the payload, and the chassis may be raised or manipulated into a raised position during transport of the payload.
  • the uni-body multimodal mobility device 100 may be used to reduce virus spread, which requires rapid and frequent decontamination of large and diverse spaces with varied terrain, obstacles, and surfaces, such as schools, child care facilities, commercial office buildings, restaurants, retail stores, public areas, entertainment venues, gyms, hotels, and travel hubs, in addition to hospitals and other healthcare facilities.
  • the gripper 134 may hold, for example, but not limited to an Ultraviolet (UV) type “C” (UVC) light or even shorter “far-UVC” configured for disinfecting surfaces.
  • UVC Ultraviolet
  • a swarm of uni-body multimodal mobility devices 100 equipped with a UVC light can be deployed to disinfect a workspace.
  • the UVC (and/or “far-UVC”) light may be located on any portion of the uni-body multimodal mobility device 100.
  • Referring to FIG. 2, an enhanced view 200 of the bottom of leg 202 as shown in FIG. 1 is provided. Embedded within leg 202 are the motor 204 and non-linear cam 206 from FIG. 1. Also shown is the wheel 208 from FIG. 1 with tire 210 mounted to the wheel. An electrical current is sent to the motor 204 to turn a shaft 212. Operatively coupled to the shaft 212 is a bevel gear 214 that is positioned by a linear actuator 216 to selectively engage with either the cam 206 or the wheel 208.
  • a signal is sent to a meter gear 218 that engages with the linear actuator 216 which moves the bevel gear 214 into contact with the cam 206.
  • the bevel gear 214 is moved by an electro-magnetic process.
  • the motor 204 rotates the shaft 212 and the bevel gear 214 which when in contact causes the cam 206 to rotate as well.
  • the cam 206 is operatively connected to the tension wire of FIG. 1, and the rotation of the cam 206 causes the tension wire to tighten lifting the leg 202.
  • the cam 206 has a varying radius to allow for progressively higher tension forces as each leg compresses.
  • Brake springs 220 exert a force on the wheel 208 pushing the wheel 208 against a brake surface 222 that prevents the wheel 208 from spinning.
  • the brake springs 220 are mounted to an outside surface 224 that spins with the shaft 212.
  • the motor 204 and shaft 212 would not utilize the bevel gear 214 but would instead have a simple spline (not shown) that would drive the wheel 208 or would drive both the cam 206 and the wheel 208 at the same time.
  • the cam 206 and wheel 208 would both be engaged such that the wheel 208 forward rotation would assist in raising the leg 202 when the wheel 208 is in contact with a climbing surface such as a step.
  • the pincers could clamp down on the tension wire to keep the leg 202 lifted.
  • the shaft 212 spline would move to release the cam 206 tension while the pincers maintained tension on the wire.
  • the motor 204 would drive the wheel 208 forward as the pincers methodically release the tension wire to lower the leg 202 thereby improving climbing performance by both rolling up and pulling up with the legs.
  • the other wheels and legs of the chassis would behave in a similar manner both rolling forward and pushing with the legs.
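The climbing behavior described above amounts to a fixed ordering of actuator states per leg. A hypothetical encoding of that sequence (the state names are illustrative, not terms from the disclosure):

```python
from enum import Enum, auto

class Act(Enum):
    ENGAGE_CAM_AND_WHEEL = auto()  # roll forward while the cam lifts the leg
    CLAMP_PINCERS = auto()         # pincers hold the wire; leg stays lifted
    RELEASE_CAM = auto()           # spline releases the cam's tension
    DRIVE_AND_RELEASE = auto()     # wheel drives forward as pincers pay out wire

def climb_sequence() -> list[Act]:
    """Order of operations for one leg when climbing a step, per the text:
    lift with cam and wheel, hold with pincers, release the cam, then roll
    forward while the pincers methodically lower the leg."""
    return [Act.ENGAGE_CAM_AND_WHEEL, Act.CLAMP_PINCERS,
            Act.RELEASE_CAM, Act.DRIVE_AND_RELEASE]
```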
  • Referring to FIG. 3, an enhanced view 300 of the pincer 302 as shown in FIG. 1 is provided.
  • the pincer 302 functions to hold the tension wire 304 in place after it has been compressed by the cam, so that when the cam is disengaged to move the wheel, the leg maintains its compressed shape and lifted position.
  • pin 306 pierces a link in the tension wire 304 and recesses into a linear actuator 308.
  • the pin 306 retreats from the linear actuator 308 and from the link in tension wire 304 allowing the tension wire 304 to decompress and the leg to retain its normal, uncompressed shape.
  • AI Artificial Intelligence
  • AI refers to the intelligence when machines, based on information, are able to make decisions, which maximizes the chance of success in a given topic. More specifically, AI is able to learn from a data set to solve problems and provide relevant recommendations.
  • Machine learning (ML), which is a subset of AI, utilizes algorithms and corresponding neural networks to learn from data and create foresights based on this data. More specifically, ML is the application of AI through the creation of neural networks that can demonstrate learning behavior by performing tasks that are not explicitly programmed.
  • reinforcement learning is a type of dynamic programming that trains algorithms using a system of reward and punishment.
  • a reinforcement learning algorithm learns by interacting with its environment. A reward state is generated and received by the vehicle when performing correctly, and penalties are generated and received by the vehicle when performing incorrectly.
  • a positive value or positive set of values is assigned with respect to desired behavior and action to provide positive reinforcement
  • a negative value or negative set of values is assigned with respect to undesired behavior(s) for negative reinforcement.
  • the reinforcement learning enables the vehicle to explore mobility and mobility limitations in a defined environment. More specifically, the reinforcement learning supports and enables autonomous movement behaviors, such as rolling, walking, and climbing.
  • Referring to FIG. 4, a flow chart 400 is provided to illustrate application of reinforcement learning to the vehicle structure and functionality shown and described in FIGS. 1-3.
  • a reinforcement learning algorithm acquires all of the environmental state inputs from the vehicle 402.
  • the state inputs include, but are not limited to, the camera image data, IMU data, GPS location data, bend sensor data, and motor current value(s).
  • the inputs are mapped, with some probability, to various outputs 404.
  • Examples of the various outputs include, but are not limited to, motor torque, motor direction, cam activation, pincer activation, etc.
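The probabilistic mapping from state inputs to outputs can be sketched as a softmax policy over a discrete action set. The feature vector, weight matrix, and action names below are assumptions for illustration, not the patent's network:

```python
import math
import random

# Hypothetical discrete outputs of the mapping (cf. torque, direction,
# cam activation, pincer activation in the text).
ACTIONS = ["torque_up", "torque_down", "reverse", "cam_on", "pincer_on"]

def softmax_policy(state: list[float], weights: list[list[float]]) -> str:
    """Map a state vector (camera, IMU, GPS, bend-sensor, motor-current
    features) to one output, with probability proportional to exp(score)."""
    scores = [sum(w * s for w, s in zip(row, state)) for row in weights]
    m = max(scores)                         # subtract max for stability
    exps = [math.exp(x - m) for x in scores]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(ACTIONS, weights=probs, k=1)[0]
```

Each call samples one output, so the same observation can map to different actions with the stated probabilities, which is what lets the learner explore.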
  • the vehicle will conduct a given action 410 and will receive a predefined reward signal 412. If the reward is high 414, the mapping from observation to activation for that observation and that action pair is reinforced 416, thereby providing encouragement so that the action taken will be more likely to occur in the future when receiving the same or similar observation.
  • a high reward value is a positive integer. If the reward is low 418, the mapping from observation to activation for that observation and action pair is discouraged 420 such that the action taken will be less likely to occur in the future when receiving the same or similar observation.
  • a low reward value is a negative integer.
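The reinforce/discourage step corresponds to a simple tabular update: raise the stored preference for an observation-action pair when the reward is positive, and lower it when the reward is negative. A minimal sketch in which the learning rate and table layout are assumptions:

```python
def update_preference(table: dict, observation: tuple, action: str,
                      reward: float, lr: float = 0.1) -> float:
    """Reinforce (reward > 0) or discourage (reward < 0) an
    observation -> action mapping, as in steps 416 and 420 of FIG. 4."""
    key = (observation, action)
    table[key] = table.get(key, 0.0) + lr * reward
    return table[key]
```

A positive reward nudges the pair's value up, making the action more likely under a preference-based policy; a negative reward of equal size cancels it.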
  • Reinforcement learning as shown and described herein is an autonomous learning process.
  • Referring to FIG. 5, a flow chart 500 is provided to illustrate an example of reinforcement learning as applied to the land vehicle described herein. Reinforcement learning is directed at multi-dimensional movement of the vehicle.
  • the vehicle sensors gather corresponding sensor data 504, which is employed to assess how the vehicle may climb over the obstacle.
  • the obstacle is a stair, although this is one example and should not be considered limiting.
  • the assessment at step 504 is followed by using the acquired sensor data to calculate a quantity and degree of lift required by the vehicle legs and the sequence in which the legs are lifted 506.
  • the variable X_Total is assigned to the quantity of vehicle legs 508.
  • the calculated quantity and sequence are applied to the legs in the form of compression and lifting as dictated by the sequence 510.
  • the counting variable X is initialized 512 and a leg, e.g. legX, in the sequence is subject to compression and lift 514. It is then determined if the movement of legX was successful 516. If the movement was unsuccessful, negative feedback in the form of a negative reward value is generated 518, with the magnitude of the negative reward value being proportional to a degree of failure corresponding to the feedback. The reward value is a factor in the reinforcement learning. Following step 518, the process returns to step 506 for reassessment.
  • a positive response to the determination at step 516 is followed by generating a positive reward value 520 with a magnitude of the positive reward value being proportional to a degree of success corresponding to the feedback.
  • the counting variable X is then incremented 522 and it is determined if the sequence of leg compressions and lifts is completed 524.
  • a negative response at step 524 is followed by a return to step 514 for lift and compression of the next leg, e.g. legX, identified in the sequence, and a positive response at step 524 is followed by the vehicle moving as instructed 526.
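The per-leg loop of flow chart 500 can be sketched directly. This simplified version drops the return to step 506 on failure and uses a placeholder success criterion; the reward magnitudes are proportional to the degree of success or failure, as the text describes:

```python
from typing import Callable

def lift_sequence(num_legs: int,
                  try_lift: Callable[[int], float]) -> list[float]:
    """Run the leg compression/lift loop of FIG. 5 (steps 512-524).

    try_lift(leg) returns a degree-of-success in [0, 1]. The reward for
    each leg is positive and proportional to success, or negative and
    proportional to failure, mirroring steps 520 and 518."""
    rewards = []
    for leg in range(num_legs):                # X = 0 .. X_Total - 1
        success = try_lift(leg)
        if success >= 0.5:                     # placeholder success test
            rewards.append(+success)           # step 520: positive reward
        else:
            rewards.append(-(1.0 - success))   # step 518: negative reward
    return rewards
```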
  • the reinforcement learning supports dynamic learning and application to the vehicle and vehicle movement by adjusting vehicle movement actions based on continuous feedback to maximize a reward.
  • Referring to FIG. 6, a block diagram 600 is provided illustrating an example of a computer system/server 602, hereinafter referred to as a host 602, in communication with a cloud-based support system, to implement the system and processes described above with respect to FIGS. 1-5.
  • Host 602 is operational with numerous other general purpose or special purpose computing system environments or configurations.
  • Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with host 602 include, but are not limited to, personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputer systems, mainframe computer systems, and file systems (e.g., distributed storage environments and distributed cloud computing environments) that include any of the above systems, devices, and their equivalents.
  • the host 602 may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system.
  • program modules may include routines, programs, objects, components, logic, data structures, and so on that perform particular tasks or implement particular abstract data types.
  • the host 602 may be practiced in a distributed cloud computing environment 610 where tasks are performed by remote processing devices that are linked through a communications network.
  • program modules may be located in both local and remote computer system storage media including memory storage devices.
  • the host 602 is shown in the form of a general-purpose computing device.
  • the components of the host 602 may include, but are not limited to, one or more processors or processing units 604, a system memory 606, and a bus 608 that couples various system components including system memory 606 to processing unit 604.
  • the bus 608 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
  • Host 602 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by the host 602 and it includes both volatile and non-volatile media, removable and non-removable media.
  • the memory 606 can include computer system readable media in the form of volatile memory, such as random-access memory (RAM) 630 and/or cache memory 632.
  • storage system 634 can be provided for reading from and writing to a non-removable, non-volatile magnetic media (not shown and typically called a “hard drive”).
  • a magnetic disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a “floppy disk”).
  • an optical disk drive for reading from or writing to a removable, non-volatile optical disk such as a CD-ROM, DVD-ROM or other optical media.
  • each can be connected to bus 608 by one or more data media interfaces.
  • Program/utility 640 having a set (at least one) of program modules 642, may be stored in the memory 606 by way of example, and not limitation, as well as an operating system, one or more application programs, other program modules, and program data. Each of the operating systems, one or more application programs, other program modules, and program data or some combination thereof, may include an implementation of a networking environment.
  • the program modules 642 generally carry out the functions and/or methodologies of aspects to store and analyze data with respect to leg movement and compression, and corresponding reward value computations.
  • the host 602 may also communicate with one or more external devices 614, such as a keyboard, a pointing device, etc.; a display 624; one or more devices that enable a user to interact with host 602; and/or any devices (e.g., network card, modem, etc.) that enable host 602 to communicate with one or more other computing devices. Such communication can occur via Input/Output (I/O) interface(s) 622. Still yet, host 602 can communicate with one or more networks such as a local area network (LAN), a general wide area network (WAN), and/or a public network (e.g., the Internet) via network adapter 620.
  • network adapter 620 communicates with the other components of host 602 via bus 608.
  • a plurality of nodes of a distributed file system (not shown) is in communication with the host 602 via the I/O interface 622 or via the network adapter 620.
  • other hardware and/or software components could be used in conjunction with host 602. Examples include, but are not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data archival storage systems.
  • main memory 606 including RAM 630, cache 632, and storage system 634, such as a removable storage drive and a hard disk installed in a hard disk drive.
  • Computer programs are stored in memory 606. Computer programs may also be received via a communication interface, such as network adapter 620. Such computer programs, when run, enable the computer system to perform the features of the present aspects as discussed herein. In particular, the computer programs, when run, enable the processing unit 604 to perform the features of the computer system. Accordingly, such computer programs represent controllers of the computer system.
  • the host 602 is a node of a cloud computing environment 610.
  • cloud computing is a model of service delivery for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, network bandwidth, servers, processing, memory, storage, applications, virtual machines, and services) that can be rapidly provisioned and released with minimal management effort or interaction with a provider of the service.
  • This cloud model may include at least five characteristics, at least three service models, and at least four deployment models. Examples of such characteristics are as follows:
  • On-demand self-service: a cloud consumer can unilaterally provision computing capabilities, such as server time and network storage, as needed automatically without requiring human interaction with the service’s provider.
  • Broad network access: capabilities are available over a network and accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, laptops, and PDAs).
  • Resource pooling: the provider’s computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to demand. There is a sense of location independence in that the consumer generally has no control or knowledge over the exact location of the provided resources but may be able to specify location at a higher layer of abstraction (e.g., country, state, or datacenter).
  • Rapid elasticity: capabilities can be rapidly and elastically provisioned, in some cases automatically, to quickly scale out, and rapidly released to quickly scale in. To the consumer, the capabilities available for provisioning often appear to be unlimited and can be purchased in any quantity at any time.
  • Measured service: cloud systems automatically control and optimize resource use by leveraging a metering capability at some layer of abstraction appropriate to the type of service (e.g., storage, processing, bandwidth, and active user accounts). Resource usage can be monitored, controlled, and reported, providing transparency for both the provider and consumer of the utilized service.
  • Service Models are as follows:
  • Software as a Service (SaaS): the capability provided to the consumer is to use the provider’s applications running on a cloud infrastructure.
  • the applications are accessible from various client devices through a thin client interface such as a web browser (e.g., web-based email).
  • the consumer does not manage or control the underlying cloud infrastructure including network, servers, operating systems, storage, or even individual application capabilities, with the possible exception of limited user-specific application configuration settings.
  • Platform as a Service (PaaS): the capability provided to the consumer is to deploy consumer-created or acquired applications onto the cloud infrastructure.
  • the consumer does not manage or control the underlying cloud infrastructure including networks, servers, operating systems, or storage, but has control over the deployed applications and possibly application hosting environment configurations.
  • Infrastructure as a Service (IaaS): the capability provided to the consumer is to provision processing, storage, networks, and other fundamental computing resources.
  • the consumer does not manage or control the underlying cloud infrastructure but has control over operating systems, storage, deployed applications, and possibly limited control of select networking components (e.g., host firewalls).
  • Deployment Models are as follows:
  • Private cloud: the cloud infrastructure is operated solely for an organization. It may be managed by the organization or a third party and may exist on-premises or off-premises.
  • Community cloud: the cloud infrastructure is shared by several organizations and supports a specific community that has shared concerns (e.g., mission, security requirements, policy, and compliance considerations). It may be managed by the organizations or a third party and may exist on-premises or off-premises.
  • Public cloud: the cloud infrastructure is made available to the general public or a large industry group and is owned by an organization selling cloud services.
  • Hybrid cloud: the cloud infrastructure is a composition of two or more clouds (private, community, or public) that remain unique entities but are bound together by standardized or proprietary technology that enables data and application portability (e.g., cloud bursting for load balancing between clouds).
  • a cloud computing environment is service oriented with a focus on statelessness, low coupling, modularity, and semantic interoperability.
  • An infrastructure comprising a network of interconnected nodes.
  • cloud computing network 700 includes a cloud computing environment 750 having one or more cloud computing nodes 710 with which local computing devices used by cloud consumers may communicate. Examples of these local computing devices include, but are not limited to, personal digital assistant (PDA) or cellular telephone 754A, desktop computer 754B, laptop computer 754C, and/or automobile computer system 754N. Individual nodes within nodes 710 may further communicate with one another. They may be grouped (not shown) physically or virtually, in one or more networks, such as Private, Community, Public, or Hybrid clouds as described hereinabove, or a combination thereof.
  • The cloud computing environment 750 offers infrastructure, platforms, and/or software as services for which a cloud consumer does not need to maintain resources on a local computing device. It is understood that the types of computing devices 754A-N shown in FIG. 7 are intended to be illustrative only and that the cloud computing environment 750 can communicate with any type of computerized device over any type of network and/or network addressable connection (e.g., using a web browser).
  • Referring to FIG. 8, a set of functional abstraction layers 800 provided by the cloud computing network of FIG. 7 is shown. It should be understood in advance that the components, layers, and functions shown in FIG. 8 are intended to be illustrative only, and the aspects are not limited thereto. As depicted, the following layers and corresponding functions are provided: hardware and software layer 810, virtualization layer 820, management layer 830, and workload layer 840.
  • the hardware and software layer 810 includes hardware and software components. Examples of hardware components include mainframes; RISC (Reduced Instruction Set Computer) architecture-based servers; and networks and networking components. Examples of software components include network application server software and database software.
  • Virtualization layer 820 provides an abstraction layer from which the following examples of virtual entities may be provided: virtual servers; virtual storage; virtual networks, including virtual private networks; virtual applications and operating systems; and virtual clients.
  • management layer 830 may provide the following functions: resource provisioning, metering and pricing, user portal, security, service level management, and SLA planning and fulfillment.
  • Resource provisioning provides dynamic procurement of computing resources and other resources that are utilized to perform tasks within the cloud computing environment.
  • Metering and pricing provides cost tracking as resources are utilized within the cloud computing environment, and billing or invoicing for consumption of these resources. In one example, these resources may comprise application software licenses.
  • Security provides identity verification for cloud consumers and tasks, as well as protection for data and other resources.
  • User portal provides access to the cloud computing environment for consumers and system administrators.
  • Service level management provides cloud computing resource allocation and management such that required service levels are met.
  • Service Level Agreement (SLA) planning and fulfillment provides pre-arrangement for, and procurement of, cloud computing resources for which a future requirement is anticipated in accordance with an SLA.
  • Workloads layer 840 provides examples of functionality for which the cloud computing environment may be utilized. Examples of workloads and functions which may be provided from this layer include, but are not limited to: mapping and navigation; software development and lifecycle management; virtual classroom education delivery; data analytics processing; transaction processing; and reinforcement learning application to vehicle movement.
  • the present aspects may be a system, a method, and/or a computer program product.
  • the computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present aspects.
  • the computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device.
  • the computer readable storage medium may be, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • a non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing.
  • a computer readable storage medium is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
  • a computer readable signal medium includes a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer readable signal medium is any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network.
  • the network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers.
  • a network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
  • Computer readable program instructions for carrying out operations of the present aspects may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present aspects.
  • the aspects may be embodied as a system, method, or computer program product. Accordingly, the aspects may take the form of an entirely hardware aspect, an entirely software aspect (including firmware, resident software, micro-code, etc.), or an aspect combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” or “system.” Furthermore, the aspects described herein may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
  • each block in the flow charts or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • each block of the block diagrams and/or flow chart illustration(s), and combinations of blocks in the block diagrams and/or flow chart illustration(s), can be implemented by special purpose hardware- based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
  • executable code could be a single instruction, or many instructions, and may even be distributed over several different code segments, among different applications, and across several memory devices.
  • operational data may be identified and illustrated herein within the tool, and may be embodied in any suitable form and organized within any suitable type of data structure. The operational data may be collected as a single dataset, or may be distributed over different locations including over different storage devices, and may exist, at least partially, as electronic signals on a system or network.
  • Fiducials are positioned in communication with a body and recognized within captured 3D data utilized for model generation thereby increasing the quality of the generated model and/or 3D mesh.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present disclosure relates to a uni-body multimodal mobility device. Each leg of the device is embedded with a wheel and wheel motor, which is subject to skid-steer movement. When increased mobility is required, a cam is engaged by the wheel motor and a tension wire embedded in one or more of the legs is subject to a force to compress and lift the leg. One or more pincers in each leg are provided to allow compression of specific portions of the leg, thereby allowing for semi-independent degrees of freedom. The device utilizes artificial intelligence (AI) reinforcement learning to enable autonomous behaviors. The AI takes the environmental state inputs and maps them to outputs including various mobility actions. Upon conducting an action, the device receives a corresponding proportional reward signal directed at action reinforcement.

Description

UNI-BODY MULTIMODAL MOBILITY CHASSIS AND AUTONOMOUS VEHICLE SYSTEM
TECHNICAL FIELD
[0001] The present aspect relates to an autonomous vehicle system and more particularly to a land vehicle operatively coupled to an artificial intelligence (AI) platform to support autonomous vehicle movement. More specifically, the AI platform supports and enables reinforcement learning directed at a risk-reward system utilizing vehicle sensor data to direct autonomous movement of the vehicle.
BACKGROUND
[0002] Autonomous vehicles have become adapted for increasing and varied purposes, for example, navigating a complex environment. An autonomous vehicle is essentially an autonomous robot with mechanisms that allow it to navigate on the surface of the ground. Among other tasks, an autonomous vehicle in the real world has the ability to travel without human navigation assistance. However, obstacles can often get in the way of the autonomous vehicle, making navigation difficult.
[0003] Accordingly, improvements are needed in using AI to direct autonomous vehicles.
SUMMARY
[0004] In various aspects, an autonomous vehicle system is presented. The autonomous vehicle includes a flexible uni-body chassis with no distinguishable joints; a plurality of legs, each leg including a corresponding wheel, a wheel motor, and a sensor, the legs configured to support and enable movement across a surface; a processor; and a memory. The memory has instructions stored thereon, which when executed cause the system to: receive a signal from the sensor; determine if an object is encountered based on the signal; determine, as an output of a machine learning network, a leg movement for each of the plurality of legs, wherein the signal is an input to the machine learning network; and control at least one of wheel motor torque, wheel motor direction, or cam activation to move a leg of the plurality of legs based on the output of the machine learning network.
[0005] In aspects, the instructions when executed by the processor may cause the system to determine if the movement of the autonomous vehicle was successful, generate at least one of a positive reward value or a negative reward value based on the determined success, and train the machine learning network using reinforcement training based on the generated reward.
[0006] In various aspects, the leg movement may include a quantity of lift, a degree of lift, and/or a sequence in which each of the plurality of legs are lifted.
[0007] In various aspects, each leg of the plurality of legs may further include a tension wire configured to bend or compress the respective leg, and a pincer configured to selectively engage movement of the corresponding wheel. The wheel motor may be configured to selectively engage the tension wire.
[0008] In aspects, the sensor may include a bend sensor, a wheel motor current sensor, an inertial measurement unit sensor; a first camera, a GPS, a temperature sensor, and/or a humidity sensor.
[0009] In various aspects, the first camera may be an omnidirectional camera configured to provide a 360° view of the surroundings of the chassis.
[0010] In various aspects, the instructions when executed by the processor may cause the system to control, by the machine learning network, pincer activation based on the sensor.
[0011] In aspects, input to the machine learning network further includes at least one of camera image data, IMU data, GPS location data, bend sensor data, or wheel motor current value.
[0012] In various aspects, the sensor further may include a second camera positioned proximal to a mirror. The mirror may be positioned at a 90° angle relative to the second camera.
[0013] In various aspects, the autonomous vehicle may further include a gripper configured for holding an object to be manipulated.
[0014] In aspects, the autonomous vehicle system may further include a UVC light, configured to disinfect surfaces.
[0015] In various aspects, a computer-controlled method for autonomous vehicle movement includes receiving a signal from a sensor of an autonomous vehicle, the autonomous vehicle including a plurality of legs, each leg includes a corresponding wheel, a wheel motor, and a sensor, the legs configured to support and enable movement across a surface; determining if an object is encountered based on the signal; determining, as an output of a machine learning network, a leg movement for each of the plurality of legs, wherein the signal is an input to the machine learning network; and controlling, at least one of wheel motor torque, wheel motor direction, or cam activation to move a leg of the plurality of legs based on the output of the machine learning network.
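The method above maps a sensor signal through a machine learning network to per-leg actuation outputs (wheel motor torque, wheel motor direction, cam activation). A minimal Python sketch follows; the toy linear stand-in for the network, the weight layout, and the command field names are illustrative assumptions, not the claimed implementation.

```python
def leg_commands(sensor_signal, leg_weights):
    """Map a flat sensor feature vector to per-leg actuation commands.

    `leg_weights` holds, for each leg, three weight vectors (one per output
    channel) standing in for the learned network's final layer.
    """
    dot = lambda w, s: sum(wi * si for wi, si in zip(w, s))
    commands = []
    for torque_w, dir_w, cam_w in leg_weights:
        commands.append({
            "torque": abs(dot(torque_w, sensor_signal)),               # wheel motor torque
            "direction": 1 if dot(dir_w, sensor_signal) >= 0 else -1,  # wheel motor direction
            "cam_active": dot(cam_w, sensor_signal) > 0,               # cam activation (lift)
        })
    return commands
```

In practice the linear map would be replaced by the trained network described in the claims; the point here is only the shape of the input-to-output mapping.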
[0016] In various aspects, the method may further include determining if the movement of the autonomous vehicle was successful, generating at least one of a positive reward value or a negative reward value based on the determined success, and training the machine learning network using reinforcement training based on the generated reward.
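The disclosure does not fix a particular learning rule, so, as one common choice, the generated reward could drive a tabular Q-learning update. The action set, state encoding, and hyperparameters below are assumptions for illustration only.

```python
# Hypothetical discrete action set for the mobility device.
ACTIONS = ("lift_leg", "lower_leg", "drive_forward", "skid_turn")

def q_update(q_table, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """One temporal-difference step: move Q(s, a) toward reward + discounted future value."""
    best_next = max(q_table.get((next_state, a), 0.0) for a in ACTIONS)
    old = q_table.get((state, action), 0.0)
    q_table[(state, action)] = old + alpha * (reward + gamma * best_next - old)
    return q_table[(state, action)]
```

A positive reward nudges the value of the chosen action upward for that state; a negative reward (a failed movement) nudges it downward, which over repeated trials biases the policy toward successful leg movements.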
[0017] In aspects, the leg movement may include at least one of a quantity of lift, a degree of lift, or a sequence in which each of the plurality of legs are lifted.
[0018] In various aspects, each leg of the plurality of legs further may include a tension wire configured to bend or compress the respective leg, and a pincer configured to selectively engage movement of the corresponding wheel. The wheel motor may be configured to selectively engage the tension wire.
[0019] In various aspects, the sensor may include a bend sensor, a wheel motor current sensor, an inertial measurement unit sensor; a first camera, a GPS, a temperature sensor, and/or a humidity sensor.
[0020] In aspects, the first camera may be an omnidirectional camera configured to provide a 360° view of the surroundings of the chassis.
[0021] In various aspects, the method may include controlling, by the machine learning network, pincer activation based on the sensor.
[0022] In various aspects, input to the machine learning network may further include at least one of camera image data, IMU data, GPS location data, bend sensor data, or wheel motor current value.
[0023] In aspects, a non-transitory computer-readable medium storing instructions which, when executed by a processor, cause the processor to perform a method for autonomous vehicle movement, the method includes: receiving a signal from a sensor of an autonomous vehicle, the autonomous vehicle including a plurality of legs, each leg includes a corresponding wheel, a wheel motor, and a sensor, the legs configured to support and enable movement across a surface; determining if an object is encountered based on the signal; determining, as an output of a machine learning network, a leg movement for each of the plurality of legs, wherein the signal is an input to the machine learning network; and controlling, at least one of wheel motor torque, wheel motor direction, or cam activation to move a leg of the plurality of legs based on the output of the machine learning network.
[0024] Further details and aspects of exemplary aspects of the present disclosure are described in more detail below with reference to the appended figures.
BRIEF DESCRIPTION OF THE DRAWINGS
[0025] The drawings referenced herein form a part of the specification. Features shown in the drawings are meant as illustrative of only some aspects, and not of all aspects unless otherwise explicitly indicated.
[0026] FIG. 1 is a schematic diagram illustrating an exemplary uni-body multimodal mobility device.
[0027] FIG. 2 is a schematic diagram illustrating an embedded motor of a leg as shown in FIG. 1.
[0028] FIG. 3 is a schematic diagram illustrating a pincer as shown in FIG. 1.
[0029] FIG. 4 is a flow chart illustrating application of reinforcement learning to the vehicle structure and functionality shown in FIGS. 1-3.
[0030] FIG. 5 is a flow chart illustrating an example of reinforcement learning as applied to an illustrative embodiment of a land vehicle in accordance with the present disclosure.
[0031] FIG. 6 is a block diagram illustrating an example of a computer system/server in communication with a cloud-based support system, to implement the system and processes with respect to the embodiment of FIGS. 1-5.
[0032] FIG. 7 is a block diagram illustrating a cloud computer environment in accordance with an illustrative embodiment of the present disclosure.
[0033] FIG. 8 is a block diagram illustrating a set of functional abstraction model layers provided by the cloud computing environment shown in the embodiment of FIG. 7.
DETAILED DESCRIPTION
[0034] The present disclosure relates to an autonomous vehicle system, such as a land vehicle operatively coupled to an artificial intelligence (AI) platform to support autonomous vehicle movement. More specifically, the presently disclosed AI platform supports and enables reinforcement learning directed at a risk-reward system utilizing vehicle sensor data to direct autonomous movement of the vehicle.
[0035] It will be readily understood that the components of the aspects, as generally described and illustrated in the Figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the aspects of the system and the method, as presented in the Figures, is not intended to limit the scope of the aspects, as claimed, but is merely representative of the selected aspects.
[0036] Reference throughout this specification to “a select aspect,” “one aspect,” or “an aspect” means that a particular feature, structure, or characteristic described in connection with the aspect is included in at least one aspect. Thus, appearances of the phrases “a select aspect,” “in one aspect,” or “in an aspect” in various places throughout this specification are not necessarily referring to the same aspect.
[0037] The illustrated aspects will be best understood by reference to the drawings, wherein like parts are designated by like numerals throughout. The following detailed description is intended only by way of example, and simply illustrates certain selected aspects of devices, systems, and processes that are consistent with the aspects as claimed herein.
[0038] Referring now to the figures and initially with reference to FIG. 1, a schematic diagram of an autonomous vehicle system includes a uni-body multimodal mobility device 100. As shown and described herein, the uni-body multimodal mobility device 100 is a land vehicle and is configured with a uni-body construction. Uni-body refers to a single molded unit forming both the bodywork and chassis of the vehicle. It is understood in the art that a land vehicle is directed at an apparatus adapted to travel on land or over a surface. In one aspect, and as described in detail below, the vehicle may carry a load. The uni-body multimodal mobility device 100 is provided with a supporting frame, referred to herein as a chassis 102. The chassis 102 is a flexible, uni-body chassis with no distinguishable joints. The chassis 102 can be fabricated, for example, by injection molding, with the chassis 102 having variable density and variable compliancy to achieve optimal compression, rebounding, and dampening characteristics to improve mobility. The chassis 102 combines legged mobility with wheeled mobility into the uni-body design as will now be described in more detail.
[0039] As shown, the uni-body multimodal mobility device 100 is configured with a plurality of legs and corresponding wheels or wheel systems to support and enable movement across a surface. In FIG. 1, two legs are shown, legs 106a and 106b, with each leg having a corresponding wheel, shown herein as wheels 108a and 108b, respectively, operatively connected to the chassis 102. The quantity of legs and wheels shown herein should not be considered limiting. In one aspect, the chassis 102 has one or more additional legs, with each additional leg having a proximally positioned wheel. In addition, although shown herein as a single wheel, in one aspect, the wheels 108a and 108b may be a system having two or more proximally positioned wheels or revolving objects that support movement of the uni-body multimodal mobility device 100 across a surface.
[0040] Each wheel or system of wheels is in communication with a proximally positioned motor embedded into a distal end of each leg. As shown herein, motor 110a is positioned proximal to wheel 108a, and motor 110b is positioned proximal to wheel 108b. The motor and motor components will be discussed in greater detail in conjunction with reference to FIG. 2.
[0041] The chassis 102 is configured to receive a computing device 112 with an embedded processor (not shown). In one illustrative aspect, the computing device 112 is a smart phone or tablet configured with a corresponding operating system. As shown herein, the computing device 112 is configured with a camera, such as high-resolution camera 118. The computing device 112 is placed into or received by the chassis 102 with the camera 118 positioned without obstructions from the chassis 102. In the example shown herein, the area 104 of the chassis 102 that receives the computing device 112 is raised to enable the camera 118 to capture views that extend over the legs 106a and 106b, e.g., the view from the camera is maximized so that any obstruction from the legs is minimized. The camera 118 includes an omni-directional lens 114, shown positioned in communication with the computing device 112. The omni-directional lens 114 is operatively coupled to the computing device 112 to provide a 360° view of the surroundings of the chassis 102. A second camera 136, also referred to herein as a rear camera of the computing device 112, is positioned proximal to a mirror 116, shown herein positioned at 90° relative to the second camera 136. A corresponding reflection off of mirror 116 enhances the view and corresponding images captured by the second camera 136.
[0042] The chassis 102 is shown herein with a battery 120 operatively coupled to the computing device 112. The battery 120 may be a replaceable battery or a rechargeable battery embedded in the chassis 102.
In one aspect, the battery 120 may be comprised of multiple replaceable or rechargeable batteries. Similarly, in one aspect, the batteries 120 may be embedded in each of the legs 106a and 106b. Sensors, in communication with the computing device 112, include a plurality of inertial measurement unit (IMU) sensors 128, a global positioning system (GPS) (not shown), cameras 114 and 136, temperature sensors (not shown) and humidity sensors (not shown).
[0043] The chassis 102 and legs 106a and 106b are a single, composite, compliant part with various other components fused into the uni-body design. As shown, each leg includes a corresponding bend sensor, a tension wire housed in a tension wire conduit, and one or more IMUs and pincers. As shown herein, legA 106a includes bend sensorA 130a, tension wireA 122a, IMUA 128a, pincerA1 132a1, and pincerA2 132a2, and legB 106b includes bend sensorB 130b, tension wireB 122b, IMUB 128b, pincerB1 132b1, and pincerB2 132b2. Electrical connections (not shown) are provided from each leg IMU and from each bend sensor to the computing device 112. In one aspect, the electrical connections are on the same serial bus. The plurality of sensors operatively coupled to the CPU, e.g., IMU(s), accelerometer(s), gyroscope(s), GPS, camera(s), temperature sensors, and humidity sensors, together with the leg sensors, e.g., bend sensors and IMUs, as well as electrical current generated by the motor, are used to measure the degree of bending, compression, or impact. FIG. 1 shows legs 106a and 106b having a semi-circular profile or shape, though in a physical aspect the legs would follow a three-dimensional curve when the tension wire is pulled by the wheel, wherein the leg would flex inward and upward. It is contemplated that more than one tension wire per leg may be used.
[0044] When subject to movement across a planar or relatively planar surface, the wheels 108a and 108b perform in a skid-steer manner commonly found in small mobile robotics. When increased mobility is required, the computing device 112 engages circuits 138a and 138b operatively coupled to motors 110a and 110b, respectively. Non-linear cams 126a and 126b are engaged by motors 110a and 110b, respectively, to engage tension wires 122a and 122b, located on each of legs 106a and 106b. The engagement effectively compresses and lifts a selected leg.
In one aspect, the tension wires 122a and 122b are operatively coupled to the circuits 138a and 138b, respectively, to engage the motors 110a and 110b at the end of legs 106a and 106b, respectively. The tension wires 122a and 122b are respectively attached to anchor points 124a and 124b located at the base of legs 106a and 106b. Non-linear cams 126a and 126b, located at the end of legs 106a and 106b, respectively, pull the tension wires 122a and 122b when rotated to lift legs 106a and 106b in a vertical or near vertical direction so that contact between the leg and the external surface is mitigated or eliminated. In one aspect, the non-linear cams 126a and 126b have a variable radius, which increases mechanical advantage, thus increasing the force being applied to pull the tension wires 122a and 122b during rotation. When a desired degree of bending or compression is achieved in legA 106a, pincers 132a1 and 132a2 clamp down on the tension wire 122a and hold the tension wire in place to maintain the desired bending or compression of legA 106a. LegB 106b is subject to similar bending and compression, including pincers 132b1 and 132b2 clamping down on tension wire 122b and holding the tension wire in place to maintain the desired bend or compression of legB 106b. The pincers are discussed in more detail in FIG. 3. In FIG. 1, two pincers are shown on each leg; however, this is not meant to be limiting, as more or fewer pincers may be employed on each leg. In another aspect, the tension wires 122a and 122b may run down the front and the back of the legs 106a and 106b. In aspects, the tension wires 122a and 122b may be a chain which goes around a gear in wheel 208.
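The cam-and-pincer sequence above can be sketched as a simple feedback loop: rotate the cam until the bend sensor reports the target compression, then clamp the pincers so the leg holds its shape. The function and the sensor/actuator callables below are hypothetical illustrations, not the disclosed control code.

```python
def compress_leg(target_bend, read_bend, rotate_cam, clamp_pincers,
                 step=1.0, max_steps=100):
    """Rotate the non-linear cam until the bend sensor reports the target
    compression, then clamp the pincers to hold the tension wire in place.

    All callables are hypothetical stand-ins for the leg's sensor and
    actuator interfaces; this is a sketch under assumed interfaces.
    """
    for _ in range(max_steps):
        if read_bend() >= target_bend:
            clamp_pincers()      # hold the wire so the cam can disengage
            return True
        rotate_cam(step)         # each step winds in more tension wire
    return False                 # target not reached within the step limit

# Minimal simulated leg: each cam step adds 2 degrees of bend.
state = {"bend": 0.0, "clamped": False}
ok = compress_leg(
    target_bend=30.0,
    read_bend=lambda: state["bend"],
    rotate_cam=lambda step: state.__setitem__("bend", state["bend"] + 2.0 * step),
    clamp_pincers=lambda: state.__setitem__("clamped", True),
)
```

In this sketch the loop plays the role of the circuit 138a/138b engaging the motor, and the clamp call corresponds to the pincers piercing and holding the tension wire.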
[0045] A gripper 134 is shown mounted to the underside of the chassis 102 for payload handling. The gripper is a device which enables holding of an object to be manipulated. For example, in one aspect, the gripper 134 is configured to enable the vehicle to transport a payload from a source location to a destination location. In one aspect, the gripper 134 has one degree of freedom, although this aspect should not be considered limiting. The gripper 134 is operatively coupled to tension wires 122a and 122b so that the chassis 102 can be raised or lowered to facilitate and enable transport of the payload. For example, the chassis 102 may be lowered to enable the gripper 134 to pick-up or deliver the payload, and the chassis may be raised or manipulated into a raised position during transport of the payload.
[0046] In aspects, the uni-body multimodal mobility device 100 may be used to reduce virus spread, which requires rapid and frequent decontamination of large and diverse spaces with varied terrain, obstacles, and surfaces, such as schools, child care facilities, commercial office buildings, restaurants, retail stores, public areas, entertainment venues, gyms, hotels, and travel hubs, in addition to hospitals and other healthcare facilities. The gripper 134 may hold, for example, but not limited to, an ultraviolet (UV) type “C” (UVC) light or even shorter-wavelength “far-UVC” light configured for disinfecting surfaces. For example, a swarm of uni-body multimodal mobility devices 100 equipped with UVC lights can be deployed to disinfect a workspace. In aspects, the UVC (and/or “far-UVC”) light may be located on any portion of the uni-body multimodal mobility device 100.
[0047] With reference to FIG. 2, an enhanced view 200 of the bottom of leg 202 as shown in FIG. 1 is provided. Embedded within leg 202 are the motor 204 and non-linear cam 206 from FIG. 1. Also shown is the wheel 208 from FIG. 1 with tire 210 mounted to the wheel. An electrical current is sent to the motor 204 to turn a shaft 212. Operatively coupled to the shaft 212 is a bevel gear 214 that is positioned by a linear actuator 216 to selectively engage with either the cam 206 or the wheel 208. When a lifting of leg 202 is required, a signal is sent to a meter gear 218 that engages with the linear actuator 216, which moves the bevel gear 214 into contact with the cam 206. In one aspect, the bevel gear 214 is moved by an electro-magnetic process. The motor 204 rotates the shaft 212 and the bevel gear 214, which, when in contact, causes the cam 206 to rotate as well. The cam 206 is operatively connected to the tension wire of FIG. 1, and the rotation of the cam 206 causes the tension wire to tighten, lifting the leg 202.
In one aspect, the cam 206 has a varying radius to allow for progressively higher tension forces as each leg compresses. While the bevel gear 214 is engaged with the cam 206, the wheel 208 is held stationary. Brake springs 220 exert a force on the wheel 208, pushing the wheel 208 against a brake surface 222 that prevents the wheel 208 from spinning. The brake springs 220 are mounted to an outside surface 224 that spins with the shaft 212.
[0048] Once the leg 202 is lifted to a desirable height, the pincers from FIG. 1 clamp down on the tension wire, allowing the bevel gear 214 to disengage from contact with the cam 206. A signal is sent to the meter gear 218 to engage with the linear actuator 216 to move the bevel gear 214 into contact with the wheel 208. The bevel gear 214 pushes into the wheel 208, applying a force that causes the brake springs 220 to compress and the wheel to separate from the brake surface 222. Once the wheel 208 has been separated from the brake surface 222, the wheel 208 can be rotated by the motor 204 and the shaft 212.
[0049] In one aspect the motor 204 and shaft 212 would not utilize the bevel gear 214 but would instead have a simple spline (not shown) that would drive the wheel 208 or would drive both the cam 206 and the wheel 208 at the same time. During climbing, the cam 206 and wheel 208 would both be engaged such that the wheel 208 forward rotation would assist in raising the leg 202 when the wheel 208 is in contact with a climbing surface such as a step. Once the leg 202 is fully lifted, the pincers could clamp down on the tension wire to keep the leg 202 lifted. The shaft 212 spline would move to release the cam 206 tension while the pincers maintained tension on the wire. The motor 204 would drive the wheel 208 forward as the pincers methodically release the tension wire to lower the leg 202 thereby improving climbing performance by both rolling up and pulling up with the legs. The other wheels and legs of the chassis would behave in a similar manner both rolling forward and pushing with the legs.
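The engage/disengage sequence described above behaves like a small two-state machine: in a lift mode the wheel is braked and the cam winds the tension wire; in a drive mode the pincers hold the wire and the wheel is free to spin. The class below is a hypothetical sketch of that mode logic only, with assumed method and attribute names; it does not model the disclosed mechanism itself.

```python
from enum import Enum

class Mode(Enum):
    DRIVE = "bevel gear engaged with wheel"
    LIFT = "bevel gear engaged with cam"

class LegDrivetrain:
    """Hypothetical sketch of the bevel-gear mode switching: the single
    motor drives either the cam (lift) or the wheel (drive), never both."""

    def __init__(self):
        self.mode = Mode.DRIVE
        self.wheel_braked = False
        self.pincers_clamped = False

    def enter_lift(self):
        # Moving the bevel gear to the cam lets the brake springs push
        # the wheel against the brake surface, holding it stationary.
        self.mode = Mode.LIFT
        self.wheel_braked = True

    def enter_drive(self):
        # The pincers must clamp first so the leg keeps its lifted shape
        # when the cam tension is released.
        assert self.pincers_clamped, "clamp pincers before releasing the cam"
        self.mode = Mode.DRIVE
        self.wheel_braked = False  # bevel gear compresses the brake springs

drivetrain = LegDrivetrain()
drivetrain.enter_lift()            # lift the leg: wheel held stationary
drivetrain.pincers_clamped = True  # pincers hold the tension wire
drivetrain.enter_drive()           # wheel free to spin again
```

The assertion in `enter_drive` encodes the ordering constraint from the text: the pincers clamp before the bevel gear disengages from the cam.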
[0050] With reference to FIG. 3, an enhanced view 300 of the pincer 302 as shown in FIG. 1 is provided. As illustrated in FIGS. 1 and 2, the pincer 302 functions to hold the tension wire 304 in place after it has been compressed by the cam, so that when the cam is disengaged to move the wheel, the leg maintains its compressed shape and lifted position. Once the tension wire 304 has been compressed by the cam to an optimal tension, pin 306 pierces a link in the tension wire 304 and recesses into a linear actuator 308. When the leg is required to be lowered, the pin 306 retreats from the linear actuator 308 and from the link in tension wire 304, allowing the tension wire 304 to decompress and the leg to regain its normal, uncompressed shape.
[0051] The vehicle shown and described in connection with FIGS. 1-3 is operatively coupled to an artificial intelligence platform to optimize mobility and functionality of the vehicle. Artificial intelligence (AI) relates to the field of computer science directed at computers and computer behavior as related to humans. AI refers to the intelligence exhibited when machines, based on information, are able to make decisions that maximize the chance of success in a given topic. More specifically, AI is able to learn from a data set to solve problems and provide relevant recommendations.
[0052] Machine learning (ML), which is a subset of AI, utilizes algorithms and corresponding neural networks, to learn from data and create foresights based on this data. More specifically, ML is the application of AI through the creation of neural networks that can demonstrate learning behavior by performing tasks that are not explicitly programmed. In the context of AI, reinforcement learning is a type of dynamic programming that trains algorithms using a system of reward and punishment. A reinforcement learning algorithm learns by interacting with its environment. A reward state is generated and received by the vehicle when performing correctly, and penalties are generated and received by the vehicle when performing incorrectly. In one aspect, a positive value or positive set of values is assigned with respect to desired behavior and action to provide positive reinforcement, and a negative value or negative set of values is assigned with respect to undesired behavior(s) for negative reinforcement.
[0053] With respect to the vehicle shown and described in FIGS. 1-3, the reinforcement learning enables the vehicle to explore mobility and mobility limitations in a defined environment. More specifically, the reinforcement learning supports and enables autonomous movement behaviors, such as rolling, walking, and climbing. Referring to FIG. 4, a flow chart 400 is provided to illustrate application of reinforcement learning to the vehicle structure and functionality shown and described in FIGS. 1-3. A reinforcement learning algorithm acquires all of the environmental state inputs from the vehicle 402. The state inputs include, but are not limited to, the camera image data, IMU data, GPS location data, bend sensor data, and motor current value(s). The inputs are mapped, with some probability, to various outputs 404. Examples of the various outputs include, but are not limited to, motor torque, motor direction, cam activation, pincer activation, etc. Starting with a default mapping or behavior learning signals 406, upon observing a given state 408, the vehicle will conduct a given action 410 and will receive a predefined reward signal 412. If the reward is high 414, the mapping from observation to activation for that observation and action pair is reinforced 416, thereby providing encouragement so that the action taken will be more likely to occur in the future when receiving the same or similar observation. In one aspect, a high reward value is a positive integer. If the reward is low 418, the mapping from observation to activation for that observation and action pair is discouraged 420 such that the action taken will be less likely to occur in the future when receiving the same or similar observation. In one aspect, a low reward value is a negative integer. Additionally, for every observation, there will be some probability of taking a random action in order to discover new actions which may be of higher reward.
For example, if the vehicle were to encounter stairs, the vehicle could receive a reward for lifting a front leg. There may be multiple reinforcement learning algorithms forming a subsumption architecture, where lower-level algorithms learn the most basic vehicle movement behaviors, such as turning the wheels to achieve a higher forward movement reward, while a higher-level reinforcement learning algorithm might output an observation value to a lower-level algorithm to achieve navigation.
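The observation-to-action mapping of FIG. 4 can be sketched as a small tabular scheme: pick the highest-valued action for the observed state (with an occasional random action for exploration), then strengthen or weaken that observation/action pair in proportion to the reward. The state names, action names, and reward model below are hypothetical illustrations, not part of the disclosure.

```python
import random

def choose_action(q_table, state, actions, epsilon=0.1):
    """Pick an action for the observed state: usually the highest-valued
    mapping, but with probability epsilon a random action, so new and
    possibly higher-reward actions are discovered."""
    if random.random() < epsilon:
        return random.choice(actions)
    return max(actions, key=lambda a: q_table.get((state, a), 0.0))

def reinforce(q_table, state, action, reward, lr=0.5):
    """Strengthen (positive reward) or discourage (negative reward) the
    observation-to-action mapping for this pair."""
    key = (state, action)
    old = q_table.get(key, 0.0)
    q_table[key] = old + lr * (reward - old)

# Toy run with hypothetical state/action names and a made-up reward
# model in which lifting (cam activation) is rewarded on stairs.
random.seed(0)
q = {}
actions = ["increase_torque", "activate_cam", "activate_pincer"]
for _ in range(50):
    a = choose_action(q, "stairs_ahead", actions, epsilon=0.2)
    reinforce(q, "stairs_ahead", a, reward=1.0 if a == "activate_cam" else -1.0)
```

After the loop, the mapping for ("stairs_ahead", "activate_cam") carries the highest value, so that action becomes the most likely response to the same observation, mirroring the reinforcement step 416 above.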
[0054] Reinforcement learning as shown and described herein is an autonomous learning process. Referring to FIG. 5, a flow chart 500 is provided to illustrate an example of reinforcement learning as applied to the land vehicle described herein. Reinforcement learning is directed at multi-dimensional movement of the vehicle. At such time as an obstacle is encountered 502, the vehicle sensors gather corresponding sensor data 504, which is employed to assess how the vehicle may climb over the obstacle. In one aspect, and as described herein, the obstacle is a stair, although this is one example and should not be considered limiting. The assessment at step 504 is followed by using the acquired sensor data to calculate a quantity and degree of lift required by the vehicle legs and the sequence in which the legs are lifted 506. The variable XTotal is assigned the quantity of vehicle legs 508. The calculated quantity and sequence are applied to the legs in the form of compression and lifting as dictated by the sequence 510. The counting variable X is initialized 512, and a leg, e.g., legX, in the sequence is subject to compression and lift 514. It is then determined if the movement of legX was successful 516. If the movement was unsuccessful, negative feedback in the form of a negative reward value is generated 518, with a magnitude of the negative reward value being proportional to a degree of failure corresponding to the feedback. The reward value is a factor in the reinforcement learning. Following step 518, the process returns to step 506 for reassessment. However, a positive response to the determination at step 516 is followed by generating a positive reward value 520, with a magnitude of the positive reward value being proportional to a degree of success corresponding to the feedback. The counting variable X is then incremented 522, and it is determined if the sequence of leg compressions and lifts is completed 524.
A negative response at step 524 is followed by a return to step 514 for lift and compression of the next leg, e.g., legX, identified in the sequence, and a positive response at step 524 is followed by the vehicle moving as instructed 526. Accordingly, as shown herein, the reinforcement learning supports dynamic learning as applied to the vehicle and vehicle movement by adjusting vehicle movement actions based on continuous feedback to maximize a reward.
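The FIG. 5 loop can be sketched as follows: attempt each leg lift in the calculated sequence; on failure, generate a negative reward and reassess the sequence; on success, generate a positive reward and advance to the next leg. The `try_lift` and `reassess` callables are hypothetical stand-ins for the vehicle's actuation and planning steps, and the toy reward magnitudes are illustrative only.

```python
def climb_sequence(sequence, try_lift, reassess, max_attempts=10):
    """Sketch of the FIG. 5 loop. `try_lift(leg)` returns (success,
    degree), where `degree` scales the reward magnitude, proportional
    to the degree of success or failure as described above."""
    rewards = []
    attempts = 0
    x = 0                                   # counting variable X (step 512)
    while x < len(sequence) and attempts < max_attempts:
        attempts += 1
        success, degree = try_lift(sequence[x])   # steps 514/516
        if success:
            rewards.append(degree)          # positive reward (step 520)
            x += 1                          # increment X (step 522)
        else:
            rewards.append(-degree)         # negative reward (step 518)
            sequence = reassess()           # return to reassessment (step 506)
            x = 0
    return x == len(sequence), rewards

# Toy run: the second leg fails once, then the reassessed plan succeeds.
calls = {"n": 0}
def try_lift(leg):
    calls["n"] += 1
    if leg == "legB" and calls["n"] == 2:
        return False, 0.5
    return True, 1.0

done, rewards = climb_sequence(["legA", "legB"], try_lift,
                               lambda: ["legA", "legB"])
```

In the toy run the vehicle collects rewards [1.0, -0.5, 1.0, 1.0]: one failure triggers reassessment, after which the full sequence completes and the vehicle may move as instructed.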
[0055] Aspects of the reinforcement learning as applied to the land vehicle may be embodied in a computer system/server in a single location, or in one aspect, may be configured in a cloud-based system sharing computing resources. With reference to FIG. 6, a block diagram 600 is provided illustrating an example of a computer system/server 602, hereinafter referred to as a host 602, in communication with a cloud-based support system, to implement the system and processes described above with respect to FIGS. 1-5. Host 602 is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with host 602 include, but are not limited to, personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputer systems, mainframe computer systems, and file systems (e.g., distributed storage environments and distributed cloud computing environments) that include any of the above systems, devices, and their equivalents.
[0056] The host 602 may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, and so on that perform particular tasks or implement particular abstract data types. The host 602 may be practiced in a distributed cloud computing environment 610 where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computer system storage media including memory storage devices.
[0057] As shown in FIG. 6, the host 602 is shown in the form of a general-purpose computing device. The components of the host 602 may include, but are not limited to, one or more processors or processing units 604, a system memory 606, and a bus 608 that couples various system components including system memory 606 to processing unit 604. The bus 608 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus. Host 602 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by the host 602, and it includes both volatile and non-volatile media, removable and non-removable media.
[0058] The memory 606 can include computer system readable media in the form of volatile memory, such as random-access memory (RAM) 630 and/or cache memory 632. By way of example only, storage system 634 can be provided for reading from and writing to a non-removable, non-volatile magnetic media (not shown and typically called a “hard drive”). Although not shown, a magnetic disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a “floppy disk”), and an optical disk drive for reading from or writing to a removable, non-volatile optical disk such as a CD-ROM, DVD-ROM or other optical media can be provided. In such instances, each can be connected to bus 608 by one or more data media interfaces.
[0059] Program/utility 640, having a set (at least one) of program modules 642, may be stored in the memory 606 by way of example, and not limitation, as well as an operating system, one or more application programs, other program modules, and program data. Each of the operating systems, one or more application programs, other program modules, and program data or some combination thereof, may include an implementation of a networking environment. The program modules 642 generally carry out the functions and/or methodologies of aspects to store and analyze data with respect to leg movement and compression, and corresponding reward value computations.
[0060] The host 602 may also communicate with one or more external devices 614, such as a keyboard, a pointing device, etc.; a display 624; one or more devices that enable a user to interact with host 602; and/or any devices (e.g., network card, modem, etc.) that enable host 602 to communicate with one or more other computing devices. Such communication can occur via Input/Output (I/O) interface(s) 622. Still yet, host 602 can communicate with one or more networks such as a local area network (LAN), a general wide area network (WAN), and/or a public network (e.g., the Internet) via network adapter 620. As depicted, network adapter 620 communicates with the other components of host 602 via bus 608. In one aspect, a plurality of nodes of a distributed file system (not shown) is in communication with the host 602 via the I/O interface 622 or via the network adapter 620. It should be understood that although not shown, other hardware and/or software components could be used in conjunction with host 602. Examples include, but are not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data archival storage systems, etc.
[0061] In this document, the terms “computer program medium,” “computer usable medium,” and “computer readable medium” are used to generally refer to media such as main memory 606, including RAM 630, cache 632, and storage system 634, such as a removable storage drive and a hard disk installed in a hard disk drive.
[0062] Computer programs (also called computer control logic) are stored in memory 606. Computer programs may also be received via a communication interface, such as network adapter 620. Such computer programs, when run, enable the computer system to perform the features of the present aspects as discussed herein. In particular, the computer programs, when run, enable the processing unit 604 to perform the features of the computer system. Accordingly, such computer programs represent controllers of the computer system.
[0063] In one aspect, the host 602 is a node of a cloud computing environment 610. As is known in the art, cloud computing is a model of service delivery for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, network bandwidth, servers, processing, memory, storage, applications, virtual machines, and services) that can be rapidly provisioned and released with minimal management effort or interaction with a provider of the service. This cloud model may include at least five characteristics, at least three service models, and at least four deployment models. Examples of such characteristics are as follows:
[0064] On-demand self-service: a cloud consumer can unilaterally provision computing capabilities, such as server time and network storage, as needed automatically without requiring human interaction with the service’s provider.
[0065] Broad network access: capabilities are available over a network and accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, laptops, and PDAs).
[0066] Resource pooling: the provider’s computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to demand. There is a sense of location independence in that the consumer generally has no control or knowledge over the exact location of the provided resources but may be able to specify location at a higher layer of abstraction (e.g., country, state, or datacenter).
[0067] Rapid elasticity: capabilities can be rapidly and elastically provisioned, in some cases automatically, to quickly scale out and rapidly released to quickly scale in. To the consumer, the capabilities available for provisioning often appear to be unlimited and can be purchased in any quantity at any time.
[0068] Measured service: cloud systems automatically control and optimize resource use by leveraging a metering capability at some layer of abstraction appropriate to the type of service (e.g., storage, processing, bandwidth, and active user accounts). Resource usage can be monitored, controlled, and reported providing transparency for both the provider and consumer of the utilized service.
[0069] Service Models are as follows:
[0070] Software as a Service (SaaS): the capability provided to the consumer is to use the provider’s applications running on a cloud infrastructure. The applications are accessible from various client devices through a thin client interface such as a web browser (e.g., web-based e-mail). The consumer does not manage or control the underlying cloud infrastructure including network, servers, operating systems, storage, or even individual application capabilities, with the possible exception of limited user-specific application configuration settings.
[0071] Platform as a Service (PaaS): the capability provided to the consumer is to deploy onto the cloud infrastructure consumer-created or acquired applications created using programming languages and tools supported by the provider. The consumer does not manage or control the underlying cloud infrastructure including networks, servers, operating systems, or storage, but has control over the deployed applications and possibly application hosting environment configurations.
[0072] Infrastructure as a Service (IaaS): the capability provided to the consumer is to provision processing, storage, networks, and other fundamental computing resources where the consumer is able to deploy and run arbitrary software, which can include operating systems and applications. The consumer does not manage or control the underlying cloud infrastructure but has control over operating systems, storage, deployed applications, and possibly limited control of select networking components (e.g., host firewalls).
[0073] Deployment Models are as follows:
[0074] Private cloud: the cloud infrastructure is operated solely for an organization. It may be managed by the organization or a third party and may exist on-premises or off-premises.
[0075] Community cloud: the cloud infrastructure is shared by several organizations and supports a specific community that has shared concerns (e.g., mission, security requirements, policy, and compliance considerations). It may be managed by the organizations or a third party and may exist on-premises or off-premises.
[0076] Public cloud: the cloud infrastructure is made available to the general public or a large industry group and is owned by an organization selling cloud services.
[0077] Hybrid cloud: the cloud infrastructure is a composition of two or more clouds (private, community, or public) that remain unique entities but are bound together by standardized or proprietary technology that enables data and application portability (e.g., cloud bursting for load balancing between clouds).
[0078] A cloud computing environment is service oriented with a focus on statelessness, low coupling, modularity, and semantic interoperability. At the heart of cloud computing is an infrastructure comprising a network of interconnected nodes.
[0079] Referring now to FIG. 7, an illustrative cloud computing network 700 is provided. As shown, cloud computing network 700 includes a cloud computing environment 750 having one or more cloud computing nodes 710 with which local computing devices used by cloud consumers may communicate. Examples of these local computing devices include, but are not limited to, personal digital assistant (PDA) or cellular telephone 754A, desktop computer 754B, laptop computer 754C, and/or automobile computer system 754N. Individual nodes within nodes 710 may further communicate with one another. They may be grouped (not shown) physically or virtually, in one or more networks, such as Private, Community, Public, or Hybrid clouds as described hereinabove, or a combination thereof. This allows cloud computing environment 750 to offer infrastructure, platforms and/or software as services for which a cloud consumer does not need to maintain resources on a local computing device. It is understood that the types of computing devices 754A-N shown in FIG. 7 are intended to be illustrative only and that the cloud computing environment 750 can communicate with any type of computerized device over any type of network and/or network addressable connection (e.g., using a web browser).
[0080] Referring now to FIG. 8, a set of functional abstraction layers 800 provided by the cloud computing network of FIG. 7 is shown. It should be understood in advance that the components, layers, and functions shown in FIG. 8 are intended to be illustrative only, and the aspects are not limited thereto. As depicted, the following layers and corresponding functions are provided: hardware and software layer 810, virtualization layer 820, management layer 830, and workload layer 840. The hardware and software layer 810 includes hardware and software components. Examples of hardware components include mainframes; RISC (Reduced Instruction Set Computer) architecture-based servers; and networks and networking components. Examples of software components include network application server software and database software.
[0081] Virtualization layer 820 provides an abstraction layer from which the following examples of virtual entities may be provided: virtual servers; virtual storage; virtual networks, including virtual private networks; virtual applications and operating systems; and virtual clients.

[0082] In one example, management layer 830 may provide the following functions: resource provisioning, metering and pricing, user portal, security, service level management, and SLA planning and fulfillment. Resource provisioning provides dynamic procurement of computing resources and other resources that are utilized to perform tasks within the cloud computing environment. Metering and pricing provides cost tracking as resources are utilized within the cloud computing environment, and billing or invoicing for consumption of these resources. In one example, these resources may comprise application software licenses. Security provides identity verification for cloud consumers and tasks, as well as protection for data and other resources. User portal provides access to the cloud computing environment for consumers and system administrators. Service level management provides cloud computing resource allocation and management such that required service levels are met. Service Level Agreement (SLA) planning and fulfillment provides pre-arrangement for, and procurement of, cloud computing resources for which a future requirement is anticipated in accordance with an SLA.
[0083] Workloads layer 840 provides examples of functionality for which the cloud computing environment may be utilized. Examples of workloads and functions which may be provided from this layer include, but are not limited to: mapping and navigation; software development and lifecycle management; virtual classroom education delivery; data analytics processing; transaction processing; and reinforcement learning application to vehicle movement.

[0084] The present aspects may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present aspects.
[0085] The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
[0086] A computer readable signal medium includes a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium is any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
[0087] Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
[0088] Computer readable program instructions for carrying out operations of the present aspects may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some aspects, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present aspects.
[0089] As will be appreciated by one skilled in the art, the aspects may be embodied as a system, method, or computer program product. Accordingly, the aspects may take the form of an entirely hardware aspect, an entirely software aspect (including firmware, resident software, micro-code, etc.), or an aspect combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” or “system.” Furthermore, the aspects described herein may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
[0090] The flow charts and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various aspects. In this regard, each block in the flow charts or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flow chart illustration(s), and combinations of blocks in the block diagrams and/or flow chart illustration(s), can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
[0091] The terminology used herein is for the purpose of describing particular aspects only and is not intended to be limiting. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
[0092] Indeed, executable code could be a single instruction, or many instructions, and may even be distributed over several different code segments, among different applications, and across several memory devices. Similarly, operational data may be identified and illustrated herein within the tool, and may be embodied in any suitable form and organized within any suitable type of data structure. The operational data may be collected as a single dataset, or may be distributed over different locations including over different storage devices, and may exist, at least partially, as electronic signals on a system or network.
[0093] Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more aspects. In the following description, numerous specific details are provided, such as examples of agents, to provide a thorough understanding of the disclosed aspects. One skilled in the relevant art will recognize, however, that the aspects can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the aspects.
[0094] The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present aspects has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the aspects in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the aspects. The aspects were chosen and described in order to best explain their principles and practical application, and to enable others of ordinary skill in the art to understand the aspects with various modifications as are suited to the particular use contemplated.

[0095] It will be appreciated that, although specific aspects have been described herein for purposes of illustration, various modifications may be made without departing from the spirit and scope of the aspects. Accordingly, the scope of protection of these aspects is limited only by the following claims and their equivalents.
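The reinforcement learning workflow described in the disclosure, in which the vehicle's movement is evaluated for success, a positive or negative reward is generated, and the machine learning network is trained from that reward, can be illustrated with a minimal sketch. The application does not disclose a concrete learning algorithm, so the class, the tabular value update, and all parameters below are hypothetical assumptions chosen only to make the reward loop concrete.

```python
import random

class LegMovementPolicy:
    """Toy tabular policy mapping a discretized sensor state to a leg action.

    Hypothetical sketch: the disclosed system does not specify this
    representation; it is used here only to illustrate reward-driven training.
    """

    def __init__(self, n_states, n_actions, lr=0.1, epsilon=0.2):
        self.q = [[0.0] * n_actions for _ in range(n_states)]
        self.lr = lr            # learning rate for the value update
        self.epsilon = epsilon  # exploration probability
        self.n_actions = n_actions

    def select_action(self, state):
        # Epsilon-greedy choice among candidate leg movements.
        if random.random() < self.epsilon:
            return random.randrange(self.n_actions)
        values = self.q[state]
        return values.index(max(values))

    def update(self, state, action, reward):
        # Nudge the value estimate toward the observed reward:
        # +1.0 for a successful movement, -1.0 for an unsuccessful one.
        self.q[state][action] += self.lr * (reward - self.q[state][action])


def reward_for(movement_successful):
    # Generate a positive or negative reward value based on determined success.
    return 1.0 if movement_successful else -1.0
```

Repeatedly selecting an action, observing whether the resulting movement succeeded, and applying `update` with `reward_for(...)` drives the policy toward movements that succeed, which is the loop recited in claims 2 and 13.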

Claims

WHAT IS CLAIMED IS:
1. An autonomous vehicle system, comprising: an autonomous vehicle, including: a flexible uni-body chassis with no distinguishable joints; a plurality of legs, each leg including a corresponding wheel, a wheel motor, and a sensor, the legs configured to support and enable movement across a surface; a processor; and a memory, having instructions stored thereon, which when executed cause the system to: receive a signal from the sensor; determine if an object is encountered based on the signal; determine, as an output of a machine learning network, a leg movement for each of the plurality of legs, wherein the signal is an input to the machine learning network; and control at least one of wheel motor torque, wheel motor direction, or cam activation to move a leg of the plurality of legs based on the output of the machine learning network.
2. The autonomous vehicle system of claim 1, wherein the instructions when executed by the processor cause the system to: determine if the movement of the autonomous vehicle was successful; generate at least one of a positive reward value or a negative reward value based on the determined success; and train the machine learning network using reinforcement training based on the generated reward.
3. The autonomous vehicle system of claim 1, wherein the leg movement includes at least one of a quantity of lift, a degree of lift, or a sequence in which each of the plurality of legs are lifted.
4. The autonomous vehicle system of claim 1, wherein each leg of the plurality of legs further includes a tension wire configured to bend or compress the respective leg, and a pincer configured to selectively engage movement of the corresponding wheel, and wherein the wheel motor is configured to selectively engage the tension wire.
5. The autonomous vehicle system of claim 1, wherein the sensor includes at least one of a bend sensor, a wheel motor current sensor, an inertial measurement unit sensor, a first camera, a GPS, a temperature sensor, or a humidity sensor.
6. The autonomous vehicle system of claim 5, wherein the first camera is an omnidirectional camera configured to provide a 360° view of the surroundings of the chassis.
7. The autonomous vehicle system of claim 1, wherein the instructions when executed by the processor cause the system to: control, by the machine learning network, pincer activation based on the sensor.
8. The autonomous vehicle system of claim 1, wherein input to the machine learning network further includes at least one of camera image data, IMU data, GPS location data, bend sensor data, or wheel motor current value.
9. The autonomous vehicle system of claim 1, wherein the sensor further includes a second camera positioned proximal to a mirror, wherein the mirror is positioned at 90° relative to the second camera.
10. The autonomous vehicle system of claim 1, wherein the autonomous vehicle further includes a gripper configured for holding an object to be manipulated.
11. The autonomous vehicle system of claim 1, further comprising a UVC light configured to disinfect surfaces.
12. A computer-controlled method for autonomous vehicle movement, comprising: receiving a signal from a sensor of an autonomous vehicle, the autonomous vehicle including a plurality of legs, each leg including a corresponding wheel, a wheel motor, and a sensor, the legs configured to support and enable movement across a surface; determining if an object is encountered based on the signal; determining, as an output of a machine learning network, a leg movement for each of the plurality of legs, wherein the signal is an input to the machine learning network; and controlling at least one of wheel motor torque, wheel motor direction, or cam activation to move a leg of the plurality of legs based on the output of the machine learning network.
13. The computer-controlled method of claim 12, further comprising: determining if the movement of the autonomous vehicle was successful; generating at least one of a positive reward value or a negative reward value based on the determined success; and training the machine learning network using reinforcement training based on the generated reward.
14. The computer-controlled method of claim 12, wherein the leg movement includes at least one of a quantity of lift, a degree of lift, or a sequence in which each of the plurality of legs are lifted.
15. The computer-controlled method of claim 12, wherein each leg of the plurality of legs further includes a tension wire configured to bend or compress the respective leg, and a pincer configured to selectively engage movement of the corresponding wheel, and wherein the wheel motor is configured to selectively engage the tension wire.
16. The computer-controlled method of claim 12, wherein the sensor includes at least one of a bend sensor, a wheel motor current sensor, an inertial measurement unit sensor, a first camera, a GPS, a temperature sensor, or a humidity sensor.
17. The computer-controlled method of claim 16, wherein the first camera is an omnidirectional camera configured to provide a 360° view of the surroundings of the chassis.
18. The computer-controlled method of claim 12, further comprising: controlling, by the machine learning network, pincer activation based on the sensor.
19. The computer-controlled method of claim 12, wherein input to the machine learning network further includes at least one of camera image data, IMU data, GPS location data, bend sensor data, or wheel motor current value.
20. A non-transitory computer-readable medium storing instructions which, when executed by a processor, cause the processor to perform a method for autonomous vehicle movement, the method comprising: receiving a signal from a sensor of an autonomous vehicle, the autonomous vehicle including a plurality of legs, each leg including a corresponding wheel, a wheel motor, and a sensor, the legs configured to support and enable movement across a surface; determining if an object is encountered based on the signal; determining, as an output of a machine learning network, a leg movement for each of the plurality of legs, wherein the signal is an input to the machine learning network; and controlling at least one of wheel motor torque, wheel motor direction, or cam activation to move a leg of the plurality of legs based on the output of the machine learning network.
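The control flow recited in claims 1, 12, and 20 — a sensor signal fed to a machine learning network whose output is mapped, per leg, to wheel motor torque, wheel motor direction, and cam activation — can be sketched as follows. The claims do not prescribe any concrete data shapes or interfaces, so every name, type, and threshold below is a hypothetical assumption for illustration only.

```python
from dataclasses import dataclass

# Hypothetical encoding of the claimed control flow; none of these names
# appear in the application itself.

@dataclass
class LegCommand:
    wheel_torque: float   # magnitude of torque for the leg's wheel motor
    wheel_direction: int  # +1 forward, -1 reverse
    cam_active: bool      # whether to activate the cam that moves the leg

def object_encountered(signal, threshold=0.5):
    # "determine if an object is encountered based on the signal"
    return max(signal) > threshold

def control_step(signal, network, motors):
    # One pass of the claimed loop: the sensor signal is the input to the
    # machine learning network, whose output is one movement per leg.
    commands = network(signal)
    for leg_id, cmd in enumerate(commands):
        # Control wheel motor torque, direction, and cam activation.
        motors.apply(leg_id, cmd.wheel_torque * cmd.wheel_direction,
                     cmd.cam_active)
    return object_encountered(signal), commands
```

Here `network` stands in for the trained machine learning network and `motors` for the motor driver interface; both are placeholders supplied by the caller.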
PCT/US2020/056037 2019-10-18 2020-10-16 Uni-body multimodal mobility chassis and autonomous vehicle system WO2021076929A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962923116P 2019-10-18 2019-10-18
US62/923,116 2019-10-18

Publications (1)

Publication Number Publication Date
WO2021076929A1 true WO2021076929A1 (en) 2021-04-22

Family

ID=73198511

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/056037 WO2021076929A1 (en) 2019-10-18 2020-10-16 Uni-body multimodal mobility chassis and autonomous vehicle system

Country Status (1)

Country Link
WO (1) WO2021076929A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113305854A (en) * 2021-04-27 2021-08-27 苏州邦弘智能科技有限公司 Freight transport robot

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170190331A1 (en) * 2015-12-31 2017-07-06 Sony Corporation Method and system for adaptive detection and application of horn for an autonomous vehicle
US9914492B1 (en) * 2017-04-13 2018-03-13 Toyota Research Institute, Inc. Low-profile vehicle
JP2019008796A (en) * 2017-06-23 2019-01-17 ウーバー テクノロジーズ,インコーポレイテッド Collision avoidance system for autonomous vehicle


Similar Documents

Publication Publication Date Title
CN109131340A (en) Active vehicle adjusting performance based on driving behavior
US9709990B2 (en) Autonomous navigation through obstacles
US20190385061A1 (en) Closed loop model-based action learning with model-free inverse reinforcement learning
US11341433B2 (en) Routing and navigation system
US11269356B2 (en) Edge computing for clusters of vehicles
JP2024519443A (en) Method and system for action recognition using bidirectional space-time transformer
US10114460B2 (en) Virtual reality sensory construct
US11392139B2 (en) Method, apparatus and control system for controlling mobile robot
US11436441B2 (en) Systems and methods for training a machine learned model for agent navigation
JP2020507857A (en) Agent navigation using visual input
WO2021076929A1 (en) Uni-body multimodal mobility chassis and autonomous vehicle system
CN110827341A (en) Picture depth estimation method and device and storage medium
CN115533905A (en) Virtual and real transfer learning method and device of robot operation technology and storage medium
Aissam et al. Cloud robotic: Opening a new road to the industry 4.0
US11182674B2 (en) Model training by discarding relatively less relevant parameters
WO2022058820A1 (en) Wearable device enablement for visually impaired user
JP7083402B2 (en) Methods, devices and control systems for controlling mobile robots
US20200065666A1 (en) Safe and fast exploration for reinforcement learning using constrained action manifolds
US11995430B2 (en) Systems and methods for management of unmanned aerial vehicles
CN115565374A (en) Logistics vehicle driving optimization method and device, electronic equipment and readable storage medium
JP2023014018A (en) Computer-implemented method, computer program, and computer system (optimizing deployment of machine learning workloads)
Garcia et al. Development of a mobile robot prototype based on an embedded system for mapping generation and path planning-image processing & communication-ipc 2018
US20200334530A1 (en) Differentiable neuromodulated plasticity for reinforcement learning and supervised learning tasks
Gautason et al. Mars Rover analog2: a MESR analog with environmental mapping and simulation
EP4290477A1 (en) Free space estimator for autonomous movement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20804097; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

122 Ep: pct application non-entry in european phase (Ref document number: 20804097; Country of ref document: EP; Kind code of ref document: A1)