WO2020212808A1 - Dynamically controlling electrically powered scooters based on sensed infrastructure data - Google Patents

Dynamically controlling electrically powered scooters based on sensed infrastructure data

Info

Publication number
WO2020212808A1
WO2020212808A1 PCT/IB2020/053372
Authority
WO
WIPO (PCT)
Prior art keywords
electrically powered
scooter
powered scooter
computing device
processors
Prior art date
Application number
PCT/IB2020/053372
Other languages
French (fr)
Inventor
Daniel Ting-Yuan Chen
Christopher D. KARLEN
Original Assignee
3M Innovative Properties Company
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 3M Innovative Properties Company filed Critical 3M Innovative Properties Company
Publication of WO2020212808A1 publication Critical patent/WO2020212808A1/en


Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/09623Systems involving the acquisition of information from passive traffic signs by means mounted on the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62JCYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
    • B62J27/00Safety equipment
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62JCYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
    • B62J45/00Electrical equipment arrangements specially adapted for use as accessories on cycles, not otherwise provided for
    • B62J45/20Cycle computers as cycle accessories
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62JCYCLE SADDLES OR SEATS; AUXILIARY DEVICES OR ACCESSORIES SPECIALLY ADAPTED TO CYCLES AND NOT OTHERWISE PROVIDED FOR, e.g. ARTICLE CARRIERS OR CYCLE PROTECTORS
    • B62J45/00Electrical equipment arrangements specially adapted for use as accessories on cycles, not otherwise provided for
    • B62J45/40Sensor arrangements; Mounting thereof
    • B62J45/41Sensor arrangements; Mounting thereof characterised by the type of sensor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62KCYCLES; CYCLE FRAMES; CYCLE STEERING DEVICES; RIDER-OPERATED TERMINAL CONTROLS SPECIALLY ADAPTED FOR CYCLES; CYCLE AXLE SUSPENSIONS; CYCLE SIDE-CARS, FORECARS, OR THE LIKE
    • B62K3/00Bicycles
    • B62K3/002Bicycles without a seat, i.e. the rider operating the vehicle in a standing position, e.g. non-motorized scooters; non-motorized scooters with skis or runners
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/20Monitoring the location of vehicles belonging to a group, e.g. fleet of vehicles, countable or determined number of vehicles
    • G08G1/207Monitoring the location of vehicles belonging to a group, e.g. fleet of vehicles, countable or determined number of vehicles with respect to certain areas, e.g. forbidden or allowed areas with possible alerting when inside or outside boundaries
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62KCYCLES; CYCLE FRAMES; CYCLE STEERING DEVICES; RIDER-OPERATED TERMINAL CONTROLS SPECIALLY ADAPTED FOR CYCLES; CYCLE AXLE SUSPENSIONS; CYCLE SIDE-CARS, FORECARS, OR THE LIKE
    • B62K2204/00Adaptations for driving cycles by electric motor

Definitions

  • the present application relates generally to electrically powered scooters and roadway infrastructure.
  • Electric scooters are often used to transport people over relatively short distances.
  • a user of an electric scooter typically rides the scooter on a roadway, street, pathway or a sidewalk, and frequently may use the scooter in urban or campus settings as a convenient mode of transportation.
  • the roadway / street used by the scooter may be occupied by vehicles travelling at relatively high speeds compared to the scooter.
  • sidewalks are often occupied by pedestrians travelling at relatively low speeds compared to the scooter. Navigating roadways, streets, paths and/or sidewalks may pose a risk to the safety of the user of the electric scooter, occupants of a vehicle, pedestrians, or any other person, pet, or property in proximity to the scooter.
  • an electrically powered scooter (referred to generally herein simply as a scooter) includes a computing device to control operation of the scooter and one or more inputs to sense (i.e., capture or receive) data from infrastructure within an environment in which the scooter is or will be operated.
  • the computing device may, based on sensed infrastructure data, automatically adjust a speed at which the scooter is permitted to travel or cause the scooter to slow down or stop, for example, based on the type of pathway (e.g., a vehicle pathway, a bicycle pathway, or a pedestrian pathway).
  • the computing device may set the maximum speed for the scooter to one speed upon determining, from the sensed infrastructure data, that the scooter is currently located or will be operating on a vehicle pathway (e.g., 15 miles per hour on a road) and may automatically set the maximum speed to another, lower speed upon determining from the infrastructure data that the scooter is located on or will be used on a pedestrian pathway (e.g., a sidewalk).
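The speed-capping behavior described above can be sketched as a simple lookup. This is an illustrative assumption of how a computing device might implement it; the pathway labels, the non-road limits, and the function names are invented here (only the 15 mph road figure appears in the text above):

```python
# Hypothetical sketch of location-based speed limiting.
# Only the 15 mph vehicle-pathway figure comes from the description;
# the other limits and names are illustrative assumptions.

PATHWAY_MAX_MPH = {
    "vehicle": 15.0,     # e.g., 15 mph on a road
    "bicycle": 12.0,
    "pedestrian": 6.0,   # lower cap on sidewalks
}

def max_speed_for(pathway_type: str) -> float:
    """Return the speed cap for a pathway type, falling back to the
    most conservative limit when the type is unknown."""
    return PATHWAY_MAX_MPH.get(pathway_type, min(PATHWAY_MAX_MPH.values()))

def clamp_speed(requested_mph: float, pathway_type: str) -> float:
    """Clamp the rider's requested speed to the pathway's cap."""
    return min(requested_mph, max_speed_for(pathway_type))
```

Defaulting unknown locations to the lowest cap is one conservative design choice; a real controller could instead stop the scooter entirely, as discussed below for prohibited locations.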
  • the computing device may control the scooter based on determining, according to the infrastructure data, whether the scooter is located within or proximate to a physical region in which the scooter is permitted to be physically located.
  • the computing device may output a command to apply a brake or gracefully shut off power to an electric motor when the scooter is located in or proximate to a prohibited location (e.g., proximate a building entrance).
  • the computing device may control operation of the scooter based on infrastructure data captured from or otherwise received (e.g., from a local or cloud-based computing system), where that infrastructure data is specific to the physical location in which the scooter is currently located or planned for operation.
  • the computing device may improve the safety of the user, nearby pedestrians, and/or occupants of nearby motor vehicles.
  • the disclosure describes a system comprising: an infrastructure article; a micro-mobility device comprising a sensor configured to generate infrastructure data indicative of the infrastructure article; and a computing device comprising a memory and one or more computer processors, wherein the memory comprises instructions that when executed by the one or more computer processors cause the one or more computer processors to: receive, from the sensor, the infrastructure data; determine, based at least in part on the infrastructure data, a type of a location in which the micro-mobility device is physically located; and perform at least one operation based at least in part on the type of the location in which the electrically powered scooter is physically located.
  • an electrically powered scooter comprises a scooter chassis having a rear wheel mount at one end and a front wheel mount at the other end with a chassis support member extending therebetween; a chassis-supported front wheel mounted to the front wheel mount for turning steering movement with respect to the front wheel mount and a chassis-supported rear wheel; a chassis-supported motor physically coupled to the scooter chassis and configured by a motor controller to drive at least one of the chassis-supported front wheel or chassis-supported rear wheel for powered movement over a ground surface; a sensor configured to generate infrastructure data indicative of infrastructure proximate to the electrically powered scooter; a computing device comprising a memory and one or more computer processors, wherein the memory comprises instructions that when executed by the one or more computer processors cause the one or more computer processors to: receive, from the sensor, the infrastructure data; determine, based at least in part on the infrastructure data, a type of a location in which the electrically powered scooter is physically located; and perform at least one operation based at least in part on the type of the location in which the electrically powered scooter is physically located.
  • a computing device comprises memory and one or more processors connected to the memory.
  • the memory includes instructions that, when executed by the one or more processors, cause the computing device to receive, from a sensor, infrastructure data indicative of infrastructure proximate to an electrically powered scooter; determine, based at least in part on the infrastructure data, a type of a location in which the electrically powered scooter is physically located; and perform at least one operation based at least in part on the type of the location in which the electrically powered scooter is physically located.
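The receive/determine/perform sequence recited above might be sketched as the following control loop. Everything here is a hypothetical stand-in (the toy classifier, the action names, and the `InfrastructureData` container are not specified in the application):

```python
# Minimal sketch of the claimed receive -> determine -> perform loop.
# The classifier and action names are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class InfrastructureData:
    """Illustrative sensed payload; real data would be images, LiDAR
    returns, or signature data from infrastructure articles."""
    detected_markings: list

def determine_location_type(data: InfrastructureData) -> str:
    # Toy classifier: two or more lane markings -> bicycle pathway.
    if len(data.detected_markings) >= 2:
        return "bicycle"
    return "unknown"

def perform_operation(location_type: str) -> str:
    # Map the determined type to a control action (names are invented).
    actions = {"bicycle": "set_max_speed_12mph",
               "pedestrian": "set_max_speed_6mph"}
    return actions.get(location_type, "maintain_current_limits")

def control_step(data: InfrastructureData) -> str:
    """One pass of the pipeline: sense, classify, act."""
    return perform_operation(determine_location_type(data))
```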
  • FIG. 1 is a conceptual diagram illustrating an example system for dynamically controlling an electrically powered scooter, in accordance with techniques of this disclosure.
  • FIG. 2 is a block diagram illustrating an example system for dynamically controlling an electrically powered scooter, in accordance with techniques of this disclosure.
  • FIG. 3 is a block diagram illustrating an example computing device, in accordance with techniques of this disclosure.
  • FIG. 4 is a schematic diagram illustrating an example electrically powered scooter, in accordance with techniques of this disclosure.
  • FIG. 5 is a flow diagram illustrating example operation of a computing device for dynamically controlling an electrically powered scooter, in accordance with techniques of this disclosure.
  • FIG. 1 is a conceptual diagram illustrating an example physical environment having transportation system that includes one or more electrically powered scooters, in accordance with techniques of this disclosure.
  • transportation system 100 includes a variety of different infrastructure elements (generally referred to as "infrastructure").
  • infrastructure may include dedicated transportation pathways 106A-106D (collectively, transportation pathways 106) as well as infrastructure articles 107A-107E (collectively, infrastructure articles 107) positioned and oriented within the environment.
  • system 100 includes one or more micro-mobility devices
  • micro-mobility devices include electrically-powered food delivery devices, electrically powered hoverboards or skateboards, electrically powered scooters 110A-110C (collectively, electrically powered scooters 110), or other small-profile devices that may use or travel upon a roadway or sidewalk.
  • Electrically powered scooters 110 (also referred to herein simply as scooters 110) may operate on transportation pathways 106.
  • each of electrically powered scooters 110 includes a chassis, a front wheel, a back wheel, an electric motor, a steering assembly, and a battery 119.
  • the chassis includes a rear-wheel mount at one end of the chassis, a front-wheel mount at another end of the chassis that is opposite the rear-wheel mount, and a chassis support extending horizontally between the rear-wheel mount and the front-wheel mount.
  • the front and rear wheels are mounted to the front and rear wheel mounts of the chassis, respectively.
  • the front wheel mount is coupled to a steering assembly.
  • the steering assembly includes handlebars such that turning the handlebars causes the front wheel to turn.
  • the electric motor is physically coupled to the scooter chassis and is configured by a motor controller to drive at least one of the chassis-supported front wheel or chassis-supported rear wheel for powered movement over a ground surface.
  • Examples of transportation pathways 106 include a vehicle pathway (e.g., pathway 106A, 106D), a bicycle pathway (e.g., pathway 106B), or a pedestrian pathway (e.g., pathway 106C), among others.
  • transportation pathways 106 may be sidewalks, public spaces, or other surfaces not specifically dedicated to certain types of vehicles or traffic.
  • Vehicle pathways (e.g., 106A) may be used by vehicles 104A-104C (collectively, vehicles 104) to transport people or goods.
  • Examples of vehicles 104 include automobiles (e.g., 104B, 104C) such as cars, trucks, passenger vans; buses; motorcycles; recreational vehicles (RVs); or lorries (e.g., 104A), etc.
  • vehicle pathways can also include alleys, streets, and highways (or a vehicle specific portion thereof, such as a vehicle driving lane), among others.
  • Examples of bicycle pathways (e.g., 106B) include a street or a portion of a street designated for bicycles, or a bicycle trail, among others.
  • A pedestrian pathway (e.g., 106C) is primarily used by pedestrians 108.
  • Examples of pedestrian pathways include a pedestrian sidewalk or a jogging path.
  • one of transportation pathways 106 may include two or more different types of pathways.
  • transportation pathway 106A may include a vehicle driving lane of a vehicle pathway and a bicycle pathway adjacent to the driving lane.
  • Transportation pathways 106 may include portions not limited to the respective pathways themselves.
  • transportation pathway 106 may include the road shoulder, physical structures near the pathway such as toll booths, railroad crossing equipment, traffic lights, guardrails, and generally encompassing any other properties or characteristics of the pathway or objects/structures in proximity to the pathway.
  • Examples of infrastructure articles include a pavement marking (e.g., infrastructure article 107A), a roadway sign (e.g., infrastructure article 107B), a license plate (e.g., infrastructure article 107C), a conspicuity tape (e.g., infrastructure article 107D), and a hazard marker (e.g., infrastructure article 107E, such as a construction barrel, a traffic cone, a traffic barricade, a safety barrier, among others).
  • Pavement markings may include liquid markings, tape, or raised pavement markings to name only a few examples.
  • pavement markings may include sensors, materials, or structures that permit the detection of the marking and/or communication of information between the pavement marking and a receiving device. Additional examples of infrastructure articles 107 include traffic lights, guardrails, billboards, and electronic traffic signs (also referred to as variable-message signs), among others.
  • Infrastructure articles 107 may include information that may be detected by one or more sensors of computing device 116.
  • an infrastructure article such as infrastructure article 107B, may include an article message 126 on the physical surface of the infrastructure article.
  • Article message 126 may include characters, images, and/or any other information that may be printed, formed, or otherwise embodied on infrastructure article 107B.
  • each infrastructure article 107 may have a physical surface having article message 126 embodied thereon.
  • Article message 126 may include human-perceptible information and machine-perceptible information.
  • Human-perceptible information may include information that indicates one or more first characteristics of a pathway, such as information typically intended to be interpreted by human drivers.
  • the human-perceptible information may provide a human-perceptible representation that is descriptive of at least a portion of the transportation pathway.
  • human-perceptible information may generally refer to information that indicates a general characteristic of a transportation pathway and that is intended to be interpreted by a human driver.
  • the human-perceptible information may include words (e.g., "STOP" or the like), symbols, graphics (e.g., an arrow indicating the road ahead includes a sharp turn) or shapes (e.g., signs or lane markings).
  • Human-perceptible information may include the color of the article, the article message or other features of the infrastructure article, such as the border or background color.
  • some background colors may indicate information only, such as "scenic overlook", while other colors may indicate a potential hazard (e.g., the red octagon of a stop sign, or the double yellow line of a no passing zone).
  • the human-perceptible information may correspond to words or graphics included in a specification.
  • the human-perceptible information may correspond to words or symbols included in the Manual on Uniform Traffic Control Devices (MUTCD), which is published by the U.S. Department of Transportation (DOT) and includes specifications for many conventional signs for roadways. Other countries have similar specifications for traffic control symbols and devices.
  • Machine-perceptible information may generally refer to information configured to be interpreted by an electrically powered scooter.
  • article message 126 may be encoded via a 2-dimensional bar code, such as a QR code.
  • machine-perceptible information may be interpreted by a human driver.
  • machine-perceptible information may include a feature of the graphical symbol that is a computer-interpretable visual property of the graphical symbol.
  • the machine-perceptible information may relate to the human-perceptible information, e.g., provide additional context for the human-perceptible information.
  • the human-perceptible information may be a general representation of an arrow, while the machine-perceptible information may provide an indication of the shape of the turn including the turn radius, any incline of the roadway, a distance from the sign to the turn, or the like.
  • the additional information may be visible to a human operator; however, the additional information may not be readily interpretable by the human operator, particularly at speed. In other examples, the additional information may not be visible to a human operator but may still be machine readable and visible to a vision system of an electrically powered scooter.
  • an enhanced infrastructure article may be an optically active article in that the infrastructure article is readily detectible by vision systems, which may include an infrared camera or other camera configured for detecting electromagnetic radiation in one or more bands of the electromagnetic spectrum, which may include the visible band, the infrared band, the ultraviolet band, and so forth.
  • the infrastructure articles may be reflective, such as retroreflective, within one or more bands of the electromagnetic spectrum that are readily detectible by vision systems of the computing device 116.
  • Article message 126 may indicate a variety of types of information.
  • article message 126 may, for instance, provide computing device 116 with static information related to a region of a pathway 106.
  • Static information may include any information that is related to navigation of the pathway associated with article message 126, and not subject to change.
  • certain features of pathways 106 may be standardized and/or commonly used, such that article message 126 may correspond to a pre-defined classification or operating characteristic of the respective pathway.
  • article message 126 may indicate a navigational characteristic or feature of the pathway, an operating rule or set of operating rules of the pathway, or the like.
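As a concrete illustration, a machine-readable article message carrying a pathway type and operating rules might be decoded as below. The `KEY=VALUE;...` payload format is invented for this sketch; the application does not specify an encoding:

```python
# Hypothetical decoder for a machine-readable article message, e.g.
# the payload of a QR code on a sign. The key=value wire format is an
# illustrative assumption, not taken from the application.

def parse_article_message(payload: str) -> dict:
    """Parse 'KEY=VALUE;KEY=VALUE' payloads into a dict of fields."""
    fields = {}
    for part in payload.split(";"):
        if "=" in part:
            key, _, value = part.partition("=")
            fields[key.strip()] = value.strip()
    return fields

# Example payload: pathway type plus operating rules for the scooter.
msg = parse_article_message("TYPE=BICYCLE_PATH;MAX_MPH=12;TURN_RADIUS_M=30")
```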
  • Infrastructure articles 107 may include a variety of indicators and/or markers.
  • infrastructure article 107 may include one or more of an optical tag, a radio-frequency identification tag, a radio-frequency tag, an acoustic surface pattern, or a material configured to provide a signature to a signature-sensing system.
  • electrically powered scooters 110 may receive data from infrastructure articles 107 via near-field communication (NFC) protocols and signals, laser, or infrared-based readers, or other communication type.
  • Electrically powered scooters 110 may each include one or more sensors that perceive characteristics of the environment, infrastructure, and other objects around electrically powered scooter 110A. Examples of sensors include an image sensor, sonar, LiDAR, among others. The sensors may generate sensor data indicative of sensed characteristics.
  • the sensor data may include infrastructure data indicative of the infrastructure proximate to a respective scooter of electrically powered scooters 110.
  • An object may be proximate to a particular electrically powered scooter 110 when the object is detectable by one or more sensors of that particular electrically powered scooter 110.
  • the infrastructure data may be indicative of one or more infrastructure articles 107 proximate to a respective scooter of electrically powered scooters 110.
  • electrically powered scooter 110A includes computing device 116A configured to dynamically control operation of the electrically powered scooter based at least in part on the infrastructure data.
  • computing device 116A may determine a type of location in which electrically powered scooter 110A is located based on the infrastructure data and may control operation of electrically powered scooter 110A based at least in part on the type of the location.
  • Computing device 116A may determine the type of the location in which electrically powered scooter 110A is currently located.
  • Example types of locations include transportation pathways 106, parks, interiors of buildings, parking lots, etc.
  • Computing device 116A may determine the type of location in which electrically powered scooter 110A is located based on infrastructure data.
  • the infrastructure data includes image data (e.g., images and/or videos) generated by one or more image sensors.
  • Computing device 116A may perform one or more image processing algorithms on the image data to identify the type of the location.
  • the image data may include an image of one or more infrastructure articles 107 proximate to electrically powered scooter 110A.
  • computing device 116A may determine the type of location in which electrically powered scooter 110A is located is a bicycle pathway based on the image data. For example, computing device 116A may perform image processing to identify infrastructure articles 107A as pavement markings (also referred to as lane markings). Computing device 116A may determine that the type of location in which electrically powered scooter 110A is located is a bicycle pathway in response to determining that electrically powered scooter 110A is between two pavement markings. In other words, in one example, computing device 116A may determine that transportation pathway 106A is a bicycle pathway, and hence that the type of location in which electrically powered scooter 110A is located is a bicycle pathway.
  • computing device 116A determines electrically powered scooter 110A is located within a bicycle pathway based on the characteristics (e.g., color, width, double vs single line, distance between, etc.) of infrastructure articles 107A. Additional details of analyzing infrastructure data are described in U.S. Provisional Patent Application 62/622,469, filed January 26, 2018, and U.S. Provisional Patent Application 62/480,231, filed March 31,
  • Computing device 116A may determine a distance between infrastructure articles 107A. For instance, computing device 116A may calculate a number of pixels between infrastructure articles 107A and calculate the number of pixels associated with a known or typical dimension (e.g., width) of a reference object (e.g., infrastructure article 107B) captured in one or more images of the image data. In such instances, computing device 116A may compare the number of pixels between infrastructure articles 107A to the number of pixels associated with the reference object to determine the distance between infrastructure articles 107A. As such, in one example, computing device 116A may determine that the type of location in which electrically powered scooter 110A is located is a bicycle pathway in response to determining that the distance between infrastructure articles 107A corresponds to a width of a bicycle pathway.
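The pixel-ratio distance estimate described above reduces to a small arithmetic sketch: a reference object of known width fixes a pixels-per-meter scale, which converts the pixel gap between two markings into meters. The lane-width thresholds below are illustrative assumptions, not values from the application:

```python
# Back-of-envelope version of the pixel-ratio distance estimate.
# Reference widths and lane-width thresholds are illustrative only.

def estimate_gap_meters(gap_pixels: float,
                        reference_pixels: float,
                        reference_width_m: float) -> float:
    """Scale a pixel distance using a reference object of known width."""
    pixels_per_meter = reference_pixels / reference_width_m
    return gap_pixels / pixels_per_meter

def looks_like_bicycle_lane(gap_m: float,
                            min_width_m: float = 1.2,
                            max_width_m: float = 2.5) -> bool:
    """Hypothetical check against a typical bike-lane width range."""
    return min_width_m <= gap_m <= max_width_m

# Example: a 0.75 m wide sign spans 150 px in the image, and the two
# pavement markings are 300 px apart -> estimated gap of 1.5 m.
gap = estimate_gap_meters(300, 150, 0.75)
```

This naive ratio assumes the reference object and the markings lie at a comparable depth in the image; a real vision system would correct for perspective.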
  • computing device 116A determines a type of transportation pathway 106B based on characteristics of the infrastructure. For example, computing device 116A may determine a color of transportation pathway 106A and determine that transportation pathway 106A is a bicycle pathway based on the color. In another example, computing device 116A may identify a symbol on the surface of transportation pathway 106A between infrastructure articles 107A and determine that transportation pathway 106A is a bicycle pathway based on the symbol.
  • the image data includes data indicative of article message 126.
  • Computing device 116A may determine the type of location in which electrically powered scooter 110A is located based on article message 126.
  • article message 126 may indicate a type of infrastructure article 107B, a type of transportation pathway 106 associated with infrastructure article 107B, or both.
  • computing device 116A may determine the type of location in which electrically powered scooter 110A is located is a bicycle pathway based on the article message 126.
  • Computing device 116A may determine a type of location in which electrically powered scooter 110A is currently located based at least in part on detecting one or more vehicles 104, pedestrians 108, electrically powered scooters 110, and/or bicycles.
  • Computing device 116 may detect one or more vehicles 104 based on the infrastructure data (e.g., image data or other signature data).
  • computing device 116A may perform image processing of image data to detect one or more vehicles 104 and may determine transportation pathway 106A is a vehicle pathway.
  • computing device 116A may perform image processing on the image data and determine that transportation pathway 106C includes pedestrians 108. In such examples, computing device 116A may determine that transportation pathway 106C is a pedestrian pathway.
  • computing device 116A may determine that transportation pathway 106B is a bicycle pathway in response to detecting bicycles and/or scooters 110B, 110C.
  • Computing device 116A may determine in which of transportation pathways 106 electrically powered scooter 110A is located based on the image data. For example, computing device 116A may determine that electrically powered scooter 110A is located within a bicycle pathway (e.g., transportation pathway 106B) in response to determining that transportation pathway 106B is located in the middle of the image.
  • computing device 116A may determine a type of location in which electrically powered scooter 110A is located based on communication data received from a computing device separate from electrically powered scooter 110A, such as another electrically powered scooter (e.g., scooter 110B), an infrastructure article, or a vehicle (e.g., vehicle 104A).
  • computing device 116A receives the communication data via a dedicated short range communication (DSRC) transceiver. Additionally or alternatively, computing device 116A may receive communication data via any wireless communication device, such as a
  • the communication data may include data indicating the type of the location is a transportation pathway 106.
  • the communication data indicates GPS coordinates of electrically powered scooter 110A and computing device 116A may determine the type of location based on the GPS coordinates.
  • the communication data may indicate a type of the sending device and computing device 116A may determine the type of location for electrically powered scooter 110A based on the type of the sending device.
  • the communication data may indicate the sending device is a vehicle, such as a lorry or semi-truck.
  • computing device 116A may determine that electrically powered scooter 110A is located in a transportation pathway 106 in response to determining the sending device is a vehicle.
  • the communication data includes data which was received from vehicles 104, infrastructure articles 107, or other scooters 110 that travelled proximate to the current location of electrically powered scooter 110A within a particular time duration of electrically powered scooter 110A arriving at its current location.
  • the communication data may include data indicating a type of a roadway, a size of the roadway (e.g., a number of lanes), a speed of a vehicle 104, a speed limit for the roadway, among others.
  • the data indicating the type of the roadway may include data indicating the presence of an accident, the presence of a construction zone, the direction, speed, or congestion of traffic, road surface type, types of vehicles permitted or present on the roadway, number of lanes, complexity of traffic, or a combination thereof.
  • computing device 116A may receive data from vehicles 104 indicating a type of transportation pathway 106A.
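Determining a location type from reported GPS coordinates, as discussed above, is essentially a geofencing problem. A standard ray-casting point-in-polygon test is one way to implement it; the zone shapes, labels, and local planar coordinates below are invented for illustration:

```python
# Geofencing sketch: classify a GPS fix (projected to local planar
# coordinates) against known zones. Zones and labels are hypothetical.

def point_in_polygon(x, y, poly):
    """Ray-casting test; poly is a list of (x, y) vertices."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

ZONES = {  # illustrative zones, not from the application
    "pedestrian": [(0, 0), (4, 0), (4, 4), (0, 4)],
    "bicycle": [(4, 0), (10, 0), (10, 4), (4, 4)],
}

def zone_for(x, y):
    """Return the label of the first zone containing the point."""
    for name, poly in ZONES.items():
        if point_in_polygon(x, y, poly):
            return name
    return "unknown"
```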
  • computing device 116A determines whether electrically powered scooter 110A is permitted in the location in which electrically powered scooter 110A is currently located. For example, computing device 116A may determine whether electrically powered scooter 110A is permitted in its current location based on the type of the current location and one or more rules. The rules may be pre-programmed or machine generated (e.g., using trained or untrained machine learning models). In some scenarios, computing device 116A determines based on the rule(s) that electrically powered scooter 110A is permitted in certain types of locations and is not permitted (e.g., may be prohibited) in different types of locations.
  • computing device 116A may determine that electrically powered scooter 110A is permitted in its current location when electrically powered scooter 110A is located on one of transportation pathways 106. Similarly, computing device 116A may determine that electrically powered scooter 110A is not permitted in its current location when electrically powered scooter 110A is located within a building or on an athletic field (e.g., a baseball field, soccer field, etc.).
  • Electrically powered scooter 110A may be permitted in a subset of one type of location and may not be permitted in a different subset of the type of location. For example, computing device 116A may determine based on the rules that electrically powered scooter 110A is permitted on transportation pathways 106A and 106B and that electrically powered scooter 110A is not permitted on transportation pathway 106C.
  • computing device 116A may determine that electrically powered scooter 110A is not permitted to travel through construction zone 111 (or any other temporary traffic control zone).
  • computing device 116A determines whether electrically powered scooter 110A is permitted in its current location based at least in part on the presence of a vehicle 104, scooter 110, pedestrian 108, or a combination thereof. For example, computing device 116A may determine that electrically powered scooter 110A is not permitted in its current location in response to detecting one or more of vehicles 104, scooters 110, or pedestrians 108.
  • Computing device 116A may determine whether electrically powered scooter 110A is permitted in its current location based on a distance to an infrastructure article 107. For example, computing device 116A may detect (e.g., based on the image data or signature data) a pavement marking (e.g., infrastructure articles 107A) and determine a distance to the pavement marking. Computing device 116A may determine that electrically powered scooter 110A is permitted in its current location in response to determining that the distance to the pavement marking is within a threshold distance.
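The distance-to-marking test can be reduced to a simple threshold comparison; the 1.5 m default below is an illustrative assumption, as the disclosure does not specify a value:

```python
def permitted_near_marking(distance_m, threshold_m=1.5):
    # The scooter is treated as permitted while it remains within the
    # threshold distance of a detected pavement marking.
    return distance_m <= threshold_m
```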
  • Computing device 116A performs an operation based at least in part on the type of location in which electrically powered scooter 110A is located, whether electrically powered scooter 110A is permitted in its current location, a type of a roadway, presence of vehicles 104, pedestrians 108, and/or other electrically powered scooters 110, or a combination thereof. In some examples, computing device 116A performs an operation to adjust operation of the electric motor of electrically powered scooter 110A. For example, computing device 116A may perform an operation based on the type of location and/or in response to determining that electrically powered scooter 110A is not permitted in the location in which it is currently located.
  • computing device 116A may cause a motor controller to adjust (e.g., increase or decrease) the speed of the electric motor, and hence, the speed of electrically powered scooter 110A.
  • computing device 116A adjusts a maximum allowable speed based on the type of location. For example, computing device 116A may enable the electric motor to drive the wheels at a first speed when electrically powered scooter 110A is located on a vehicle pathway (e.g., pathway 106A) and may enable the electric motor to drive the wheels at a different (e.g., lower) speed when electrically powered scooter 110A is located on a pedestrian pathway (e.g., 106C).
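One way to realize a location-dependent maximum allowable speed is a lookup table of speed caps; the pathway labels and the speed values below are illustrative assumptions, since the disclosure specifies only that the caps may differ by location:

```python
# Hypothetical speed caps per location type, in km/h.
MAX_SPEED_KPH = {
    "vehicle_pathway": 25.0,
    "bicycle_pathway": 20.0,
    "pedestrian_pathway": 10.0,
}

def allowed_speed(location_type, requested_kph):
    # Fall back to the most conservative cap for unrecognized locations.
    cap = MAX_SPEED_KPH.get(location_type, 10.0)
    return min(requested_kph, cap)
```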
  • computing device 116A may perform an operation to adjust operation of a braking apparatus coupled to back wheel, front wheel, or both. For instance, computing device 116A may cause the braking apparatus to slow and/or stop one or both of the wheels.
  • computing device 116A performs the operation based at least in part on a type of a roadway. For example, computing device 116A may determine that the current location of electrically powered scooter 110A is a transportation pathway 106 (e.g., pathway 106D). Computing device 116A may adjust a current speed and/or a maximum allowable speed of electrically powered scooter 110A in response to receiving data indicating that transportation pathway 106D includes a construction zone 111.
  • Computing device 116A may perform the at least one operation based at least in part on whether computing device 116A detected the presence of vehicles 104, pedestrians 108, and/or other electrically powered scooters 110. For example, computing device 116A may adjust a speed of electrically powered scooter 110A (e.g., via the electric motor and/or braking apparatus) in response to detecting pedestrian 108B, regardless of the type of location in which electrically powered scooter 110A is located. As another example, computing device 116A may adjust the speed of electrically powered scooter 110A based on the type of location and the presence of vehicles 104, pedestrians 108, and/or other electrically powered scooters 110.
  • computing device 116A may adjust the speed of electrically powered scooter 110A when electrically powered scooter 110A is located on a bicycle pathway and pedestrian 108B is detected. In one example, computing device 116A may maintain the speed of electrically powered scooter 110A when electrically powered scooter 110A is located on a vehicle pathway and pedestrian 108B is detected on a pedestrian pathway.
  • Computing device 116A may perform the at least one operation by generating an output.
  • the output may include an audio output, a visual output, a haptic output, or a combination thereof.
  • computing device 116A may output a visual alert via one or more LED lights or output a haptic alert (e.g., causing the steering assembly to vibrate) indicating that electrically powered scooter 110A is not permitted in its current location.
  • computing device 116A outputs a message to a remote computing device separate from electrically powered scooter 110A.
  • the message may indicate that electrically powered scooter 110A is currently located in a location in which it is not permitted.
  • the message may indicate an amount of time that electrically powered scooter 110A has been in its current location, the current location of electrically powered scooter 110A, among other information.
  • computing device 116A determines an amount of time that electrically powered scooter 110A has been in a location in which the scooter 110A is not permitted.
  • Computing device 116A may perform the at least one operation in response to determining that the amount of time satisfies (e.g., is greater than or equal to) a threshold time duration. For example, computing device 116A may generate an output and/or adjust a speed of electrically powered scooter 110A in response to determining that electrically powered scooter 110A has been located in an impermissible location for at least the threshold time duration. Computing device 116A may determine a confidence level indicating a probability that electrically powered scooter 110A has been in a location in which the scooter 110A is not permitted. Computing device 116A may perform the at least one operation in response to determining that the confidence level satisfies (e.g., is greater than or equal to) a threshold confidence level. For example, computing device 116A may generate an output and/or adjust a speed of electrically powered scooter 110A in response to determining that the confidence level satisfies the threshold confidence level.
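One way to combine the dwell-time threshold and the confidence threshold described above is to require both before intervening; the threshold values, and the choice to require both conditions together, are assumptions of this sketch:

```python
def should_intervene(dwell_s, confidence,
                     min_dwell_s=10.0, min_confidence=0.8):
    # Trigger an output and/or speed adjustment only when the scooter has
    # been in an impermissible location for at least min_dwell_s seconds
    # AND the impermissible-location determination is confident enough.
    return dwell_s >= min_dwell_s and confidence >= min_confidence
```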
  • while computing device 116A is described as dynamically controlling scooter 110A, techniques of this disclosure may enable a computing device to control any other type of micro mobility device, such as a powered food-delivery device, hoverboard, or skateboard.
  • FIG. 2 is a block diagram illustrating an example system for dynamically controlling an electrically powered scooter, in accordance with techniques of this disclosure.
  • System 140 illustrates additional details of system 100 of FIG. 1.
  • system 140 includes electrically powered scooter 110A, vehicle 104B, and a remote computing system 150.
  • electrically powered scooter 110A and vehicle 104B are communicatively coupled to one another and/or remote computing system 150 via network 114. In another example, electrically powered scooter 110A and vehicle 104B are communicatively coupled to one another directly, for example, via a DSRC transceiver.
  • Electrically powered scooter 110A includes computing device 116A and vehicle 104B includes computing device 116B.
  • Computing devices 116A, 116B may each include one or more communication units 214A, 214B and sensors 117A, 117B, respectively.
  • Communication units 214A, 214B (collectively, communication units 214) of computing devices 116 may communicate with external devices by transmitting and/or receiving data.
  • computing device 116 may use communication units 214 to transmit and/or receive radio signals on a radio network such as a cellular radio network or other networks, such as networks 114.
  • communication units 214 may transmit and receive messages and information to other vehicles, such as information interpreted from infrastructure article 107.
  • communication units 214 may transmit and/or receive satellite signals on a satellite network such as a Global Positioning System (GPS) network.
  • communications units 214 may transmit and/or receive data through network 114 to remote computing system 150 via communication unit 154.
  • Sensors 117A, 117B may include image sensors 102A, 102B, respectively.
  • image sensors 102 may include semiconductor charge-coupled devices (CCD) or active pixel sensors in complementary metal-oxide- semiconductor (CMOS) or N-type metal-oxide-semiconductor (NMOS, Live MOS) technologies.
  • Digital sensors include flat panel detectors.
  • electrically powered scooter 110A or vehicle 104B includes at least two different sensors for detecting light in two different wavelength spectrums.
  • Image sensors 102 may have a fixed field of view or may have an adjustable field of view.
  • An image sensor 102 with an adjustable field of view may be configured to pan left and right and up and down relative to electrically powered scooter 110 or vehicle 104B, as well as to widen or narrow its focus.
  • image sensors 102 may include a first lens and a second lens.
  • Electrically powered scooter 110 and/or vehicle 104B may have more or fewer image sensors 102 in various examples.
  • computing device 116A includes an interpretation component 118A, a user interface (UI) component 124, and a control component 144.
  • Components 118A, 124, and 144 may perform operations described herein using software, hardware, firmware, or a mixture of hardware, software, and firmware residing in and executing on computing device 116 and/or at one or more other remote computing devices.
  • components 118A, 124, and 144 may be implemented as hardware, software, and/or a combination of hardware and software.
  • Computing device 116A may execute components 118A, 124, and 144 with one or more processors.
  • Computing device 116A may execute any of components 118A, 124, 144 as or within a virtual machine executing on underlying hardware.
  • Components 118A, 124, 144 may be implemented in various ways. For example, any of components 118A, 124, 144 may be implemented as a downloadable or pre-installed application or “app.” In another example, any of components 118A, 124, 144 may be implemented as part of an operating system of computing device 116.
  • UI component 124 may include any hardware or software for communicating with a user of electrically powered scooter 110.
  • UI component 124 includes outputs to a user, such as a display screen, indicator or other lights, audio devices to generate notifications or other audible functions, and/or haptic feedback devices.
  • UI component 124 may also include inputs such as knobs, switches, keyboards, touch screens or similar types of input devices.
  • sensors 117 may be used to gather information about infrastructure proximate to electrically powered scooter 110A and vehicle 104B, such as information about transportation pathways 106. Sensors 117 may generate infrastructure data indicative of the infrastructure proximate to electrically powered scooter 110A or vehicle 104B.
  • image sensors 102 may capture images of infrastructure articles, such as lane markings, centerline markings, edge of roadway or shoulder markings, as well as the general shape of the transportation pathway.
  • the general shape of a transportation pathway may include turns, curves, incline, decline, widening, narrowing or other characteristics.
  • interpretation component 118A may determine a type of location in which electrically powered scooter 110A is located based on infrastructure data generated by sensors 117A, 117B of electrically powered scooter 110A and/or vehicle 104B. For example, interpretation component 118A may receive, from image sensors 102, image data indicative of infrastructure proximate to electrically powered scooter 110A and vehicle 104B, respectively. Interpretation component 118A may identify the infrastructure (e.g., an infrastructure article 107) using one or more image processing algorithms.
  • Interpretation component 118A may determine a type of location in which electrically powered scooter 110A is located based on the infrastructure data.
  • Interpretation component 118A may determine the type of location based at least in part on the infrastructure data.
  • interpretation component 118A may determine the type of location in which electrically powered scooter 110A is located by performing one or more image processing algorithms on the image data to identify the type of the location.
  • interpretation component 118A may perform image processing to identify one or more infrastructure articles 107 proximate to electrically powered scooter 110A and determine the type of location based on the characteristics of the infrastructure article 107, such as a type, size, shape, and/or color of infrastructure article 107.
  • interpretation component 118A may determine a type of location based on characteristics of the infrastructure captured by the image data. For example, interpretation component 118A may determine a type of transportation pathway 106B based on a color and/or size of transportation pathway 106B, or a symbol (e.g., a bicycle symbol) included on the pathway. As yet another example, interpretation component 118A may determine the type of location based on an article message 126 encoded on one or more infrastructure articles 107. For example, article message 126 may indicate a type of infrastructure article 107B, a type of transportation pathway 106 associated with infrastructure article 107B, or both. In some scenarios, interpretation component 118A determines a type of location in which electrically powered scooter 110A is currently located based at least in part on detecting one or more vehicles 104, pedestrians 108, electrically powered scooters 110, and/or bicycles.
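A minimal classification sketch along these lines might prefer an explicit article message and fall back to visual cues; the cue names, the green-lane convention, and the type labels are illustrative assumptions:

```python
def classify_pathway(has_bicycle_symbol, lane_color, article_message=None):
    # An explicit article message (e.g., a location type decoded from an
    # infrastructure article) wins over visual cues.
    if article_message is not None:
        return article_message
    # Hypothetical visual cues: a painted bicycle symbol or a green lane.
    if has_bicycle_symbol or lane_color == "green":
        return "bicycle_pathway"
    return "vehicle_pathway"
```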
  • interpretation component 118A may determine a type of location in which electrically powered scooter 110A is located based on communication data received from a computing device separate from electrically powered scooter 110A, such as another electrically powered scooter (e.g., scooter 110B), an infrastructure article 107, or a vehicle (e.g., vehicle 104B).
  • interpretation component 118A may receive data indicating a type of the location from a vehicle 104B that is proximate to electrically powered scooter 110A via communication unit 214, such as a DSRC transceiver.
  • Interpretation component 118A may determine a type of the current location of electrically powered scooter 110A based on the type of a device from which the data was received, such as another scooter 110 or a vehicle 104.
  • Interpretation component 118A may be configured to determine one or more scooter operating rules based on the type of the location.
  • computing device 116A stores the operating rules locally.
  • computing device 116A may query remote computing system 150 via network 114 to obtain the scooter operating rules.
  • Interpretation component 118A may provide information, directly or indirectly, to control component 144 related to the scooter operating rules (also referred to as “operating rules”).
  • An operating rule may be any rule controlling operational characteristics of electrically powered scooter 110A. Operating rules that may be used include, but are not limited to, speed limits, acceleration limits, braking limits, following distance limits, lane markings, distance limits from construction or pedestrians, permitted locations, prohibited locations, and the like.
  • an article message, such as article message 126 of FIG. 1, includes or indicates an operating rule set for electrically powered scooter 110A.
  • Interpretation component 118A may obtain the operating rule set based on the interpretation of article message 126.
  • article message 126 may indicate a specific operating rule set associated with a type of transportation pathway, a construction zone, among others.
  • interpretation component 118A may determine the rule set associated with transportation pathway 106B (e.g., a bicycle pathway) includes a speed limit.
  • Interpretation component 118A may output a command to control component 144 based on the rules, such as a command to “apply brakes” or “shut off electric motor.” In such examples, interpretation component 118A accesses a local or remote data structure to identify a rule set and determine one or more operations to be applied by electrically powered scooter 110A. In this way, interpretation component 118A provides the set of operations to control component 144 to modify or adjust the operation of electrically powered scooter 110A.
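Such a local data structure mapping a rule set to operations might look like the following; the rule-set identifiers, operation names, and parameter values are illustrative assumptions:

```python
# Hypothetical mapping from a rule-set identifier to the operations that
# a control component would apply to the scooter.
RULE_SET_OPERATIONS = {
    "bicycle_pathway": [("limit_speed_kph", 20.0)],
    "construction_zone": [("apply_brakes", None), ("shut_off_motor", None)],
}

def operations_for(rule_set_id):
    # An unrecognized rule set yields no operations.
    return RULE_SET_OPERATIONS.get(rule_set_id, [])
```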
  • control component 144 receives data from an external device.
  • remote computing system 150 may include a datastore that includes navigational characteristics of a transportation pathway 106 (e.g., construction zone 111), such as traffic pattern changes, presence of workers, lane width modification, curves, and shifts, and the like.
  • remote computing system 150 may include a datastore that includes navigational characteristics of the temporary zone, such as location data, congestion data, vehicle behavior variability, speed, lane departure, acceleration data, brake actuation data, and the like.
  • navigational characteristics may be official data, such as supplied by operators having control of the temporary zone, or may be crowd-sourced data, such as supplied by users travelling through the transportation pathway 106.
  • Interpretation component 118A may determine whether electrically powered scooter 110A is permitted in the location in which electrically powered scooter 110A is currently located based on the rules. For example, interpretation component 118A may apply the rules to the type of the scooter’s current location to determine whether electrically powered scooter 110A is permitted in its current location. As another example, interpretation component 118A may determine whether electrically powered scooter 110A is permitted in its current location based on the rules and the presence (or absence) of one or more vehicles 104, pedestrians 108, electrically powered scooters 110, and/or bicycles.
  • interpretation component 118A may determine that electrically powered scooter 110A is permitted on a pedestrian pathway when pedestrians are not present and that electrically powered scooter 110A is not permitted when pedestrians are nearby (or when the number or density of pedestrians satisfies (e.g., is greater than or equal to) a threshold number or density of pedestrians).
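A pedestrian-density condition of this kind reduces to a count comparison; the threshold of three pedestrians is an illustrative assumption, as the disclosure does not fix a value:

```python
def permitted_on_pedestrian_pathway(pedestrian_count, max_pedestrians=3):
    # Permit travel on a pedestrian pathway only while fewer than
    # max_pedestrians pedestrians are detected nearby.
    return pedestrian_count < max_pedestrians
```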
  • Control component 144 may be configured to perform an operation by adjusting operation of electrically powered scooter 110A.
  • Control component 144 may include, for example, any circuitry or other hardware, or software that may adjust one or more functions of the vehicle. Some examples include adjustments to change a speed of electrically powered scooter 110A, shut off an electric motor that drives one or more wheels, or both.
  • Control component 144 may adjust operation of electrically powered scooter 110A based at least in part on the type of location in which electrically powered scooter 110A is located; whether electrically powered scooter 110A is permitted in its current location; a type of a roadway on which electrically powered scooter 110A is located; a presence of vehicles 104, pedestrians 108, and/or other electrically powered scooters 110; or a combination thereof.
  • control component 144 adjusts operation of the electric motor of electrically powered scooter 110A.
  • control component 144 may cause the electric motor to slow down or stop, which may slow or stop electrically powered scooter 110A.
  • control component 144 causes a brake apparatus to slow or stop electrically powered scooter 110A.
  • control component 144 may adjust a maximum allowable speed of electrically powered scooter 110A.
  • UI component 124 may perform the at least one operation by generating an output.
  • the output may include an audio output, a visual output, a haptic output, or a combination thereof.
  • computing device 116A may output a visual alert via one or more LED lights or output a haptic alert (e.g., causing the steering assembly to vibrate) indicating that electrically powered scooter 110A is not permitted in its current location.
  • interpretation component 118A may determine or predict whether electrically powered scooter 110A will experience a scooter-specific event.
  • a scooter-specific event is an event that is relevant to a class of vehicles with a set of characteristics that correspond to an electrically powered scooter.
  • the set of characteristics corresponding to electrically powered scooters 110 may include exactly two wheels connected by a substantially horizontal support member, a generally vertical steering assembly, and an electric motor configured to drive one or both of the exactly two wheels.
  • electrically powered scooters 110 include conspicuity tape and the set of characteristics corresponding to electrically powered scooters 110 include characteristics of the conspicuity tape, such as a location or pattern of the conspicuity tape. Examples of scooter-specific events include collisions between electrically powered scooter 110A and another electrically powered scooter 110, a pedestrian, or a vehicle; a scooter fall (e.g., due to a hole in a transportation pathway 106); among others.
  • Interpretation component 118A may predict whether electrically powered scooter 110A will experience a scooter-specific event based at least in part on a communication received from another electrically powered scooter 110 (e.g. electrically powered scooter 110B), vehicle 104B, an infrastructure article 107, remote computing system 150, or any other computing device.
  • electrically powered scooter 110A may receive a message from electrically powered scooter 110B of FIG. 1 via communication unit 214, such as a DSRC transceiver.
  • interpretation component 118A determines a distance between electrically powered scooter 110A and electrically powered scooter 110B based on the message.
  • interpretation component 118A may determine a distance between scooters 110A and 110B based on a time between when the message was sent by electrically powered scooter 110B and when the message was received by electrically powered scooter 110A. In some examples, interpretation component 118A may determine whether the distance satisfies a threshold distance.
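If the two scooters' clocks are synchronized (an assumption this sketch adds), a one-way message time can be converted to a distance by multiplying by the propagation speed, which for a radio link such as DSRC is the speed of light:

```python
def distance_from_timing(t_sent_s, t_received_s,
                         propagation_speed_mps=299_792_458.0):
    # Distance = propagation time * propagation speed. With radio
    # propagation, microsecond-level timing resolves hundreds of meters.
    return (t_received_s - t_sent_s) * propagation_speed_mps
```

For example, a one-way propagation time of one microsecond corresponds to roughly 300 m.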
  • interpretation component 118A may determine whether electrically powered scooter 110A will experience a scooter-specific event, such as a collision with a vehicle defined by the set of characteristics that correspond to electrically powered scooter 110B. In other words, interpretation component 118A may determine whether electrically powered scooter 110A will experience a collision with electrically powered scooter 110B. In some scenarios, interpretation component 118A determines whether scooters 110A and 110B will collide based on the distance between electrically powered scooter 110A and electrically powered scooter 110B. For example, interpretation component 118A may predict that the scooters 110 will collide in response to determining the distance between scooters 110A and 110B is within a threshold distance.
  • interpretation component 118A may predict that electrically powered scooter 110A will collide with electrically powered scooter 110B based on the distance between electrically powered scooter 110A and electrically powered scooter 110B, a speed of electrically powered scooter 110A, and a speed of electrically powered scooter 110B.
  • interpretation component 118A of electrically powered scooter 110A may determine, based on the data received from electrically powered scooter 110B, whether electrically powered scooter 110B is physically behind or ahead of electrically powered scooter 110A.
  • interpretation component 118A may predict a collision will occur in response to determining that the distance between scooters 110A and 110B is within a threshold distance and that the speed of electrically powered scooter 110A is less than the speed of electrically powered scooter 110B.
  • interpretation component 118A may predict a collision will occur in response to determining that the distance between scooters 110A and 110B is within a threshold distance and that the speed of electrically powered scooter 110A is greater than the speed of electrically powered scooter 110B.
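The distance-and-relative-speed prediction in these examples can be sketched as follows; the 5 m threshold and the closing-speed test are illustrative assumptions:

```python
def predict_collision(distance_m, speed_a_mps, speed_b_mps, b_is_behind,
                      threshold_m=5.0):
    # No collision is predicted while the scooters are farther apart
    # than the threshold distance.
    if distance_m > threshold_m:
        return False
    if b_is_behind:
        # Scooter 110B is behind and closing on scooter 110A.
        return speed_b_mps > speed_a_mps
    # Scooter 110B is ahead; scooter 110A is closing on it.
    return speed_a_mps > speed_b_mps
```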
  • Interpretation component 118A may predict the occurrence of a scooter-specific event in response to detecting a temporary traffic zone in the path of electrically powered scooter 110A. For example, interpretation component 118A may determine construction zone 111 of FIG. 1 is in the path of electrically powered scooter 110A based on sensor data generated by sensors 117. Interpretation component 118A may predict a scooter-specific event will occur (e.g., a collision or the user of scooter 110A may fall) in response to determining construction zone 111 is in the path of electrically powered scooter 110A.
  • interpretation component 118A may determine whether a scooter-specific event will occur based on the presence or absence of a pavement marking on a transportation pathway 106. For example, interpretation component 118A may determine whether a roadway (e.g., transportation pathway 106A) includes pavement markings (e.g., infrastructure articles 107A) based on the infrastructure data (e.g., image data or signature data). In one example, interpretation component 118A may predict a scooter-specific event will occur based on the absence of pavement markings on the roadway.
  • signature data may represent a detected signal and/or one or more properties of a signal, such as but not limited to an acoustic signal.
  • Interpretation component 118A may perform an operation in response to predicting the occurrence of a scooter-specific event. In one example, interpretation component 118A performs the operation by causing control component 144 to adjust operation of electrically powered scooter 110A. For example, interpretation component 118A may cause control component 144 to adjust (e.g., increase or decrease) the speed of electrically powered scooter 110A. As one example, interpretation component 118A may cause control component 144 to change operation of the electric motor (e.g., increase or decrease the speed of the motor) or change operation of a braking apparatus (e.g., apply the braking apparatus or disengage the braking apparatus).
  • Adjusting the speed of electrically powered scooter 110A may increase the distance between scooters 110A and 110B beyond the threshold distance, which may reduce the risk of a collision.
  • interpretation component 118A may output an alert (e.g., audible, haptic, visual) in response to predicting the scooter-specific event will occur.
  • the alert may warn the user of electrically powered scooter 110 or people within proximity of electrically powered scooter 110A, of the risk of a scooter-specific event.
  • interpretation component 118A may determine whether electrically powered scooter 110A will experience a collision with a vehicle (e.g., vehicle 104B) defined by the set of characteristics that correspond to an automobile.
  • the set of characteristics corresponding to an automobile may include at least four wheels and an enclosed passenger compartment occupied by a driver and/or one or more passengers.
  • Interpretation component 118A may determine whether electrically powered scooter 110A will experience a collision with vehicle 104B based at least in part on sensor data generated by one or more of sensors 117.
  • interpretation component 118A may determine that vehicle 104B, which is proximate to electrically powered scooter 110A, is configured with a set of characteristics corresponding to an automobile based on image data, signature data, or a combination thereof.
  • interpretation component 118A may perform image processing on an image of vehicle 104B to determine that vehicle 104B is an automobile.
  • interpretation component 118A performs one or more operations in response to determining that the characteristics of an object in proximity to electrically powered scooter 110A correspond to an automobile.
  • interpretation component 118A may cause scooter 110A to adjust a speed (e.g., by changing operation of a braking apparatus or electric motor) of electrically powered scooter 110A.
  • interpretation component 118A may cause communication unit 214 to output a message to the automobile (e.g., via a DSRC transceiver) indicating the presence of electrically powered scooter 110A.
  • Interpretation component 118A may perform one operation in response to determining the set of characteristics of the object in proximity to electrically powered scooter 110A correspond to an automobile and a different operation in response to determining the set of characteristics of the object in proximity to electrically powered scooter 110A do not correspond to an automobile.
  • Interpretation component 118A may determine the characteristics of an object in proximity to electrically powered scooter 110A include characteristics of a person or a pedestrian (e.g., two legs or two arms).
  • interpretation component 118A causes scooter 110A to perform one action (e.g., adjust its speed) in response to determining the characteristics of the object correspond to an automobile and performs a second, different action (e.g., outputting an audible, visual, or haptic alert) in response to determining the characteristics of the object correspond to a person rather than an automobile.
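Dispatching a different action per object class might be sketched as follows; the characteristic names and the returned action labels are illustrative assumptions, while the automobile and pedestrian characteristics follow the sets described above:

```python
def respond_to_object(characteristics):
    # Automobile characteristics per the disclosure: at least four wheels
    # and an enclosed passenger compartment.
    if {"four_wheels", "enclosed_compartment"} <= characteristics:
        return "adjust_speed"
    # Pedestrian characteristics per the disclosure: two legs or two arms.
    if "two_legs" in characteristics or "two_arms" in characteristics:
        return "output_alert"
    return "no_action"
```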
  • communication unit 214 of computing device 116A of electrically powered scooter 110A outputs a message to a remote computing device, such as remote computing system 150.
  • Remote computing system 150 may include a distributed computing platform (e.g., a cloud computing platform executing on various servers, virtual machines and/or containers within an execution environment provided by one or more data centers), physical servers, desktop computing devices, or any other type of computing system.
  • the message indicates usage data for electrically powered scooter 110A.
• the usage data may include a current location of electrically powered scooter 110A, whether the current location of electrically powered scooter 110A is permitted, a type of the current location (e.g., a transportation pathway 106, a park, a scooter parking zone, etc.), an amount of time that electrically powered scooter 110A has been in its current location, or information indicating the occurrence of a scooter-specific event.
  • Remote computing system 150 may receive the message from computing device 116A.
  • the message may include usage data associated with electrically powered scooter 110A.
  • Analysis module 152 of remote computing system 150 may store the usage data within usage data 156.
  • analysis module 152 may store a user account for each user of electrically powered scooters 110 within user data 158.
• the user account may include information indicating whether the user complied with various scooter operating rules, such as complying with speed limits, operating electrically powered scooter 110A in prohibited locations, or parking electrically powered scooter 110A within a pre-defined set of delineated parking regions.
  • analysis module 152 may store information indicating whether a user of electrically powered scooter 110A parked scooter 110A within a particular delineated region.
  • electrically powered scooters 110 may be permitted to be parked in a plurality of delineated regions within a physical environment.
  • the delineated regions include designated parking zones to return electrically powered scooters.
• Analysis module 152 and/or electrically powered scooter 110A may determine whether the last user of electrically powered scooter 110A parked scooter 110A within one of the delineated regions, for example, based on GPS coordinates and/or infrastructure data generated by one or more sensors 117.
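Determining from GPS coordinates whether a parked scooter lies within one of the delineated regions can be illustrated with a standard ray-casting point-in-polygon test. The coordinates and region shapes below are hypothetical; a production system would also account for GPS error and geodesic distances.

```python
def point_in_polygon(x: float, y: float, polygon: list) -> bool:
    """Ray-casting test: is (x, y) strictly inside the polygon of (x, y) vertices?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edges that a horizontal ray from (x, y) would cross.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside


def parked_in_delineated_region(position: tuple, regions: list) -> bool:
    """True when the parked position falls inside any delineated parking region."""
    return any(point_in_polygon(position[0], position[1], r) for r in regions)
```
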
  • analysis module 152 may select a reward in response to determining that the user parked electrically powered scooter 110A within one of the delineated regions.
  • the reward may include a discount on a future scooter use or a discount on the current use.
• Analysis module 152 may associate the reward with the user account for the user of electrically powered scooter 110A and may store an indication of the reward in the datastore.
  • Analysis module 152 may, in some examples, determine whether a user of electrically powered scooters 110 is permitted to use electrically powered scooters 110 based on information stored within the user account for that user. For example, analysis module 152 may determine that the user is prohibited (e.g., temporarily) from utilizing electrically powered scooters 110 in response to determining the user has violated operating rules. As one example, analysis module 152 may determine that the user is prohibited from utilizing electrically powered scooters 110 in response to determining the user parked electrically powered scooters 110 outside a delineated region several times, did not comply with speed limits, or operated electrically powered scooters 110 in prohibited locations.
• analysis module 152 may store scooter usage information for a plurality of user accounts that are each associated with respective users of electrically powered scooters 110. Analysis module 152 may analyze the scooter usage information to determine a degree of compliance or non-compliance with scooter operating rules. For example, analysis module 152 may determine how often users of electrically powered scooters 110 exceed a speed limit, park within the delineated regions, operate scooters in prohibited locations, etc. Similarly, analysis module 152 may analyze the usage data to determine locations with high degrees of non-compliance with the scooter operating rules. For example, analysis module 152 may rank locations by non-compliance and generate a report indicating the locations with the highest non-compliance.
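The ranking of locations by non-compliance described above can be sketched with a simple counter over usage records. The record shape, a `(location, complied)` pair, is an assumption made for this illustration.

```python
from collections import Counter


def rank_locations_by_noncompliance(usage_records) -> list:
    """Rank locations by count of non-compliant events, highest first.

    usage_records: iterable of (location, complied: bool) tuples, e.g. as
    stored by an analysis module in a usage datastore.
    """
    violations = Counter(loc for loc, complied in usage_records if not complied)
    # most_common() returns (location, count) pairs sorted by count, descending.
    return violations.most_common()
```
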
  • interpretation component 118A of computing device 116A is described as performing various functionality of computing device 116A, in some examples, interpretation component 118B of computing device 116B may perform similar functionality. For example, interpretation component 118B of computing device 116B of vehicle 104B may identify a type of a location in which electrically powered scooter 110A is located or determine whether electrically powered scooter 110A is permitted in its current location. Similarly, in some examples, interpretation component 118B may predict a scooter-specific event for electrically powered scooter 110A. Interpretation component 118B may output information to electrically powered scooter 110A via communication unit 214B.
  • interpretation component 118A may output to electrically powered scooter 110A information indicating the type of location in which electrically powered scooter 110A is located, whether electrically powered scooter 110A is permitted in its current location, or a prediction of a scooter-specific event for electrically powered scooter 110A.
  • FIG. 3 is a block diagram illustrating an example computing device, in accordance with one or more aspects of the present disclosure.
  • FIG. 3 illustrates only one example of a computing device.
• Many other examples of computing device 116A may be used in other instances and may include a subset of the components included in example computing device 116A or may include additional components not shown in example computing device 116A in FIG. 3.
  • computing device 116A may be logically divided into user space 202, kernel space 204, and hardware 206.
  • Hardware 206 may include one or more hardware components that provide an operating environment for components executing in user space 202 and kernel space 204.
  • User space 202 and kernel space 204 may represent different sections or segmentations of memory, where kernel space 204 provides higher privileges to processes and threads than user space 202.
  • kernel space 204 may include operating system 220, which operates with higher privileges than components executing in user space 202.
  • hardware 206 includes one or more processors 208, input components 210, storage devices 212, communication units 214, output components 216, and sensors 117.
• Processors 208, input components 210, storage devices 212, communication units 214, output components 216, and sensors 117 may each be interconnected by one or more communication channels 218.
  • Communication channels 218 may interconnect each of the components 208, 210, 212, 214, 216, and 117 and other components for inter-component communications (physically, communicatively, and/or operatively).
  • communication channels 218 may include a hardware bus, a network connection, one or more inter-process communication data structures, or any other components for communicating data between hardware and/or software.
  • processors 208 may implement functionality and/or execute instructions within computing device 116A.
  • processors 208 on computing device 116A may receive and execute instructions stored by storage devices 212 that provide the functionality of components included in kernel space 204 and user space 202. These instructions executed by processors 208 may cause computing device 116A to store and/or modify information, within storage devices 212 during program execution.
  • Processors 208 may execute instructions of components in kernel space 204 and user space 202 to perform one or more operations in accordance with techniques of this disclosure. That is, components included in user space 202 and kernel space 204 may be operable by processors 208 to perform various functions described herein.
  • One or more input components 210 of computing device 116A may receive input.
  • Examples of input are tactile, audio, kinetic, and optical input, to name only a few examples.
  • Input components 210 of computing device 116A include a voice responsive system, video camera, buttons, control pad, microphone or any other type of device for detecting input from a human or machine.
  • input component 210 may be a presence- sensitive input component, which may include a presence-sensitive screen, touch-sensitive screen, etc.
  • One or more communication units 214 of computing device 116A may communicate with external devices by transmitting and/or receiving data.
  • computing device 116A may use communication units 214 to transmit and/or receive radio signals on a radio network such as a cellular radio network.
  • communication units 214 may transmit and/or receive satellite signals on a satellite network such as a Global Positioning System (GPS) network.
  • Examples of communication units 214 include a DSRC transceiver, an optical transceiver, a radio frequency transceiver, a GPS receiver, or any other type of device that can send and/or receive information.
• Other examples of communication units 214 may include Bluetooth®, GPS, 3G, and Wi-Fi® radios found in mobile devices as well as Universal Serial Bus (USB) controllers and the like.
  • One or more output components 216 of computing device 116A may generate output. Examples of output are tactile, audio, and video output.
  • Output components 216 of computing device 116A include a presence-sensitive screen, sound card, video graphics adapter card, speaker, cathode ray tube (CRT) monitor, liquid crystal display (LCD), or any other type of device for generating output to a human or machine.
  • Output components may include display components such as a liquid crystal display (LCD), a Light-Emitting Diode (LED) or any other type of device for generating tactile, audio, and/or visual output.
  • Output components 216 may be integrated with computing device 116A in some examples.
  • output components 216 may be physically external to and separate from computing device 116A but may be operably coupled to computing device 116A via wired or wireless communication.
  • An output component may be a built-in component of computing device 116A located within and physically connected to the external packaging of computing device 116A (e.g., a screen on a mobile phone).
  • a presence-sensitive display may be an external component of computing device 116A located outside and physically separated from the packaging of computing device 116A (e.g., a monitor, a projector, etc. that shares a wired and/or wireless data path with a tablet computer).
  • Output components 216 may also include control component 144, in examples where computing device 116A is onboard an electrically powered scooter. Control component 144 has the same functions as control component 144 described in relation to FIG. 1.
  • One or more storage devices 212 within computing device 116A may store information for processing during operation of computing device 116A.
  • storage device 212 is a temporary memory, meaning that a primary purpose of storage device 212 is not long term storage.
• Storage devices 212 on computing device 116A may be configured for short-term storage of information as volatile memory and therefore not retain stored contents if deactivated. Examples of volatile memories include random access memories (RAM), dynamic random-access memories (DRAM), static random-access memories (SRAM), and other forms of volatile memories known in the art.
• Storage devices 212 also include one or more computer-readable storage media. Storage devices 212 may be configured to store larger amounts of information than volatile memory. Storage devices 212 may further be configured for long-term storage of information as non-volatile memory space and retain information after power on/off cycles.
  • Non-volatile memories include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories.
  • Storage devices 212 may store program instructions and/or data associated with components included in user space 202 and/or kernel space 204.
  • application 228 executes in user space 202 of computing device 116A.
  • Application 228 may be logically divided into presentation layer 222, application layer 224, and data layer 226.
  • Presentation layer 222 may include user interface (UI) component 124, which generates and renders user interfaces of application 228.
  • Application 228 may include, but is not limited to: UI component 124, interpretation component 118A, security component 120, and one or more service components 122.
• application layer 224 may include interpretation component 118A, service component 122, and security component 120.
  • Data layer 226 may include one or more datastores.
• a datastore may store data in structured or unstructured form.
  • Example datastores may be any one or more of a relational database management system, online analytical processing database, table, or any other suitable structure for storing data.
  • Service data 233 may include any data to provide and/or resulting from providing a service of service component 122.
  • service data 233 may include information about infrastructure articles 107, user information, operating rule sets, or any other information transmitted between one or more components of computing device 116A.
  • Operating data 236 may include instructions for scooter operating rule sets for operating electrically powered scooter 110A.
  • Sensor data 232 may include infrastructure data, such as image data, signature data, or any other data indicative of infrastructure proximate to electrically powered scooter 110A.
  • communication units 214 may receive, from an image sensor 102, image data indicative of infrastructure proximate to electrically powered scooter 110A and may store the image data in sensor data 232.
  • Image data may include one or more images that are received from one or more image sensors, such as image sensors 102.
  • the images are bitmaps, Joint Photographic Experts Group images (JPEGs), Portable Network Graphics images (PNGs), or any other suitable graphics file formats.
  • the image data includes images of one or more infrastructure articles 107 of FIG. 1.
  • the image data includes images of one or more article message 126 associated with one or more infrastructure articles 107.
  • computing device 116A may dynamically control operations of electrically powered scooter 110A based at least in part on sensor data 232.
  • Interpretation component 118A may determine a type of the location at which electrically powered scooter 110A is located based at least in part on the infrastructure data and/or communication data received from another electrically powered scooter (e.g., scooter 110B), an infrastructure article, or a vehicle (e.g., vehicle 104A).
  • interpretation component 118A may determine whether electrically powered scooter 110A is permitted to be in the location where electrically powered scooter 110A is currently located.
  • Interpretation component 118A may determine whether electrically powered scooter 110A is permitted in its current location based at least in part on the type of the current location and/or the presence of a vehicle, pedestrian, or another scooter in proximity to electrically powered scooter 110A.
  • interpretation component 118A causes control component 144 to adjust control of electrically powered scooter 110A based on the type of location at which electrically powered scooter 110A is currently located or whether electrically powered scooter 110A is permitted in that location. For example, interpretation component 118A may cause control component 144 to adjust operation of the electric motor and/or adjust operation of the braking assembly (e.g., to adjust a speed of electrically powered scooter 110A).
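One plausible reading of the control adjustment above is a policy table mapping location types to speed limits, consulted by a motor/brake command function. The table values, location-type names, and the `motor_command` interface below are assumptions for illustration, not taken from the disclosure.

```python
# Assumed policy table: maximum permitted speed (km/h) per location type.
# A limit of 0.0 models a location where the scooter is not permitted.
SPEED_LIMITS_KPH = {
    "bicycle_pathway": 20.0,
    "scooter_parking_zone": 5.0,
    "pedestrian_pathway": 0.0,
}

DEFAULT_LIMIT_KPH = 15.0  # assumed fallback for unrecognized location types


def motor_command(location_type: str, current_speed: float) -> tuple:
    """Return a (command, target_speed) pair for the control component.

    Stands in for interpretation component 118A directing control
    component 144 to adjust the electric motor or braking assembly.
    """
    limit = SPEED_LIMITS_KPH.get(location_type, DEFAULT_LIMIT_KPH)
    if current_speed > limit:
        return ("brake", limit)
    return ("maintain", current_speed)
```
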
  • Interpretation component 118A may determine whether electrically powered scooter 110A will experience a scooter-specific event.
  • Interpretation component 118A may determine whether electrically powered scooter 110A will experience a scooter-specific event based at least in part on sensor data received from sensors 117 and/or communication data received from another electrically powered scooter 110, a vehicle 104, an infrastructure article 107, remote computing system 150, or any other computing device.
  • interpretation component 118A performs one or more operations in response to determining that electrically powered scooter 110A will experience a scooter-specific event.
• interpretation component 118A may cause scooter 110A to adjust a speed (e.g., by changing operation of a braking apparatus or electric motor) of electrically powered scooter 110A, cause communication unit 214 to output a message to another computing device (e.g., one of vehicles 104), or output an alert (e.g., audible, visual, and/or haptic).
  • FIG. 4 is a conceptual diagram of an electrically powered scooter 110A.
• Electrically powered scooter 110A includes a chassis 402, a rear wheel 404, a front wheel 406, and a steering assembly 408.
  • Chassis 402 includes chassis support member 412 extending substantially horizontally between a rear-wheel mount 414 at one end of chassis 402 and a front- wheel mount 416 at another end of chassis 402 that is opposite the rear-wheel mount 414.
  • rear wheel 404 is mounted to rear wheel mount 414 and front wheel 406 is mounted to front wheel mount 416.
• Front wheel 406 is mounted to front wheel mount 416 for turning steering movement with respect to front wheel mount 416 and rear wheel 404.
  • Front wheel mount 416 may be coupled to steering assembly 408.
• Steering assembly 408 may extend generally vertically relative to chassis support member 412. Steering assembly 408 may be angled relative to chassis support member 412. In one example, an angle between chassis support member 412 and steering assembly 408 is between approximately 60 degrees and approximately 90 degrees.
  • Steering assembly 408 may include handlebars 410. Steering assembly 408 may be coupled to front wheel mount 416 such that turning handlebars 410 may cause front wheel 406 to turn.
  • Electrically powered scooter 110A includes at least one electric motor 420, at least one motor controller 422, and at least one battery 424.
  • Motor controller 422 may be operatively coupled to electric motor 420 to drive rear wheel 404 and/or front wheel 406.
  • electric motor 420 is configured to drive rear wheel 404
  • electric motor 420 may be configured to drive front wheel 406.
  • electrically powered scooter 110A includes a plurality of motors that are each configured to drive a respective wheel.
  • Electrically powered scooter 110A may include a braking apparatus 430.
  • braking apparatus 430 is operatively coupled to rear wheel 404 to selectively slow and/or stop rear wheel 404.
  • electrically powered scooter 110A includes a braking apparatus coupled to front wheel 406.
  • FIG. 5 is a flow diagram illustrating example operation of a computing device for dynamically controlling an electrically powered scooter, in accordance with one or more techniques of this disclosure.
  • the techniques are described in terms of computing device 116A. However, the techniques may be performed by other computing devices.
  • computing device 116A receives infrastructure data indicative of infrastructure proximate to electrically powered scooter 110A from one or more sensors (502).
  • the infrastructure data may include image data or signature data.
  • Computing device 116A may determine a type of the location at which electrically powered scooter 110A is located based at least in part on the infrastructure data (504). For example, computing device 116A may perform image processing on image data included within the infrastructure data to identify the type of location. As another example, computing device 116A may determine the type of location in which electrically powered scooter 110A is located based on signature data.
  • computing device 116A performs one or more operations based at least in part on the type of the location in which electrically powered scooter 110A is located (506). For example, computing device 116A may determine whether electrically powered scooter 110A is permitted in its current location based at least in part on the type of the location.
  • Computing device 116A may perform operations that cause electrically powered scooter 110A to change operation of the electric motor or change operation of the brake assembly. For example, computing device 116A may output a command causing the electric motor to slow down or speed up. As another example, computing device 116A may output a command to cause a brake assembly to apply the brakes. In some examples, computing device 116A performs one or more operations by outputting an alert (e.g., audible, visual, and/or haptic) or outputting a message to another computing device (e.g., one of vehicles 104 or remote computing system 150). In some examples, the message indicates the presence of electrically powered scooter 110A.
• the message indicates usage data for electrically powered scooter 110A, such as a current location of electrically powered scooter 110A, whether the current location of electrically powered scooter 110A is permitted, a type of the current location (e.g., a transportation pathway 106, a park, a scooter parking zone, etc.), an amount of time that electrically powered scooter 110A has been in its current location, or information indicating the occurrence of a scooter-specific event.
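The three steps of FIG. 5 (502, 504, 506) can be sketched as one iteration of a control loop, with the sensor, the location classifier, and the operation policy passed in as callables. All three stubs in the test below are hypothetical stand-ins for the components described above.

```python
def control_loop(sensor, classify_location, policy):
    """One iteration of the FIG. 5 flow.

    sensor:            callable returning infrastructure data (step 502)
    classify_location: callable mapping that data to a location type (step 504)
    policy:            callable mapping the location type to operation(s) (step 506)
    """
    infra = sensor()                           # 502: receive infrastructure data
    location_type = classify_location(infra)   # 504: determine location type
    return policy(location_type)               # 506: perform operation(s)
```
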
  • Example 1 A method comprising: receiving, from a sensor, infrastructure data indicative of infrastructure proximate to an electrically powered scooter; determining, based at least in part on the infrastructure data, a type of a location in which the electrically powered scooter is physically located; and performing at least one operation based at least in part on the type of the location in which the electrically powered scooter is physically located.
  • Example 2 The method of example 1, further comprising: determining, based at least in part on the type of the location, whether the electrically powered scooter is permitted to be in the location in which the electrically powered scooter is physically located; and performing the at least one operation in response to determining that the electrically powered scooter is not permitted to be in the location in which the electrically powered scooter is located.
• Example 3 The method of example 2, wherein a type of location in which the electrically powered scooter is not permitted to operate comprises at least one of a pedestrian pathway or a driving lane of a vehicle pathway.
  • Example 4 The method of example 2, wherein the infrastructure data includes data indicative of at least one infrastructure article comprising at least a portion of a pavement marking; wherein determining whether the electrically powered scooter is permitted to be in the location in which the electrically powered scooter is physically located further comprises determining whether the electrically powered scooter is within a threshold distance of the portion of the pavement marking.
  • Example 5 The method of example 1, wherein performing the at least one operation comprises adjusting operation of the motor or adjusting operation of a braking apparatus to slow the at least one of the chassis-supported front wheel or chassis-supported rear wheel.
  • Example 6 The method of example 1, wherein the sensor comprises an image sensor, and wherein determining the type of the location in which the electrically powered scooter is physically located comprises: determining, based at least in part on the infrastructure data received from the sensor, a distance between the electrically powered scooter and the at least one infrastructure article; and determining the type of the location based at least in part on the distance.
  • Example 7 The method of example 1, wherein determining the type of the location in which the electrically powered scooter is physically located is further based on data received via a dedicated short range communication (DSRC) transceiver from at least one of another electrically powered scooter, a vehicle, or at least one infrastructure article.
  • Example 8 The method of example 1, wherein the infrastructure data includes data indicative of at least one infrastructure article comprising at least one of a roadway sign, a license plate, or conspicuity tape.
  • Example 9 The method of example 1, wherein the location is a bicycle pathway adjacent to a vehicle driving lane.
  • Example 10 The method of example 1, wherein determining whether the electrically powered scooter is permitted to be in the location in which the electrically powered scooter is physically located is based at least in part on a global positioning system (GPS) location of the electrically powered scooter.
  • Example 11 The method of example 1, wherein performing the at least one operation is based at least in part on detection of at least one of a vehicle, another electrically powered scooter, or a pedestrian.
  • Example 12 The method of example 1, wherein performing the at least one operation comprises generating an output.
  • Example 13 The method of example 12, wherein generating the output is in response to determining that an amount of time the electrically powered scooter has been in the location in which the electrically powered scooter is not permitted satisfies a threshold time duration.
• Example 14 The method of example 12, wherein generating the output is in response to determining that a confidence level indicative of a probability that the electrically powered scooter is not permitted to be in the location in which the electrically powered scooter is located satisfies a threshold confidence level.
  • Example 15 The method of example 12, wherein the output is at least one of audio, visual or haptic output.
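Examples 13 and 14 can each be read as a gating predicate on output generation, one on dwell time and one on classification confidence. A minimal sketch follows; the threshold names and default values are assumptions for illustration.

```python
def output_due_to_dwell(time_in_location_s: float, threshold_s: float = 60.0) -> bool:
    """Example 13 reading: generate output once the time spent in a
    not-permitted location satisfies a threshold time duration."""
    return time_in_location_s >= threshold_s


def output_due_to_confidence(confidence: float, threshold: float = 0.8) -> bool:
    """Example 14 reading: generate output once the confidence that the
    scooter is not permitted in its location satisfies a threshold."""
    return confidence >= threshold
```
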
  • Example 16 The method of example 1, wherein performing the at least one operation comprises sending a message to a remote computing device indicating that the electrically powered scooter is not at a physical location in which the electrically powered scooter is permitted to be physically located.
  • Example 17 The method of example 1, further comprising: determining, based at least in part on sensor data, a scooter-specific event that is relevant only to the particular class of vehicles configured with the set of characteristics that correspond to the electrically powered scooter; and performing at least one operation based at least in part on the determination of the scooter-specific event that is relevant only to the particular class of vehicles configured with the set of characteristics that correspond to the electrically powered scooter.
  • Example 18 The method of example 17, wherein the sensor is a first Dedicated Short- Range Communications transceiver, wherein determining the scooter-specific event comprises: receiving a message from a second Dedicated Short-Range Communications transceiver configured at the second electrically powered scooter; and determining the second electrically powered scooter is within the threshold distance of the first electrically powered scooter based at least in part on the message.
  • Example 19 The method of example 18, wherein determining the scooter-specific event comprises determining that the second electrically powered scooter is physically behind the first electrically powered scooter.
  • Example 20 The method of example 18, wherein determining the scooter-specific event comprises determining, based at least in part on detection of the proximity of the second electrically powered scooter and the sensor data, whether a collision between the first electrically powered scooter and the second electrically powered scooter will occur; and wherein performing at least one operation comprises generating an output, changing the operation of the motor controller that drives the at least one of the chassis-supported front wheel or rear wheel, or changing the operation of a braking apparatus.
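Under a constant-closing-speed assumption, the collision determination of Example 20 could reduce to a time-to-collision test against a short horizon. The 2-second horizon and the function names below are illustrative assumptions, not part of the claims.

```python
def time_to_collision(gap_m: float, closing_speed_mps: float):
    """Naive constant-speed time-to-collision between two scooters.

    Returns the time in seconds until the gap closes, or None when the
    gap is not closing (zero or negative closing speed)."""
    if closing_speed_mps <= 0:
        return None
    return gap_m / closing_speed_mps


def collision_predicted(gap_m: float, closing_speed_mps: float,
                        horizon_s: float = 2.0) -> bool:
    """True when a collision is predicted within the assumed horizon,
    i.e. when an output or motor/brake change would be triggered."""
    ttc = time_to_collision(gap_m, closing_speed_mps)
    return ttc is not None and ttc <= horizon_s
```
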
  • Example 21 The method of example 17, wherein the set of characteristics is a first set of characteristics, the method further comprising: determining, based on the sensor data, that a vehicle proximate to the scooter is configured with a second set of characteristics that correspond to an automobile, wherein the first and second set of characteristics are different; wherein determining the scooter-specific event is based at least in part on the determination that the vehicle is configured with the second set of characteristics that correspond to the automobile.
  • Example 22 The method of example 21, wherein the at least one operation is a first operation, wherein the first operation is based at least in part on the determination that the vehicle is configured with the second set of characteristics that correspond to the automobile, wherein a second operation is based at least in part on an object configured with a third set of characteristics that do not correspond to the automobile, and wherein performing the at least one operation comprises performing the first operation based at least in part on determination that the vehicle is configured with the second set of characteristics that correspond to the automobile.
  • Example 23 The method of example 22, wherein the first operation is based at least in part on the determination that the vehicle is configured with the second set of characteristics that correspond to the automobile.
  • Example 24 The method of example 17, wherein performing the at least one operation comprises outputting at least one of audio, visual or haptic output.
• Example 25 The method of example 17, wherein determining the scooter-specific event comprises determining the presence of a temporary traffic control zone in a path of the electrically powered scooter.
• Example 26 The method of example 17, wherein the electrically powered scooter includes one or more portions of conspicuity tape comprising one or more characteristics within the set of characteristics that correspond to the electrically powered scooter.
  • Example 27 The method of example 17, wherein performing the at least one operation comprises sending a message to an automobile that indicates the presence of the electrically powered scooter.
  • Example 28 The method of example 17, wherein determining the scooter-specific event comprises determining whether a pavement marking is present or absent from a portion of a roadway.
• Example 29 The method of example 17, wherein the set of characteristics is a first set of characteristics, the method further comprising: determining, based on the sensor data, that an object proximate to the scooter is characterized by a second set of characteristics that correspond to a person, wherein the first and second set of characteristics are different; and determining the scooter-specific event based at least in part on the determination that the object is characterized by the second set of characteristics that correspond to the person.
  • Example 30 The method of example 17, wherein performing the at least one operation comprises sending a message to a remote computing device indicating that the electrically powered scooter determined, based at least in part on the sensor data, the scooter-specific event.
  • Example 31 The method of example 30, wherein a user profile stored at the remote computing device is based at least in part on the message, and wherein the user profile is associated with a user of the electrically powered scooter and is usable to determine future use of another electrically powered scooter by the user.
  • Example 32 The method of example 30, wherein the message is usable by the remote computing device to determine a degree of non-compliance for a physical region based on a plurality of electrically powered scooters.
  • Computer-readable media may include computer-readable storage media, which corresponds to a tangible medium such as data storage media, or communication media including any medium that facilitates transfer of a computer program from one place to another, e.g., according to a communication protocol.
  • Computer-readable media generally may correspond to (1) tangible computer-readable storage media, which is non-transitory, or (2) a communication medium such as a signal or carrier wave.
  • Data storage media may be any available media that can be accessed by one or more computers or one or more processors to retrieve instructions, code and/or data structures for implementation of the techniques described in this disclosure.
  • A computer program product may include a computer-readable medium.
  • Such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage, or other magnetic storage devices, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium.
  • Coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave are included in the definition of medium.
  • Computer-readable storage media and data storage media do not include connections, carrier waves, signals, or other transient media, but are instead directed to non-transient, tangible storage media.
  • Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
  • Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
  • Accordingly, the term "processor," as used herein, may refer to any of the foregoing structure or any other structure suitable for implementation of the techniques described.
  • The functionality described may be provided within dedicated hardware and/or software modules. Also, the techniques could be fully implemented in one or more circuits or logic elements.
  • The techniques of this disclosure may be implemented in a wide variety of devices or apparatuses, including a wireless handset, an integrated circuit (IC) or a set of ICs (e.g., a chip set).
  • Various components, modules, or units are described in this disclosure to emphasize functional aspects of devices configured to perform the disclosed techniques, but do not necessarily require realization by different hardware units. Rather, as described above, various units may be combined in a hardware unit or provided by a collection of interoperative hardware units, including one or more processors as described above, in conjunction with suitable software and/or firmware.
  • A computer-readable storage medium includes a non-transitory medium.
  • The term "non-transitory" indicates, in some examples, that the storage medium is not embodied in a carrier wave or a propagated signal.
  • A non-transitory storage medium stores data that can, over time, change (e.g., in RAM or cache).

Abstract

An electrically powered scooter includes a scooter chassis, a front wheel, a rear wheel, a motor, a sensor, and a computing device. The sensor is configured to generate infrastructure data indicative of infrastructure proximate to the electrically powered scooter. The computing device includes a memory and one or more computer processors. The memory includes instructions that, when executed by the one or more computer processors, cause the one or more computer processors to: receive, from the sensor, the infrastructure data; determine, based at least in part on the infrastructure data, a type of a location in which the electrically powered scooter is physically located; and perform at least one operation based at least in part on the type of the location in which the electrically powered scooter is physically located.

Description

DYNAMICALLY CONTROLLING ELECTRICALLY POWERED SCOOTERS BASED
ON SENSED INFRASTRUCTURE DATA
TECHNICAL FIELD
[0001] The present application relates generally to electrically powered scooters and roadway infrastructure.
BACKGROUND
[0002] Electric scooters are often used to transport people over relatively short distances. A user of an electric scooter typically rides the scooter on a roadway, street, pathway or a sidewalk, and frequently may use the scooter in urban or campus settings as a convenient mode of
transportation. In many situations, the roadway / street used by the scooter (or adjacent to the path or lane used by the scooter) may be occupied by vehicles travelling at relatively high speeds compared to the scooter. Moreover, sidewalks are often occupied by pedestrians travelling at relatively low speeds compared to the scooter. Navigating roadways, streets, paths and/or sidewalks may pose a risk to the safety of the user of the electric scooter, occupants of a vehicle, pedestrians, or any other person, pet, or property in proximity to the scooter.
SUMMARY
[0003] In general, this disclosure describes techniques for dynamically controlling and/or influencing the operation of electrically powered scooters based on sensed infrastructure data. In various examples described herein, an electrically powered scooter (referred to generally herein simply as a scooter) includes a computing device to control operation of the scooter and one or more inputs to sense (i.e., capture or receive) data from infrastructure within an environment in which the scooter is or will be operated. As one example, the computing device may, based on sensed infrastructure data, automatically adjust a speed at which the scooter is permitted to travel or cause the scooter to slow down or stop, for example, based on the type of pathway (e.g., a vehicle pathway, a bicycle pathway, or a pedestrian pathway). For example, the computing device may set the maximum speed for the scooter to one speed upon determining, from the sensed infrastructure data, that the scooter is currently located or will be operating on a vehicle pathway (e.g., 15 miles per hour on a road) and may automatically set the maximum speed to another, lower speed upon determining from the infrastructure data that the scooter is located on or will be used on a pedestrian pathway (e.g., a sidewalk). As another example, the computing device may control the scooter based on determining, according to the infrastructure data, whether the scooter is located within or proximate to a physical region in which the scooter is permitted to be physically located. For example, the computing device may output a command to apply a brake or gracefully shut off power to an electric motor when the scooter is located in or proximate to a prohibited location (e.g., proximate a building entrance).
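The speed-cap behavior described above can be sketched as a simple lookup from pathway type to a maximum permitted speed. This is an illustrative sketch only: the patent does not prescribe a data structure, and the pathway labels and speed values here (other than the 15 mph road example in the text) are assumptions.

```python
# Hedged sketch of pathway-type -> speed-cap selection; labels and values
# are illustrative assumptions, not specified by the disclosure.
VEHICLE_PATHWAY = "vehicle"
BICYCLE_PATHWAY = "bicycle"
PEDESTRIAN_PATHWAY = "pedestrian"

# Hypothetical maximum speeds in miles per hour per pathway type.
MAX_SPEED_MPH = {
    VEHICLE_PATHWAY: 15.0,     # matches the 15 mph road example in the text
    BICYCLE_PATHWAY: 12.0,     # assumed value
    PEDESTRIAN_PATHWAY: 6.0,   # assumed lower cap for sidewalks
}

def max_speed_for(pathway_type: str) -> float:
    """Return the speed cap a motor controller might enforce.

    An unrecognized pathway type falls back to the most conservative cap.
    """
    return MAX_SPEED_MPH.get(pathway_type, min(MAX_SPEED_MPH.values()))
```

A deployment would feed the result to the motor controller; the fallback-to-lowest-cap choice errs on the side of safety when classification is uncertain.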
[0004] In this way, the computing device may control operation of the scooter based on infrastructure data captured from or otherwise received (e.g., from a local or cloud-based computing system), where that infrastructure data is specific to the physical location in which the scooter is currently located or planned for operation. By controlling operation of the scooter based on the location of the scooter, using data captured from or otherwise received for physical infrastructure within the environment, the computing device may improve the safety of the user, nearby pedestrians, and/or occupants of nearby motor vehicles.
[0005] In some examples, the disclosure describes a system comprising: an infrastructure article; a micro-mobility device comprising a sensor configured to generate infrastructure data indicative of the infrastructure article; and a computing device comprising a memory and one or more computer processors, wherein the memory comprises instructions that when executed by the one or more computer processors cause the one or more computer processors to: receive, from the sensor, the infrastructure data; determine, based at least in part on the infrastructure data, a type of a location in which the micro-mobility device is physically located; and perform at least one operation based at least in part on the type of the location in which the micro-mobility device is physically located.
[0006] In some examples, an electrically powered scooter comprises a scooter chassis having a rear wheel mount at one end and a front wheel mount at the other end with a chassis support member extending therebetween; a chassis-supported front wheel mounted to the front wheel mount for turning steering movement with respect to the front wheel mount and a chassis-supported rear wheel; a chassis-supported motor physically coupled to the scooter chassis and configured by a motor controller to drive at least one of the chassis-supported front wheel or chassis-supported rear wheel for powered movement over a ground surface; a sensor configured to generate infrastructure data indicative of infrastructure proximate to the electrically powered scooter; a computing device comprising a memory and one or more computer processors, wherein the memory comprises instructions that when executed by the one or more computer processors cause the one or more computer processors to: receive, from the sensor, the infrastructure data; determine, based at least in part on the infrastructure data, a type of a location in which the electrically powered scooter is physically located; and perform at least one operation based at least in part on the type of the location in which the electrically powered scooter is physically located. [0007] In another example, a computing device comprises memory and one or more processors connected to the memory. The memory includes instructions that, when executed by the one or more processors, cause the computing device to receive, from a sensor, infrastructure data indicative of infrastructure proximate to an electrically powered scooter; determine, based at least in part on the infrastructure data, a type of a location in which the electrically powered scooter is physically located; and perform at least one operation based at least in part on the type of the location in which the electrically powered scooter is physically located.
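The receive / determine / perform sequence recited in paragraphs [0006] and [0007] can be sketched as a small control loop. The class name, injected callables, and stub logic below are assumptions for illustration; the disclosure leaves the concrete classification and control algorithms open.

```python
# Toy model of the receive -> classify -> act loop; all names are
# hypothetical and not part of the disclosure.
from dataclasses import dataclass
from typing import Callable

@dataclass
class ScooterController:
    """Wires sensor data to a location classifier and an operation."""
    classify: Callable[[bytes], str]   # infrastructure data -> location type
    act: Callable[[str], str]          # location type -> operation performed

    def on_sensor_data(self, infrastructure_data: bytes) -> str:
        # Step 1: receive infrastructure data (the argument).
        # Step 2: determine the type of location.
        location_type = self.classify(infrastructure_data)
        # Step 3: perform at least one operation based on that type.
        return self.act(location_type)

# Example wiring with stub functions standing in for real sensing logic.
ctrl = ScooterController(
    classify=lambda data: "sidewalk" if b"pedestrian" in data else "road",
    act=lambda loc: "limit_speed" if loc == "sidewalk" else "normal",
)
```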
[0008] The details of one or more examples are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the disclosure will be apparent from the description and drawings, and from the claims.
BRIEF DESCRIPTION OF DRAWINGS
[0009] FIG. 1 is a conceptual diagram illustrating an example system for dynamically controlling an electrically powered scooter, in accordance with techniques of this disclosure.
[0010] FIG. 2 is a block diagram illustrating an example system for dynamically controlling an electrically powered scooter, in accordance with techniques of this disclosure.
[0011] FIG. 3 is a block diagram illustrating an example computing device, in accordance with techniques of this disclosure.
[0012] FIG. 4 is a schematic diagram illustrating an example electrically powered scooter, in accordance with techniques of this disclosure.
[0013] FIG. 5 is a flow diagram illustrating example operation of a computing device for dynamically controlling an electrically powered scooter, in accordance with techniques of this disclosure.
DETAILED DESCRIPTION
[0014] FIG. 1 is a conceptual diagram illustrating an example physical environment having a transportation system that includes one or more electrically powered scooters, in accordance with techniques of this disclosure. In the example of FIG. 1, transportation system 100 includes a variety of different infrastructure elements (generally referred to as "infrastructure"). As shown in the example of FIG. 1, infrastructure may include dedicated transportation pathways 106A-106D (collectively, transportation pathways 106) as well as infrastructure articles 107A-107E (collectively, infrastructure articles 107) positioned and oriented within the environment.
[0015] As shown in FIG. 1, system 100 includes one or more micro-mobility devices. Examples of micro-mobility devices include electrically-powered food delivery devices, electrically powered hoverboards or skateboards, electrically powered scooters 110A-110C (collectively, electrically powered scooters 110), or other small-profile devices that may use or travel upon a roadway or sidewalk. Electrically powered scooters 110 (also referred to herein simply as scooters 110) may operate on transportation pathways 106. As described in more detail with reference to FIG. 3, in this example, each of electrically powered scooters 110 includes a chassis, a front wheel, a back wheel, an electric motor, a steering assembly, and a battery 119. In this example, the chassis includes a rear-wheel mount at one end of the chassis, a front-wheel mount at another end of the chassis that is opposite the rear-wheel mount, and a chassis support extending horizontally between the rear-wheel mount and the front-wheel mount. The front and rear wheels are mounted to the front and rear wheel mounts of the chassis, respectively. The front wheel mount is coupled to a steering assembly. In some examples, the steering assembly includes handlebars such that turning the handlebars causes the front wheel to turn. In some examples, the electric motor is physically coupled to the scooter chassis and is configured by a motor controller to drive at least one of the chassis-supported front wheel or chassis-supported rear wheel for powered movement over a ground surface.
[0016] Examples of transportation pathways 106 include a vehicle pathway (e.g., pathway 106A, 106D), a bicycle pathway (e.g., pathway 106B), or a pedestrian pathway (e.g., pathway 106C), among others. In other examples, transportation pathways 106 may be sidewalks, public spaces, or other surfaces not specifically dedicated to certain types of vehicles or traffic. Vehicle pathways (e.g., 106A) may be used by vehicles 104A-104C (collectively, vehicles 104) to transport people or goods. Examples of vehicles 104 include automobiles (e.g., 104B, 104C) such as cars, trucks, passenger vans; buses; motorcycles; recreational vehicles (RVs); or lorries (e.g., 104A), etc. Examples of vehicle pathways can also include alleys, streets, and highways (or a vehicle specific portion thereof, such as a vehicle driving lane), among others. Bicycle pathways (e.g., 106B) may be used by bicycles or vehicles and bicycles. Examples of bicycle pathways include a street or a portion of a street designated for bicycles, a bicycle trail, among others. In some instances, a pedestrian pathway (e.g., 106C) is primarily used by pedestrians 108.
Examples of pedestrian pathways include a pedestrian sidewalk or a jogging path. In some examples, one of transportation pathways 106 may include two or more different types of pathways. For instance, transportation pathway 106A may include a vehicle driving lane of a vehicle pathway and a bicycle pathway adjacent to the driving lane. Transportation pathways 106 may include portions not limited to the respective pathways themselves. In the example of transportation pathway 106A (e.g., a vehicle pathway), transportation pathway 106 may include the road shoulder, physical structures near the pathway such as toll booths, railroad crossing equipment, traffic lights, guardrails, and generally encompassing any other properties or characteristics of the pathway or objects/structures in proximity to the pathway.
[0017] Examples of infrastructure articles include a pavement marking (e.g., infrastructure article 107A), a roadway sign (e.g., infrastructure article 107B), a license plate (e.g., infrastructure article 107C), a conspicuity tape (e.g., infrastructure article 107D), and a hazard marker (e.g., infrastructure article 107E, such as a construction barrel, a traffic cone, a traffic barricade, a safety barrier, among others). Pavement markings may include liquid markings, tape, or raised pavement markings to name only a few examples. In some examples, pavement markings may include sensors, materials, or structures that permit the detection of the marking and/or communication of information between the pavement marking and a receiving device. Additional examples of infrastructure articles 107 include traffic lights, guardrails, billboards, electronic traffic sign (also referred to as a variable-message sign), among others. Infrastructure articles 107 may include information that may be detected by one or more sensors of computing device 116.
[0018] In some examples, an infrastructure article, such as infrastructure article 107B, may include an article message 126 on the physical surface of the infrastructure article. Article message 126 may include characters, images, and/or any other information that may be printed, formed, or otherwise embodied on infrastructure article 107B. For example, each infrastructure article 107 may have a physical surface having article message 126 embodied thereon. Article message 126 may include human-perceptible information and machine-perceptible information.
[0019] Human-perceptible information may include information that indicates one or more first characteristics of a pathway, such as information typically intended to be interpreted by human drivers. In other words, the human-perceptible information may provide a human-perceptible representation that is descriptive of at least a portion of the transportation pathway. As described herein, human-perceptible information may generally refer to information that indicates a general characteristic of a transportation pathway and that is intended to be interpreted by a human driver. For example, the human-perceptible information may include words (e.g., "STOP" or the like), symbols, graphics (e.g., an arrow indicating the road ahead includes a sharp turn) or shapes (e.g., signs or lane markings). Human-perceptible information may include the color of the article, the article message or other features of the infrastructure article, such as the border or background color. For example, some background colors may indicate information only, such as "scenic overlook," while other colors may indicate a potential hazard (e.g., the red octagon of a stop sign, or the double yellow line of a no passing zone).
[0020] In some instances, the human-perceptible information may correspond to words or graphics included in a specification. For example, in the United States (U.S.), the human- perceptible information may correspond to words or symbols included in the Manual on Uniform Traffic Control Devices (MUTCD), which is published by the U.S. Department of Transportation (DOT) and includes specifications for many conventional signs for roadways. Other countries have similar specifications for traffic control symbols and devices.
[0021] Machine-perceptible information may generally refer to information configured to be interpreted by an electrically powered scooter. For example, article message 126 may be encoded via a 2-dimensional bar code, such as a QR code. In some examples, machine-perceptible information may be interpreted by a human driver. In other words, machine-perceptible information may include a feature of the graphical symbol that is a computer-interpretable visual property of the graphical symbol. In some examples, the machine-perceptible information may relate to the human-perceptible information, e.g., provide additional context for the human-perceptible information. In an example of an arrow indicating a sharp turn, the human-perceptible information may be a general representation of an arrow, while the machine-perceptible information may provide an indication of the shape of the turn including the turn radius, any incline of the roadway, a distance from the sign to the turn, or the like. The additional information may be visible to a human operator; however, the additional information may not be readily interpretable by the human operator, particularly at speed. In other examples, the additional information may not be visible to a human operator but may still be machine readable and visible to a vision system of an electrically powered scooter. In some examples, an enhanced infrastructure article may be an optically active article in that the infrastructure article is readily detectible by vision systems, which may include an infrared camera or other camera configured for detecting electromagnetic radiation in one or more bands of the electromagnetic spectrum, which may include the visible band, the infrared band, the ultraviolet band, and so forth. For example, the infrastructure articles may be reflective, such as retroreflective, within one or more bands of the electromagnetic spectrum that are readily detectible by vision systems of the computing device 116.
[0022] Article message 126 may indicate a variety of types of information. In some examples, article message 126 may, for instance, provide computing device 116 with static information related to a region of a pathway 106. Static information may include any information that is related to navigation of the pathway associated with article message 126, and not subject to change. For example, certain features of pathways 106 may be standardized and/or commonly used, such that article message 126 may correspond to a pre-defined classification or operating characteristic of the respective pathway. As some examples, article message 126 may indicate a navigational characteristic or feature of the pathway, an operating rule or set of operating rules of the pathway, or the like.
[0023] Infrastructure articles 107 may include a variety of indicators and/or markers. For example, infrastructure article 107 may include one or more of an optical tag, a radio-frequency identification tag, a radio-frequency tag, an acoustic surface pattern, or a material configured to provide a signature to a signature-sensing system. In some examples, electrically powered scooters 110 may receive data from infrastructure articles 107 via near-field communication (NFC) protocols and signals, laser, or infrared-based readers, or other communication type.
[0024] Electrically powered scooters 110 may each include one or more sensors that perceive characteristics of the environment, infrastructure, and other objects around electrically powered scooter 110A. Examples of sensors include an image sensor, sonar, LiDAR, among others. The sensors may generate sensor data indicative of sensed characteristics. For example, the sensor data may include infrastructure data indicative of the infrastructure proximate to a respective scooter of electrically powered scooters 110. An object may be proximate to a particular electrically powered scooter 110 when the object is detectable by one or more sensors of the particular electrically powered scooter 110. As one example, the infrastructure data may be indicative of one or more infrastructure articles 107 proximate to a respective scooter of electrically powered scooters 110.
[0025] In accordance with techniques of this disclosure, electrically powered scooter 110A includes computing device 116A configured to dynamically control operation of the electrically powered scooter based at least in part on the infrastructure data. As one example, computing device 116A may determine a type of location in which electrically powered scooter 110A is located based on the infrastructure data and may control operation of electrically powered scooter 110A based at least in part on the type of the location.
[0026] Computing device 116A may determine the type of the location in which electrically powered scooter 110A is currently located. Example types of locations include transportation pathways 106, parks, interiors of buildings, parking lots, etc. Computing device 116A may determine the type of location in which electrically powered scooter 110A is located based on infrastructure data. In some examples, the infrastructure data includes image data (e.g., images and/or videos) generated by one or more image sensors. Computing device 116A may perform one or more image processing algorithms on the image data to identify the type of the location. For instance, the image data may include an image of one or more infrastructure articles 107 proximate to electrically powered scooter 110A. In one example, computing device 116A may determine the type of location in which electrically powered scooter 110A is located is a bicycle pathway based on the image data. For example, computing device 116A may perform image processing to identify infrastructure articles 107A as pavement markings (also referred to as lane markings). Computing device 116A may determine that the type of location in which electrically powered scooter 110A is located is a bicycle pathway in response to determining that electrically powered scooter 110A is between two pavement markings. In other words, in one example, computing device 116A may determine that transportation pathway 106A is a bicycle pathway, and hence that the type of location in which electrically powered scooter 110A is located is a bicycle pathway. In some instances, computing device 116A determines electrically powered scooter 110A is located within a bicycle pathway based on the characteristics (e.g., color, width, double vs single line, distance between, etc.) of infrastructure articles 107A. Additional details of analyzing infrastructure data are described in U.S. Provisional Patent Application 62/622,469, filed January 26, 2018, and U.S. 
Provisional Patent Application 62/480,231, filed March 31,
2017, each of which is hereby incorporated by reference in their entirety. [0027] Computing device 116A may determine a distance between infrastructure articles 107A. For instance, computing device 116A may calculate a number of pixels between infrastructure articles 107A and calculate the number of pixels associated with a known or typical dimension (e.g., width) of a reference object (e.g., infrastructure article 107B) captured in one or more images of the image data. In such instances, computing device 116A may compare the number of pixels between infrastructure articles 107A to the number of pixels associated with the reference object to determine the distance between infrastructure articles 107A. As such, in one example, computing device 116A may determine that the type of location in which electrically powered scooter 110A is located is a bicycle pathway in response to determining that the distance between infrastructure articles 107A corresponds to a width of a bicycle pathway.
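The pixel-comparison technique of paragraph [0027] amounts to deriving a meters-per-pixel scale from a reference object of known size and applying it to the pixel gap between markings. The sketch below illustrates that arithmetic; the function names and the bike-lane width range are assumptions, and a real system would also correct for perspective, which this sketch ignores.

```python
def estimate_gap_meters(gap_px: float, reference_px: float,
                        reference_width_m: float) -> float:
    """Estimate the real-world distance between two lane markings.

    gap_px: pixel count between the markings in the image.
    reference_px: pixel width of a reference object of known size
                  (e.g., a sign face) in the same image.
    reference_width_m: known physical width of that reference object.

    Assumes both measurements lie at similar depth in the scene so the
    pixels-per-meter scale transfers between them.
    """
    meters_per_pixel = reference_width_m / reference_px
    return gap_px * meters_per_pixel

def looks_like_bike_lane(gap_m: float, lo: float = 1.2, hi: float = 2.0) -> bool:
    # Width range is an illustrative assumption for a typical bicycle lane.
    return lo <= gap_m <= hi
```

For example, a 300-pixel gap, scaled by a 0.75 m reference sign that spans 150 pixels, yields an estimated gap of 1.5 m, which falls in the assumed bike-lane range.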
[0028] In some examples, computing device 116A determines a type of transportation pathway 106B based on characteristics of the infrastructure. For example, computing device 116A may determine a color of transportation pathway 106A and determine that transportation pathway 106A is a bicycle pathway based on the color. In another example, computing device 116A may identify a symbol on the surface of transportation pathway 106A between infrastructure articles 107A and determine that transportation pathway 106A is a bicycle pathway based on the symbol.
[0029] In some instances, the image data includes data indicative of article message 126.
Computing device 116A may determine the type of location in which electrically powered scooter 110A is located based on article message 126. For instance, article message 126 may indicate a type of infrastructure article 107B, a type of transportation pathway 106 associated with infrastructure article 107B, or both. In one instance, computing device 116A may determine the type of location in which electrically powered scooter 110A is located is a bicycle pathway based on the article message 126.
[0030] Computing device 116A may determine a type of location in which electrically powered scooter 110A is currently located based at least in part on detecting one or more vehicles 104, pedestrians 108, electrically powered scooters 110, and/or bicycles. Computing device 116A may detect one or more vehicles 104 based on the infrastructure data (e.g., image data or other signature data). For example, computing device 116A may perform image processing of image data to detect one or more vehicles 104 and may determine transportation pathway 106A is a vehicle pathway. As another example, computing device 116A may perform image processing on the image data and determine that transportation pathway 106C includes pedestrians 108. In such examples, computing device 116A may determine that transportation pathway 106C is a pedestrian pathway. Similarly, computing device 116A may determine that transportation pathway 106B is a bicycle pathway in response to detecting bicycles and/or scooters 110B, 110C. Computing device 116A may determine on which of transportation pathways 106 electrically powered scooter 110A is located based on the image data. For example, computing device 116A may determine that electrically powered scooter 110A is located within a bicycle pathway (e.g., transportation pathway 106B) in response to determining that transportation pathway 106B is located in the middle of the image.
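Classifying a pathway from the kinds of objects detected on it, as paragraph [0030] describes, can be sketched as a vote over detection labels. The label sets and the majority-vote heuristic are assumptions; the disclosure only says that detected vehicles, pedestrians, bicycles, or scooters inform the determination.

```python
# Hedged sketch: majority vote over object-detection labels. Label names
# and the voting heuristic are illustrative assumptions.
from collections import Counter

def classify_pathway(detected_objects: list) -> str:
    """Return 'vehicle', 'pedestrian', 'bicycle', or 'unknown'."""
    votes = Counter()
    for obj in detected_objects:
        if obj in ("car", "truck", "bus", "lorry"):
            votes["vehicle"] += 1
        elif obj == "pedestrian":
            votes["pedestrian"] += 1
        elif obj in ("bicycle", "scooter"):
            votes["bicycle"] += 1
    if not votes:
        return "unknown"
    # The most frequently detected class wins the vote.
    return votes.most_common(1)[0][0]
```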
[0031] In some scenarios, computing device 116A may determine a type of location in which electrically powered scooter 110A is located based on communication data received from a computing device separate from electrically powered scooter 110A, such as another electrically powered scooter (e.g., scooter 110B), an infrastructure article, or a vehicle (e.g., vehicle 104A).
In some examples, computing device 116A receives the communication data via a dedicated short range communication (DSRC) transceiver. Additionally or alternatively, computing device 116A may receive communication data via any wireless communication device, such as a
BLUETOOTH device, a WIFI device, a GPS device, among others. For instance, the communication data may include data indicating the type of the location is a transportation pathway 106. In one instance, the communication data indicates GPS coordinates of electrically powered scooter 110A and computing device 116A may determine the type of location based on the GPS coordinates. In another example, the communication data may indicate a type of the sending device and computing device 116A may determine the type of location for electrically powered scooter 110A based on the type of the sending device. For example, the communication data may indicate the sending device is a vehicle, such as a lorry or semi-truck. In such examples, computing device 116A may determine that electrically powered scooter 110A is located in a transportation pathway 106 in response to determining the sending device is a vehicle. In some instances, the communication data includes data which was received from vehicles 104, infrastructure articles 107, or other scooters 110 that travelled proximate to the current location of electrically powered scooter 110A within a particular time duration of electrically powered scooter 110A arriving at its current location.
[0032] In some examples, the communication data may include data indicating a type of a roadway, a size of the roadway (e.g., a number of lanes), a speed of a vehicle 104, a speed limit for the roadway, among others. In some examples, the data indicating the type of the roadway may include data indicating the presence of an accident, the presence of a construction zone, the direction, speed, or congestion of traffic, road surface type, types of vehicles permitted or present on the roadway, number of lanes, complexity of traffic, or a combination thereof. For example, computing device 116A may receive data from vehicles 104 indicating a type of transportation pathway 106A.
[0033] In some examples, computing device 116A determines whether electrically powered scooter 110A is permitted in the location in which electrically powered scooter 110A is currently located. For example, computing device 116A may determine whether electrically powered scooter 110A is permitted in its current location based on the type of the current location and one or more rules. The rules may be pre-programmed or machine generated (e.g., using trained or untrained machine learning models). In some scenarios, computing device 116A determines based on the rule(s) that electrically powered scooter 110A is permitted in certain types of locations and is not permitted (e.g., may be prohibited) in different types of locations. For instance, computing device 116A may determine that electrically powered scooter 110A is permitted in its current location when electrically powered scooter 110A is located on one of transportation pathways 106. Similarly, computing device 116A may determine that electrically powered scooter 110A is not permitted in its current location when electrically powered scooter 110A is located within a building or on an athletic field (e.g., a baseball field, soccer field, etc.).
[0034] Electrically powered scooter 110A may be permitted in a subset of one type of location and may not be permitted in a different subset of the type of location. For example, computing device 116A may determine based on the rules that electrically powered scooter 110A is permitted on transportation pathways 106A and 106B and that electrically powered scooter 110A is not permitted on transportation pathway 106C.
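The pre-programmed permission rules of paragraphs [0033] and [0034] can be sketched as a rule table. This is a hypothetical Python illustration; the location-type names and the permitted/prohibited assignments are assumptions drawn from the examples above:

```python
# Illustrative pre-programmed rule table: permitted and prohibited
# location types for electrically powered scooter 110A.
PERMITTED_LOCATIONS = {
    "vehicle_pathway": True,      # e.g., pathway 106A
    "bicycle_pathway": True,      # e.g., pathway 106B
    "pedestrian_pathway": False,  # e.g., pathway 106C
    "building": False,
    "athletic_field": False,
}

def is_permitted(location_type):
    """Unknown location types default to 'not permitted' (an assumption)."""
    return PERMITTED_LOCATIONS.get(location_type, False)
```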
[0035] For example, computing device 116A may determine that electrically powered scooter 110A is not permitted to travel through construction zone 111 (or any other temporary traffic control zone).
[0036] Alternatively or additionally to determining whether electrically powered scooter 110A is permitted in its current location based on the type of the current location, in some scenarios, computing device 116A determines whether electrically powered scooter 110A is permitted in its current location based at least in part on the presence of a vehicle 104, scooter 110, pedestrian 108, or a combination thereof. For example, computing device 116A may determine that electrically powered scooter 110A is not permitted in its current location in response to detecting one or more of vehicles 104, scooters 110, or pedestrians 108.
[0037] Computing device 116A may determine whether electrically powered scooter 110A is permitted in its current location based on a distance to an infrastructure article 107. For example, computing device 116A may detect (e.g., based on the image data or signature data) a pavement marking (e.g., infrastructure articles 107A) and determine a distance to the pavement marking. Computing device 116A may determine that electrically powered scooter 110A is permitted in its current location in response to determining that the distance to the pavement marking is within a threshold distance.
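The distance-to-marking check of paragraph [0037] can be sketched as follows. This is a hypothetical Python illustration; the coordinate representation and the threshold value are assumptions, not part of the disclosure:

```python
import math

# Illustrative sketch: the scooter is treated as permitted when a detected
# pavement marking (e.g., infrastructure article 107A) lies within a
# threshold distance. Positions and threshold are hypothetical values.

def distance_to_marking(scooter_xy, marking_xy):
    """Planar Euclidean distance, in meters, between scooter and marking."""
    return math.hypot(marking_xy[0] - scooter_xy[0],
                      marking_xy[1] - scooter_xy[1])

def permitted_by_marking(scooter_xy, marking_xy, threshold_m=2.0):
    return distance_to_marking(scooter_xy, marking_xy) <= threshold_m
```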
[0038] Computing device 116A performs an operation based at least in part on the type of location in which electrically powered scooter 110A is located, whether electrically powered scooter 110A is permitted in its current location, a type of a roadway, presence of vehicles 104, pedestrians 108, and/or other electrically powered scooters 110, or a combination thereof.
[0039] In some examples, computing device 116A performs an operation to adjust operation of the electric motor of electrically powered scooter 110A. For example, computing device 116A may perform an operation based on the type of location and/or in response to determining that electrically powered scooter 110A is not permitted in the location in which it is currently located. For example, computing device 116A may cause a motor controller to adjust (e.g., increase or decrease) the speed of the electric motor, and hence, the speed of electrically powered scooter 110A. In one scenario, computing device 116A adjusts a maximum allowable speed based on the type of location. For example, computing device 116A may enable the electric motor to drive the wheels at a first speed when electrically powered scooter 110A is located on a vehicle pathway (e.g., pathway 106A) and may enable the electric motor to drive the wheels at a different (e.g., lower) speed when electrically powered scooter 110A is located on a pedestrian pathway (e.g., 106C). In another example, computing device 116A may perform an operation to adjust operation of a braking apparatus coupled to the back wheel, the front wheel, or both. For instance, computing device 116A may cause the braking apparatus to slow and/or stop one or both of the wheels.
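The location-dependent maximum-speed adjustment of paragraph [0039] can be sketched as follows. This is a hypothetical Python illustration; the speed limits per location type are invented values, not taken from the disclosure:

```python
# Illustrative sketch: select a maximum allowable speed based on the type
# of location, then clamp the commanded speed. Limits are hypothetical.
MAX_SPEED_KPH = {
    "vehicle_pathway": 25.0,     # e.g., pathway 106A
    "bicycle_pathway": 20.0,     # e.g., pathway 106B
    "pedestrian_pathway": 8.0,   # e.g., pathway 106C: a lower limit
}

def clamp_speed(requested_kph, location_type):
    """Return the motor speed to command, never exceeding the location's limit."""
    limit = MAX_SPEED_KPH.get(location_type, 8.0)  # conservative default
    return min(requested_kph, limit)
```

Under this sketch, moving from a vehicle pathway onto a pedestrian pathway at 25 km/h would cause the motor controller to be commanded down to 8 km/h.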
[0040] In some instances, computing device 116A performs the operation based at least in part on a type of a roadway. For example, computing device 116A may determine that the current location of electrically powered scooter 110A is a transportation pathway 106 (e.g., pathway 106B). Computing device 116A may adjust a current speed and/or a maximum allowable speed of electrically powered scooter in response to receiving data indicating that transportation pathway 106D includes a construction zone 111.
[0041] Computing device 116A may perform the at least one operation based at least in part on whether computing device 116A detected the presence of vehicles 104, pedestrians 108, and/or other electrically powered scooters 110. For example, computing device 116A may adjust a speed of electrically powered scooter 110A (e.g., via the electric motor and/or braking apparatus) in response to detecting pedestrian 108B, for example, regardless of the type of location in which electrically powered scooter 110A is located. As another example, computing device 116A may adjust the speed of electrically powered scooter 110A based on the type of location and the presence of vehicles 104, pedestrians 108, and/or other electrically powered scooters 110. For example, computing device 116A may adjust the speed of electrically powered scooter 110A when electrically powered scooter 110A is located on a bicycle pathway and pedestrian 108B is detected. In one example, computing device 116A may maintain the speed of electrically powered scooter 110A when electrically powered scooter 110A is located on a vehicle pathway and pedestrian 108B is detected on a pedestrian pathway.
[0042] Computing device 116A may perform the at least one operation by generating an output. For example, the output may include an audio output, a visual output, a haptic output, or a combination thereof. As one example, computing device 116A may output a visual alert via one or more LED lights or output a haptic alert (e.g., causing the steering assembly to vibrate) indicating that electrically powered scooter 110A is not permitted in its current location.
[0043] In some examples, computing device 116A outputs a message to a remote computing device separate from electrically powered scooter 110A. The message may indicate that electrically powered scooter 110A is currently located in a location in which it is not permitted. The message may indicate an amount of time that electrically powered scooter 110A has been in its current location, the current location of electrically powered scooter 110A, among other information.
[0044] In some instances, computing device 116A determines an amount of time that electrically powered scooter 110A has been in a location in which the scooter 110A is not permitted.
Computing device 116A may perform the at least one operation in response to determining that the amount of time satisfies (e.g., is greater than or equal to) a threshold time duration. For example, computing device 116A may generate an output and/or adjust a speed of the electrically powered scooter 110A in response to determining that electrically powered scooter 110A has been located in an impermissible location for at least the threshold time duration. Computing device 116A may determine a confidence level indicating a probability that electrically powered scooter 110A has been in a location in which the scooter 110A is not permitted. Computing device 116A may perform the at least one operation in response to determining that the confidence level satisfies (e.g., is greater than or equal to) a threshold confidence level. For example, computing device 116A may generate an output and/or adjust a speed of the electrically powered scooter 110A in response to determining that the confidence level satisfies the threshold confidence level.
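The two gating conditions described in paragraph [0044], a threshold time duration and a threshold confidence level, can be combined as sketched below. This is a hypothetical Python illustration; the threshold values are invented for the example:

```python
# Illustrative sketch: act (generate an output and/or adjust speed) only
# when the scooter has been in an impermissible location for at least a
# threshold duration AND the confidence level satisfies its threshold.

def should_act(time_in_location_s, confidence,
               time_threshold_s=10.0, confidence_threshold=0.8):
    """'Satisfies' means greater than or equal to, per the description."""
    return (time_in_location_s >= time_threshold_s
            and confidence >= confidence_threshold)
```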
[0045] While computing device 116A is described as dynamically controlling scooter 110A, techniques of this disclosure may enable a computing device to control any other type of micro-mobility device, such as a powered food-delivery device, hoverboard, or skateboard.
[0046] FIG. 2 is a block diagram illustrating an example system for dynamically controlling an electrically powered scooter, in accordance with techniques of this disclosure. System 140 illustrates additional details of system 100 of FIG. 1. In the examples of FIG. 2, system 140 includes electrically powered scooter 110A, vehicle 104B, and a remote computing system 150.
In some examples, electrically powered scooter 110A and vehicle 104B are communicatively coupled to one another and/or remote computing system 150 via network 114. In another example, electrically powered scooter 110A and vehicle 104B are communicatively coupled to one another directly, for example, via a DSRC transceiver.
[0047] Electrically powered scooter 110A includes computing device 116A and vehicle 104B includes computing device 116B. Computing devices 116A, 116B (collectively, computing devices 116) may include communication units 214A, 214B and sensors 117A,
117B, respectively. [0048] Communication units 214A, 214B (collectively, communication units 214) of computing devices 116 may communicate with external devices by transmitting and/or receiving data. For example, computing devices 116 may use communication units 214 to transmit and/or receive radio signals on a radio network such as a cellular radio network or other networks, such as networks 114. In some examples, communication units 214 may transmit and receive messages and information to and from other vehicles, such as information interpreted from infrastructure article 107. In some examples, communication units 214 may transmit and/or receive satellite signals on a satellite network such as a Global Positioning System (GPS) network. In some examples, communications units 214 may transmit and/or receive data through network 114 to remote computing system 150 via communication unit 154.
[0049] Sensors 117A, 117B (collectively, sensors 117) may include image sensors 102A, 102B
(collectively, image sensors 102), temperature sensors, LiDAR, or a combination thereof, to name only a few examples of sensors. Examples of image sensors 102 may include semiconductor charge-coupled devices (CCD) or active pixel sensors in complementary metal-oxide- semiconductor (CMOS) or N-type metal-oxide-semiconductor (NMOS, Live MOS) technologies. Digital sensors include flat panel detectors. In one example, electrically powered scooter 110A or vehicle 104B includes at least two different sensors for detecting light in two different wavelength spectrums. Image sensors 102 may have a fixed field of view or may have an adjustable field of view. An image sensor 102 with an adjustable field of view may be configured to pan left and right, up and down relative to electrically powered scooter 110 or vehicle 104B as well as be able to widen or narrow focus. In some examples, image sensors 102 may include a first lens and a second lens. Electrically powered scooter 110 and/or vehicle 104B may have more or fewer image sensors 102 in various examples.
[0050] In the example of FIG. 2, computing device 116A includes an interpretation component 118A, a user interface (UI) component 124, and a control component 144. Components 118A, 124, and 144 may perform operations described herein using software, hardware, firmware, or a mixture of hardware, software, and firmware residing in and executing on computing device 116A and/or at one or more other remote computing devices. In some examples, components 118A, 124, and 144 may be implemented as hardware, software, and/or a combination of hardware and software.
[0051] Computing device 116A may execute components 118A, 124, and 144 with one or more processors. Computing device 116A may execute any of components 118A, 124, 144 as or within a virtual machine executing on underlying hardware. Components 118A, 124, 144 may be implemented in various ways. For example, any of components 118A, 124, 144 may be implemented as a downloadable or pre-installed application or “app.” In another example, any of components 118A, 124, 144 may be implemented as part of an operating system of computing device 116A.
[0052] UI component 124 may include any hardware or software for communicating with a user of electrically powered scooter 110. In some examples, UI component 124 includes outputs to a user such as displays, such as a display screen, indicator or other lights, audio devices to generate notifications or other audible functions, and/or haptic feedback devices. UI component 124 may also include inputs such as knobs, switches, keyboards, touch screens or similar types of input devices.
[0053] In general, sensors 117 may be used to gather information about infrastructure proximate to electrically powered scooter 110A and vehicle 104B, such as information about transportation pathways 106. Sensors 117 may generate infrastructure data indicative of the infrastructure proximate to electrically powered scooter 110A or vehicle 104B. For example, image sensors 102 may capture images of infrastructure articles, such as lane markings, centerline markings, edge of roadway or shoulder markings, as well as the general shape of the transportation pathway. The general shape of a transportation pathway may include turns, curves, incline, decline, widening, narrowing or other characteristics.
[0054] According to techniques of this disclosure, interpretation component 118A may determine a type of location in which electrically powered scooter 110A is located based on infrastructure data generated by sensors 117A, 117B of electrically powered scooter 110A and/or vehicle 104B. For example, interpretation component 118A may receive, from image sensors 102, image data indicative of infrastructure proximate to electrically powered scooter 110A and/or vehicle 104B. Interpretation component 118A may identify the infrastructure (e.g., an infrastructure article 107) using one or more image processing algorithms.
[0055] Interpretation component 118A may determine a type of location in which electrically powered scooter 110A is located based at least in part on the infrastructure data. For example, interpretation component 118A may determine the type of location in which electrically powered scooter 110A is located by performing one or more image processing algorithms on the image data to identify the type of the location. For instance, interpretation component 118A may perform image processing to identify one or more infrastructure articles 107 proximate to electrically powered scooter 110A and determine the type of location based on the characteristics of the infrastructure article 107, such as a type, size, shape, and/or color of infrastructure article 107. As another example, interpretation component 118A may determine a type of location based on characteristics of the infrastructure captured by the image data. For example, interpretation component 118A may determine a type of transportation pathway 106B based on a color and/or size of transportation pathway 106B, or a symbol (e.g., a bicycle symbol) included on the pathway. As yet another example, interpretation component 118A may determine the type of location based on an article message 126 encoded on one or more infrastructure articles 107. For example, article message 126 may indicate a type of infrastructure article 107B, a type of transportation pathway 106 associated with infrastructure article 107B, or both. In some scenarios, interpretation component 118A determines a type of location in which electrically powered scooter 110A is currently located based at least in part on detecting one or more vehicles 104, pedestrians 108, electrically powered scooters 110, and/or bicycles.
[0056] As another example, interpretation component 118A may determine a type of location in which electrically powered scooter 110A is located based on communication data received from a computing device separate from electrically powered scooter 110A, such as another electrically powered scooter (e.g., scooter 110B), an infrastructure article 107, or a vehicle (e.g., vehicle 104B). For example, interpretation component 118A may receive data indicating a type of the location from vehicle 104B that is proximate to electrically powered scooter 110A via communication unit 214, such as a DSRC transceiver. Interpretation component 118A may determine a type of the current location of electrically powered scooter 110A based on the type of a device from which the data was received, such as another scooter 110 or a vehicle 104.
[0057] Interpretation component 118A may be configured to determine one or more scooter operating rules based on the type of the location. In some examples, computing device 116A stores the operating rules locally. In another example, computing device 116A may query remote computing system 150 via network 114 to obtain the scooter operating rules. Interpretation component 118A may provide information, directly or indirectly, to control component 144 related to the scooter operating rules. As will be described below, the scooter operating rules (also referred to as “operating rules”) may be used to adjust operation of electrically powered scooter 110A. An operating rule may be any rule controlling operational characteristics of electrically powered scooter 110A. Operating rules that may be used include, but are not limited to, speed limits, acceleration limits, braking limits, following distance limits, lane markings, distance limits from construction or pedestrians, permitted locations, prohibited locations, and the like.
[0058] In some examples, an article message, such as article message 126 of FIG. 1, includes or indicates an operating rule set for electrically powered scooter 110A. Interpretation component 118A may obtain the operating rule set based on the interpretation of article message 126. For example, article message 126 may indicate a specific operating rule set associated with a type of transportation pathway, a construction zone, among others. As one example, interpretation component 118A may determine that the rule set associated with transportation pathway 106B (e.g., a bicycle pathway) includes a speed limit. Interpretation component 118A may output a command to control component 144 based on the rules, such as a command to “apply brakes” or “shut off electric motor.” In such examples, interpretation component 118A accesses a local or remote data structure to identify a rule set and determine one or more operations to be applied by electrically powered scooter 110A. In this way, interpretation component 118A provides the set of operations to control component 144 to modify or adjust the operation of the electrically powered scooter 110A.
[0059] In some examples, control component 144 receives data from an external device. For example, remote computing system 150 may include a datastore that includes navigational characteristics of a transportation pathway 106 (e.g., construction zone 111), such as traffic pattern changes, presence of workers, lane width modification, curves, and shifts, and the like. In some examples, remote computing system 150 may include a datastore that includes navigational characteristics of the temporary zone, such as location data, congestion data, vehicle behavior variability, speed, lane departure, acceleration data, brake actuation data, and the like. Such navigational characteristics may be official data, such as supplied by operators having control of the temporary zone or may be crowd sourced data, such as supplied by users travelling through the transportation pathway 106.
[0060] Interpretation component 118A may determine whether electrically powered scooter 110A is permitted in the location in which electrically powered scooter 110A is currently located based on the rules. For example, interpretation component 118A may apply the rules to the type of the scooter’s current location to determine whether electrically powered scooter 110A is permitted in its current location. As another example, interpretation component 118A may determine whether electrically powered scooter 110A is permitted in its current location based on the rules and the presence (or absence) of one or more vehicles 104, pedestrians 108, electrically powered scooters 110, and/or bicycles. For instance, interpretation component 118A may determine that electrically powered scooter 110A is permitted on a pedestrian pathway when pedestrians are not present and that electrically powered scooter 110A is not permitted when pedestrians are nearby (or when the number or density of pedestrians satisfies (e.g., is greater than or equal to) a threshold number or density of pedestrians).
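The pedestrian-presence rule of paragraph [0060] can be sketched as a count-based gate. This is a hypothetical Python illustration; the threshold value is invented for the example:

```python
# Illustrative sketch: the scooter is permitted on a pedestrian pathway
# only while the number of detected pedestrians stays below a threshold.
# Satisfying the threshold (greater than or equal to) means NOT permitted.

def permitted_on_pedestrian_pathway(pedestrian_count, threshold=3):
    return pedestrian_count < threshold
```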
[0061] Control component 144 may be configured to perform an operation by adjusting operation of electrically powered scooter 110A. Control component 144 may include, for example, any circuitry or other hardware, or software that may adjust one or more functions of electrically powered scooter 110A. Some examples include adjustments that change a speed of electrically powered scooter 110A, shut off an electric motor that drives one or more wheels, or both.
[0062] Control component 144 may adjust operation of electrically powered scooter 110A based at least in part on the type of location in which electrically powered scooter 110A is located; whether electrically powered scooter 110A is permitted in its current location; a type of a roadway on which electrically powered scooter 110A is located; a presence of vehicles 104, pedestrians 108, and/or other electrically powered scooters 110; or a combination thereof. In some examples, control component 144 adjusts operation of the electric motor of electrically powered scooter 110A. For example, control component 144 may cause the electric motor to slow down or stop, which may slow or stop electrically powered scooter 110A. In another example, control component 144 causes a brake apparatus to slow or stop electrically powered scooter 110A. As yet another example, control component 144 may adjust a maximum allowable speed of electrically powered scooter 110A.
[0063] UI component 124 may perform the at least one operation by generating an output. For example, the output may include an audio output, a visual output, a haptic output, or a combination thereof. As one example, computing device 116A may output a visual alert via one or more LED lights or output a haptic alert (e.g., causing the steering assembly to vibrate) indicating that electrically powered scooter 110A is not permitted in its current location.
[0064] According to some examples, interpretation component 118A may determine or predict whether electrically powered scooter 110A will experience a scooter-specific event. As used herein, a scooter-specific event is an event that is relevant to a class of vehicles with a set of characteristics that correspond to an electrically powered scooter. In one example, the set of characteristics corresponding to electrically powered scooters 110 may include exactly two wheels connected by a substantially horizontal support member, a generally vertical steering assembly, and an electric motor configured to drive one or both of the exactly two wheels. In some examples, electrically powered scooters 110 include conspicuity tape and the set of characteristics corresponding to electrically powered scooters 110 include characteristics of the conspicuity tape, such as a location or pattern of the conspicuity tape. Examples of scooter-specific events include collisions between electrically powered scooter 110A and another electrically powered scooter 110, a pedestrian, or a vehicle; a scooter fall (e.g., due to a hole in a transportation pathway 106); among others.
[0065] Interpretation component 118A may predict whether electrically powered scooter 110A will experience a scooter-specific event based at least in part on a communication received from another electrically powered scooter 110 (e.g., electrically powered scooter 110B), vehicle 104B, an infrastructure article 107, remote computing system 150, or any other computing device. In one example, electrically powered scooter 110A may receive a message from electrically powered scooter 110B of FIG. 1 via communication unit 214, such as a DSRC transceiver. In some instances, interpretation component 118A determines a distance between electrically powered scooter 110A and electrically powered scooter 110B based on the message. For example, interpretation component 118A may determine a distance between scooters 110A and 110B based on a time between when the message was sent by electrically powered scooter 110B and when the message was received by electrically powered scooter 110A. In some examples, interpretation component 118A may determine whether the distance satisfies a threshold distance.
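The time-of-flight distance estimate of paragraph [0065] can be sketched as below. This is a hypothetical Python illustration; it assumes synchronized clocks between the two scooters and radio propagation at the speed of light, and the threshold value is invented:

```python
# Illustrative sketch: estimate inter-scooter distance from the one-way
# flight time of a message (sent timestamp carried in the message), then
# test it against a threshold distance.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def distance_from_flight_time(t_sent_s, t_received_s):
    """Distance in meters, assuming synchronized clocks (an assumption)."""
    return (t_received_s - t_sent_s) * SPEED_OF_LIGHT_M_S

def within_threshold(distance_m, threshold_m=10.0):
    return distance_m <= threshold_m
```

In practice a DSRC message would more likely carry GPS coordinates than rely on raw flight time; the sketch only illustrates the timing-based variant described above.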
[0066] In another example, interpretation component 118A may determine whether electrically powered scooter 110A will experience a scooter-specific event, such as a collision with a vehicle defined by the set of characteristics that correspond to electrically powered scooter 110B. In other words, interpretation component 118A may determine whether electrically powered scooter 110A will experience a collision with electrically powered scooter 110B. In some scenarios, interpretation component 118A determines whether scooters 110A and 110B will collide based on the distance between electrically powered scooter 110A and electrically powered scooter 110B. For example, interpretation component 118A may predict that the scooters 110 will collide in response to determining the distance between scooters 110A and 110B is within a threshold distance. As another example, interpretation component 118A may predict that electrically powered scooter 110A will collide with electrically powered scooter 110B based on the distance between electrically powered scooter 110A and electrically powered scooter 110B, a speed of electrically powered scooter 110A, and a speed of electrically powered scooter 110B. For example, interpretation component 118A of electrically powered scooter 110A may determine, based on the data received from electrically powered scooter 110B, whether electrically powered scooter 110B is physically behind or ahead of electrically powered scooter 110A. In an example where electrically powered scooter 110A is ahead of electrically powered scooter 110B, interpretation component 118A may predict a collision will occur in response to determining that the distance between scooters 110A and 110B is within a threshold distance and that the speed of electrically powered scooter 110A is less than the speed of electrically powered scooter 110B.
Similarly, when electrically powered scooter 110A is behind electrically powered scooter 110B, interpretation component 118A may predict a collision will occur in response to determining that the distance between scooters 110A and 110B is within a threshold distance and that the speed of electrically powered scooter 110A is greater than the speed of electrically powered scooter 110B.
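The collision heuristic of paragraph [0066] can be sketched as follows: a collision is predicted when the scooters are within a threshold distance and the trailing scooter is faster than the leading one. This is a hypothetical Python illustration; parameter names and the threshold value are assumptions:

```python
# Illustrative sketch of the ahead/behind collision prediction for
# scooters 110A and 110B described above.

def predict_collision(distance_m, speed_a, speed_b, a_is_ahead,
                      threshold_m=5.0):
    """speed_a/speed_b: speeds of scooters 110A and 110B, respectively."""
    if distance_m > threshold_m:
        return False          # too far apart: no collision predicted
    if a_is_ahead:
        return speed_a < speed_b   # 110B is closing on 110A
    return speed_a > speed_b       # 110A is closing on 110B
```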
[0067] Interpretation component 118A may predict the occurrence of a scooter-specific event in response to detecting a temporary traffic zone in the path of electrically powered scooter 110A. For example, interpretation component 118A may determine construction zone 111 of FIG. 1 is in the path of electrically powered scooter 110A based on sensor data generated by sensors 117. Interpretation component 118A may predict a scooter-specific event will occur (e.g., a collision or the user of scooter 110A may fall) in response to determining construction zone 111 is in the path of electrically powered scooter 110A.
[0068] As another example, interpretation component 118A may determine whether a scooter-specific event will occur based on the presence or absence of a pavement marking on a transportation pathway 106. For example, interpretation component 118A may determine whether a roadway (e.g., transportation pathway 106A) includes pavement markings (e.g., infrastructure articles 107A) based on the infrastructure data (e.g., image data or signature data). In one example, interpretation component 118A may predict a scooter-specific event will occur based on the absence of pavement markings on the roadway. In some examples, signature data may represent a detected signal and/or one or more properties of a signal, such as but not limited to an acoustic signal.
[0069] Interpretation component 118A may perform an operation in response to predicting the occurrence of a scooter-specific event. In one example, interpretation component 118A performs the operation by causing control component 144 to adjust operation of electrically powered scooter 110A. For example, interpretation component 118A may cause control component 144 to adjust (e.g., increase or decrease) the speed of electrically powered scooter 110A. As one example, interpretation component 118A may cause control component 144 to change operation of the electric motor (e.g., increase or decrease the speed of the motor) or change operation of a braking apparatus (e.g., apply the braking apparatus or disengage the braking apparatus).
Adjusting the speed of electrically powered scooter 110A may increase the distance between scooters 110A and 110B beyond the threshold distance, which may reduce the risk of a collision. As another example, interpretation component 118A may output an alert (e.g., audible, haptic, visual) in response to predicting the scooter-specific event will occur. For example, the alert may warn the user of electrically powered scooter 110 or people within proximity of electrically powered scooter 110A, of the risk of a scooter-specific event.
[0070] In some instances, interpretation component 118A may determine whether electrically powered scooter 110A will experience a collision with a vehicle (e.g., vehicle 104B) defined by the set of characteristics that correspond to an automobile. In one example, the set of characteristics corresponding to an automobile may include at least four wheels and an enclosed passenger compartment occupied by a driver and/or one or more passengers. Interpretation component 118A may determine whether electrically powered scooter 110A will experience a collision with vehicle 104B based at least in part on sensor data generated by one or more of sensors 117. For example, interpretation component 118A may determine that vehicle 104B, which is proximate to electrically powered scooter 110A, is configured with a set of characteristics corresponding to an automobile based on image data, signature data, or a combination thereof.
For example, interpretation component 118A may perform image processing on an image of vehicle 104B to determine that vehicle 104B is an automobile.
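The characteristic test of paragraph [0070] (at least four wheels and an enclosed passenger compartment) may be sketched as follows. The dictionary keys are hypothetical placeholders for attributes a perception pipeline might extract from image or signature data; they are not part of the disclosure.

```python
def matches_automobile(characteristics: dict) -> bool:
    """Check whether sensed characteristics correspond to an automobile
    per [0070]: at least four wheels and an enclosed passenger compartment.
    Key names are illustrative only."""
    return (characteristics.get("wheel_count", 0) >= 4
            and characteristics.get("enclosed_compartment", False))
```

A detected object with two wheels and no compartment (e.g., another scooter) would not match.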
[0071] In some examples, interpretation component 118A performs one or more operations in response to determining that the characteristics of an object in proximity to electrically powered scooter 110A correspond to an automobile. As one example, interpretation component 118A may cause electrically powered scooter 110A to adjust its speed (e.g., by changing operation of a braking apparatus or the electric motor). As another example, interpretation component 118A may cause communication unit 214 to output a message to the automobile (e.g., via a DSRC transceiver) indicating the presence of electrically powered scooter 110A.
[0072] Interpretation component 118A may perform one operation in response to determining the set of characteristics of the object in proximity to electrically powered scooter 110A correspond to an automobile and a different operation in response to determining the set of characteristics of the object in proximity to electrically powered scooter 110A do not correspond to an automobile. Interpretation component 118A may determine the characteristics of an object in proximity to electrically powered scooter 110A include characteristics of a person or a pedestrian (e.g., two legs, or two arms). In one example, interpretation component 118A causes scooter 110A to perform one action (e.g., adjust its speed) in response to determining the characteristics of the object correspond to an automobile and performs a second, different action (e.g., outputting an audible, visual, or haptic alert) in response to determining the characteristics of the object correspond to a person rather than an automobile.
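The two-branch dispatch of paragraph [0072] — one operation for an automobile, a different operation for a person — may be sketched as below. The characteristic keys and operation names are hypothetical simplifications of what interpretation component 118A might do.

```python
def select_operation(characteristics: dict) -> str:
    """Choose an operation based on whether the nearby object's characteristics
    correspond to an automobile or to a person ([0072]). All names are
    illustrative, not prescribed by the disclosure."""
    if (characteristics.get("wheel_count", 0) >= 4
            and characteristics.get("enclosed_compartment", False)):
        return "adjust_speed"      # e.g., change motor or brake operation
    if characteristics.get("leg_count", 0) == 2:
        return "output_alert"      # audible, visual, or haptic warning
    return "no_operation"
```

This mirrors the example in the text: speed adjustment for an automobile, an alert for a pedestrian.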
[0073] In some examples, communication unit 214 of computing device 116 of electrically powered scooter 110A outputs a message to a remote computing device, such as remote computing system 150. Remote computing system 150 may include a distributed computing platform (e.g., a cloud computing platform executing on various servers, virtual machines and/or containers within an execution environment provided by one or more data centers), physical servers, desktop computing devices, or any other type of computing system. In some examples, the message indicates usage data for electrically powered scooter 110A. The usage data may include a current location of electrically powered scooter 110A, whether the current location of electrically powered scooter 110A is permitted, a type of the current location (e.g., a
transportation pathway 106, a park, a scooter parking zone, etc.), an amount of time that electrically powered scooter 110A has been in its current location, information indicating the occurrence of a scooter-specific event, among other information.
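As a non-limiting sketch of the usage-data message described in paragraph [0073], the payload a scooter might transmit to remote computing system 150 could be structured and serialized as follows. All field names are hypothetical; the disclosure does not specify a message format.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class UsageMessage:
    """Illustrative usage-data payload per [0073]; field names are hypothetical."""
    scooter_id: str
    location: tuple          # e.g., (latitude, longitude)
    location_type: str       # e.g., "transportation_pathway", "park", "scooter_parking_zone"
    location_permitted: bool
    seconds_at_location: int

def encode_message(msg: UsageMessage) -> str:
    # Serialize for transmission to the remote computing system.
    return json.dumps(asdict(msg))
```

A receiving system could deserialize the JSON string and store the fields in usage data 156.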
[0074] Remote computing system 150 may receive the message from computing device 116A. The message may include usage data associated with electrically powered scooter 110A.
Analysis module 152 of remote computing system 150 may store the usage data within usage data 156. For example, analysis module 152 may store a user account for each user of electrically powered scooters 110 within user data 158. The user account may include information indicating whether the user complied with various scooter operating rules, such as complying with speed limits, refraining from operating electrically powered scooter 110A in prohibited locations, or parking electrically powered scooter 110A within a pre-defined set of delineated parking regions.
[0075] In some examples, analysis module 152 may store information indicating whether a user of electrically powered scooter 110A parked scooter 110A within a particular delineated region. For example, electrically powered scooters 110 may be permitted to be parked in a plurality of delineated regions within a physical environment. In one example, the delineated regions include designated parking zones to return electrically powered scooters. Analysis module 152 and/or electrically powered scooter 110A may determine whether the last user of electrically powered scooter 110A parked scooter 110A within one of the delineated regions, for example, based on GPS coordinates and/or infrastructure data generated by one or more sensors 117. In some instances, analysis module 152 may select a reward in response to determining that the user parked electrically powered scooter 110A within one of the delineated regions. For example, the reward may include a discount on a future scooter use or a discount on the current use. Analysis module 152 may associate the reward with the user account for the user of electrically powered scooter 110A and may store an indication of the reward in the datastore.
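The GPS-based parking check and reward selection of paragraph [0075] might be sketched as below. Representing each delineated region as an axis-aligned latitude/longitude bounding box is an assumption made for illustration; a deployed system could instead use polygon tests or sensed pavement markings.

```python
def parked_in_delineated_region(position, regions) -> bool:
    """Return True if a (lat, lon) position falls inside any delineated
    parking region, each given as (lat_min, lat_max, lon_min, lon_max).
    The bounding-box model is a hypothetical simplification."""
    lat, lon = position
    return any(lat_min <= lat <= lat_max and lon_min <= lon <= lon_max
               for lat_min, lat_max, lon_min, lon_max in regions)

def select_reward(position, regions):
    # Per [0075]: select a discount when the scooter was parked in a region.
    return "ride_discount" if parked_in_delineated_region(position, regions) else None
```

For example, a scooter parked inside a zone's bounding box would earn the discount; one parked elsewhere would not.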
[0076] Analysis module 152 may, in some examples, determine whether a user of electrically powered scooters 110 is permitted to use electrically powered scooters 110 based on information stored within the user account for that user. For example, analysis module 152 may determine that the user is prohibited (e.g., temporarily) from utilizing electrically powered scooters 110 in response to determining the user has violated operating rules. As one example, analysis module 152 may determine that the user is prohibited from utilizing electrically powered scooters 110 in response to determining the user parked electrically powered scooters 110 outside a delineated region several times, did not comply with speed limits, or operated electrically powered scooters 110 in prohibited locations.
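The permission decision of paragraph [0076] could be sketched as a threshold test over violation counts stored in the user account. The violation categories and threshold values here are hypothetical; the disclosure does not define specific limits.

```python
def user_permitted(violations: dict,
                   max_parking_violations: int = 3,
                   max_speed_violations: int = 3,
                   max_location_violations: int = 1) -> bool:
    """Hypothetical rule per [0076]: prohibit (e.g., temporarily) scooter use
    when recorded violation counts exceed illustrative thresholds."""
    return (violations.get("parked_outside_region", 0) <= max_parking_violations
            and violations.get("speed_limit", 0) <= max_speed_violations
            and violations.get("prohibited_location", 0) <= max_location_violations)
```

A user with one speed-limit violation would remain permitted; one with five out-of-region parkings would not.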
[0077] According to some examples, analysis module 152 may store scooter usage information for a plurality of user accounts that are each associated with respective users of electrically powered scooters 110. Analysis module 152 may analyze the scooter usage information to determine a degree of compliance or non-compliance with scooter operating rules. For example, analysis module 152 may determine how often users of electrically powered scooters 110 exceed a speed limit, park within the delineated regions, operate scooters in prohibited locations, etc. Similarly, analysis module 152 may analyze the usage data to determine locations with high degrees of non-compliance with the scooter operating rules. For example, analysis module 152 may rank locations by non-compliance and generate a report indicating the locations with the highest non-compliance.
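The location ranking of paragraph [0077] might be sketched as computing a non-compliance rate per location from individual observations. The record format is a hypothetical simplification of the stored usage data.

```python
def rank_by_noncompliance(records):
    """records: iterable of (location, compliant: bool) observations.
    Return locations sorted by non-compliance rate, highest first ([0077])."""
    totals, noncompliant = {}, {}
    for location, compliant in records:
        totals[location] = totals.get(location, 0) + 1
        if not compliant:
            noncompliant[location] = noncompliant.get(location, 0) + 1
    # Rate = non-compliant observations / total observations at that location.
    return sorted(totals,
                  key=lambda loc: noncompliant.get(loc, 0) / totals[loc],
                  reverse=True)
```

For instance, a location with two violations out of three observations would rank above one with one violation out of two.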
[0078] While interpretation component 118A of computing device 116A is described as performing various functionality of computing device 116A, in some examples, interpretation component 118B of computing device 116B may perform similar functionality. For example, interpretation component 118B of computing device 116B of vehicle 104B may identify a type of a location in which electrically powered scooter 110A is located or determine whether electrically powered scooter 110A is permitted in its current location. Similarly, in some examples, interpretation component 118B may predict a scooter-specific event for electrically powered scooter 110A. Interpretation component 118B may output information to electrically powered scooter 110A via communication unit 214B. As one example, interpretation component 118B may output to electrically powered scooter 110A information indicating the type of location in which electrically powered scooter 110A is located, whether electrically powered scooter 110A is permitted in its current location, or a prediction of a scooter-specific event for electrically powered scooter 110A.
[0079] FIG. 3 is a block diagram illustrating an example computing device, in accordance with one or more aspects of the present disclosure. FIG. 3 illustrates only one example of a computing device. Many other examples of computing device 116A may be used in other instances and may include a subset of the components included in example computing device 116A or may include additional components not shown in FIG. 3.
[0080] As shown in the example of FIG. 3, computing device 116A may be logically divided into user space 202, kernel space 204, and hardware 206. Hardware 206 may include one or more hardware components that provide an operating environment for components executing in user space 202 and kernel space 204. User space 202 and kernel space 204 may represent different sections or segmentations of memory, where kernel space 204 provides higher privileges to processes and threads than user space 202. For instance, kernel space 204 may include operating system 220, which operates with higher privileges than components executing in user space 202.
[0081] As shown in FIG. 3, hardware 206 includes one or more processors 208, input components 210, storage devices 212, communication units 214, output components 216, and sensors 117. Processors 208, input components 210, storage devices 212, communication units 214, output components 216, and sensors 117 may each be interconnected by one or more communication channels 218. Communication channels 218 may interconnect each of the components 208, 210, 212, 214, 216, and 117 and other components for inter-component communications (physically, communicatively, and/or operatively). In some examples, communication channels 218 may include a hardware bus, a network connection, one or more inter-process communication data structures, or any other components for communicating data between hardware and/or software.
[0082] One or more processors 208 may implement functionality and/or execute instructions within computing device 116A. For example, processors 208 on computing device 116A may receive and execute instructions stored by storage devices 212 that provide the functionality of components included in kernel space 204 and user space 202. These instructions executed by processors 208 may cause computing device 116A to store and/or modify information within storage devices 212 during program execution. Processors 208 may execute instructions of components in kernel space 204 and user space 202 to perform one or more operations in accordance with techniques of this disclosure. That is, components included in user space 202 and kernel space 204 may be operable by processors 208 to perform various functions described herein.
[0083] One or more input components 210 of computing device 116A may receive input.
Examples of input are tactile, audio, kinetic, and optical input, to name only a few examples.
Input components 210 of computing device 116A, in one example, include a voice responsive system, video camera, buttons, control pad, microphone or any other type of device for detecting input from a human or machine. In some examples, input component 210 may be a presence-sensitive input component, which may include a presence-sensitive screen, touch-sensitive screen, etc.
[0084] One or more communication units 214 of computing device 116A may communicate with external devices by transmitting and/or receiving data. For example, computing device 116A may use communication units 214 to transmit and/or receive radio signals on a radio network such as a cellular radio network. In some examples, communication units 214 may transmit and/or receive satellite signals on a satellite network such as a Global Positioning System (GPS) network.
Examples of communication units 214 include a DSRC transceiver, an optical transceiver, a radio frequency transceiver, a GPS receiver, or any other type of device that can send and/or receive information. Other examples of communication units 214 may include Bluetooth®, GPS, 3G,
4G, and Wi-Fi® radios found in mobile devices as well as Universal Serial Bus (USB) controllers and the like.
[0085] One or more output components 216 of computing device 116A may generate output. Examples of output are tactile, audio, and video output. Output components 216 of computing device 116A, in some examples, include a presence-sensitive screen, sound card, video graphics adapter card, speaker, cathode ray tube (CRT) monitor, liquid crystal display (LCD), or any other type of device for generating output to a human or machine. Output components 216 may include display components such as an LCD, a light-emitting diode (LED) display, or any other type of device for generating tactile, audio, and/or visual output. Output components 216 may be integrated with computing device 116A in some examples.
[0086] In other examples, output components 216 may be physically external to and separate from computing device 116A but may be operably coupled to computing device 116A via wired or wireless communication. An output component may be a built-in component of computing device 116A located within and physically connected to the external packaging of computing device 116A (e.g., a screen on a mobile phone). In another example, a presence-sensitive display may be an external component of computing device 116A located outside and physically separated from the packaging of computing device 116A (e.g., a monitor, a projector, etc. that shares a wired and/or wireless data path with a tablet computer). [0087] Output components 216 may also include control component 144, in examples where computing device 116A is onboard an electrically powered scooter. Control component 144 has the same functions as control component 144 described in relation to FIG. 1.
[0088] One or more storage devices 212 within computing device 116A may store information for processing during operation of computing device 116A. In some examples, storage device 212 is a temporary memory, meaning that a primary purpose of storage device 212 is not long-term storage. Storage devices 212 on computing device 116A may be configured for short-term storage of information as volatile memory and therefore may not retain stored contents if deactivated. Examples of volatile memories include random access memories (RAM), dynamic random-access memories (DRAM), static random-access memories (SRAM), and other forms of volatile memories known in the art.
[0089] Storage devices 212, in some examples, also include one or more computer-readable storage media. Storage devices 212 may be configured to store larger amounts of information than volatile memory. Storage devices 212 may further be configured for long-term storage of information as non-volatile memory space and retain information after power on/off cycles.
Examples of non-volatile memories include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories. Storage devices 212 may store program instructions and/or data associated with components included in user space 202 and/or kernel space 204.
[0090] As shown in FIG. 3, application 228 executes in user space 202 of computing device 116A. Application 228 may be logically divided into presentation layer 222, application layer 224, and data layer 226. Presentation layer 222 may include user interface (UI) component 124, which generates and renders user interfaces of application 228. Application 228 may include, but is not limited to: UI component 124, interpretation component 118A, security component 120, and one or more service components 122. For instance, application layer 224 may include interpretation component 118A, service component 122, and security component 120. Presentation layer 222 may include UI component 124.
[0091] Data layer 226 may include one or more datastores. A datastore may store data in structured or unstructured form. Example datastores may be any one or more of a relational database management system, online analytical processing database, table, or any other suitable structure for storing data.
[0092] Service data 233 may include any data to provide and/or resulting from providing a service of service component 122. For instance, service data 233 may include information about infrastructure articles 107, user information, operating rule sets, or any other information transmitted between one or more components of computing device 116A. Operating data 236 may include instructions for scooter operating rule sets for operating electrically powered scooter 110A.
[0093] Sensor data 232 may include infrastructure data, such as image data, signature data, or any other data indicative of infrastructure proximate to electrically powered scooter 110A. For example, communication units 214 may receive, from an image sensor 102, image data indicative of infrastructure proximate to electrically powered scooter 110A and may store the image data in sensor data 232. Image data may include one or more images that are received from one or more image sensors, such as image sensors 102. In some examples, the images are bitmaps, Joint Photographic Experts Group images (JPEGs), Portable Network Graphics images (PNGs), or any other suitable graphics file formats. In some examples, the image data includes images of one or more infrastructure articles 107 of FIG. 1. In one example, the image data includes images of one or more article messages 126 associated with one or more infrastructure articles 107.
[0094] According to techniques of this disclosure, computing device 116A may dynamically control operations of electrically powered scooter 110A based at least in part on sensor data 232. Interpretation component 118A may determine a type of the location at which electrically powered scooter 110A is located based at least in part on the infrastructure data and/or communication data received from another electrically powered scooter (e.g., scooter 110B), an infrastructure article, or a vehicle (e.g., vehicle 104A). In another example, interpretation component 118A may determine whether electrically powered scooter 110A is permitted to be in the location where electrically powered scooter 110A is currently located. Interpretation component 118A may determine whether electrically powered scooter 110A is permitted in its current location based at least in part on the type of the current location and/or the presence of a vehicle, pedestrian, or another scooter in proximity to electrically powered scooter 110A.
[0095] In some examples, interpretation component 118A causes control component 144 to adjust control of electrically powered scooter 110A based on the type of location at which electrically powered scooter 110A is currently located or whether electrically powered scooter 110A is permitted in that location. For example, interpretation component 118A may cause control component 144 to adjust operation of the electric motor and/or adjust operation of the braking assembly (e.g., to adjust a speed of electrically powered scooter 110A).
[0096] Interpretation component 118A may determine whether electrically powered scooter 110A will experience a scooter-specific event. Interpretation component 118A may determine whether electrically powered scooter 110A will experience a scooter-specific event based at least in part on sensor data received from sensors 117 and/or communication data received from another electrically powered scooter 110, a vehicle 104, an infrastructure article 107, remote computing system 150, or any other computing device. In some examples, interpretation component 118A performs one or more operations in response to determining that electrically powered scooter 110A will experience a scooter-specific event. For example, interpretation component 118A may cause electrically powered scooter 110A to adjust its speed (e.g., by changing operation of a braking apparatus or the electric motor), cause communication unit 214 to output a message to another computing device (e.g., one of vehicles 104), or output an alert (e.g., audible, visual, and/or haptic).
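One scooter-specific event discussed in this disclosure is a predicted collision with a following scooter (see also Examples 18-20). As a non-limiting sketch, a constant-speed time-to-collision test over speeds exchanged between scooters (e.g., via DSRC messages) and a sensed gap might look as follows; the model and threshold are illustrative assumptions only.

```python
def predict_rear_collision(own_speed_mps: float,
                           follower_speed_mps: float,
                           gap_m: float,
                           threshold_s: float = 2.0) -> bool:
    """Predict whether a following scooter will close the sensed gap within a
    time threshold, assuming constant speeds. A hypothetical simplification of
    the scooter-specific event prediction in [0096]."""
    closing_speed = follower_speed_mps - own_speed_mps
    if closing_speed <= 0:
        return False  # follower is not gaining; no collision predicted
    return gap_m / closing_speed < threshold_s
```

If a collision is predicted, the operations described above (speed adjustment, message output, alert) could be triggered.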
[0097] FIG. 4 is a conceptual diagram of an electrically powered scooter 110A.
Electrically powered scooter 110A includes a chassis 402, a rear wheel 404, a front wheel 406, and a steering assembly 408. Chassis 402 includes chassis support member 412 extending substantially horizontally between a rear-wheel mount 414 at one end of chassis 402 and a front-wheel mount 416 at another end of chassis 402 that is opposite the rear-wheel mount 414.
[0098] In the example of FIG. 4, rear wheel 404 is mounted to rear wheel mount 414 and front wheel 406 is mounted to front wheel mount 416. Front wheel 406 is mounted to front wheel mount 416 for turning steering movement with respect to front wheel mount 416 and rear wheel 404. Front wheel mount 416 may be coupled to steering assembly 408. Steering assembly 408 may extend generally vertically relative to chassis support member 412. Steering assembly 408 may be angled relative to chassis support member 412. In one example, an angle between chassis support member 412 and steering assembly 408 is between approximately 60 degrees and approximately 90 degrees. Steering assembly 408 may include handlebars 410. Steering assembly 408 may be coupled to front wheel mount 416 such that turning handlebars 410 may cause front wheel 406 to turn.
[0099] Electrically powered scooter 110A includes at least one electric motor 420, at least one motor controller 422, and at least one battery 424. Motor controller 422 may be operatively coupled to electric motor 420 to drive rear wheel 404 and/or front wheel 406. In the example of FIG. 4, electric motor 420 is configured to drive rear wheel 404; in some examples, electric motor 420 may be configured to drive front wheel 406. In one example, electrically powered scooter 110A includes a plurality of motors that are each configured to drive a respective wheel.
[0100] Electrically powered scooter 110A may include a braking apparatus 430. In the example of FIG. 4, braking apparatus 430 is operatively coupled to rear wheel 404 to selectively slow and/or stop rear wheel 404. In some examples, electrically powered scooter 110A includes a braking apparatus coupled to front wheel 406.
[0101] FIG. 5 is a flow diagram illustrating example operation of a computing device for dynamically controlling an electrically powered scooter, in accordance with one or more techniques of this disclosure. The techniques are described in terms of computing device 116A. However, the techniques may be performed by other computing devices. [0102] In the example of FIG. 5, computing device 116A receives infrastructure data indicative of infrastructure proximate to electrically powered scooter 110A from one or more sensors (502). For example, the infrastructure data may include image data or signature data.
[0103] Computing device 116A may determine a type of the location at which electrically powered scooter 110A is located based at least in part on the infrastructure data (504). For example, computing device 116A may perform image processing on image data included within the infrastructure data to identify the type of location. As another example, computing device 116A may determine the type of location in which electrically powered scooter 110A is located based on signature data.
[0104] In some examples, computing device 116A performs one or more operations based at least in part on the type of the location in which electrically powered scooter 110A is located (506). For example, computing device 116A may determine whether electrically powered scooter 110A is permitted in its current location based at least in part on the type of the location.
Computing device 116A may perform operations that cause electrically powered scooter 110A to change operation of the electric motor or change operation of the brake assembly. For example, computing device 116A may output a command causing the electric motor to slow down or speed up. As another example, computing device 116A may output a command to cause a brake assembly to apply the brakes. In some examples, computing device 116A performs one or more operations by outputting an alert (e.g., audible, visual, and/or haptic) or outputting a message to another computing device (e.g., one of vehicles 104 or remote computing system 150). In some examples, the message indicates the presence of electrically powered scooter 110A. In another example, the message indicates usage data for electrically powered scooter 110A, such as a current location of electrically powered scooter 110A, whether the current location of electrically powered scooter 110A is permitted, a type of the current location (e.g., a transportation pathway 106, a park, a scooter parking zone, etc.), an amount of time that electrically powered scooter 110A has been in its current location, or information indicating the occurrence of a scooter-specific event.
[0105] The following numbered examples may illustrate one or more aspects of the disclosure:
[0106] Example 1. A method comprising: receiving, from a sensor, infrastructure data indicative of infrastructure proximate to an electrically powered scooter; determining, based at least in part on the infrastructure data, a type of a location in which the electrically powered scooter is physically located; and performing at least one operation based at least in part on the type of the location in which the electrically powered scooter is physically located.
[0107] Example 2. The method of example 1, further comprising: determining, based at least in part on the type of the location, whether the electrically powered scooter is permitted to be in the location in which the electrically powered scooter is physically located; and performing the at least one operation in response to determining that the electrically powered scooter is not permitted to be in the location in which the electrically powered scooter is located.
[0108] Example 3. The method of example 2, wherein a type of location the electrically powered scooter is not permitted to operate comprises at least one of a pedestrian pathway or a driving lane of a vehicle pathway.
[0109] Example 4. The method of example 2, wherein the infrastructure data includes data indicative of at least one infrastructure article comprising at least a portion of a pavement marking; wherein determining whether the electrically powered scooter is permitted to be in the location in which the electrically powered scooter is physically located further comprises determining whether the electrically powered scooter is within a threshold distance of the portion of the pavement marking.
[0110] Example 5. The method of example 1, wherein performing the at least one operation comprises adjusting operation of the motor or adjusting operation of a braking apparatus to slow the at least one of the chassis-supported front wheel or chassis-supported rear wheel.
[0111] Example 6. The method of example 1, wherein the sensor comprises an image sensor, and wherein determining the type of the location in which the electrically powered scooter is physically located comprises: determining, based at least in part on the infrastructure data received from the sensor, a distance between the electrically powered scooter and the at least one infrastructure article; and determining the type of the location based at least in part on the distance.
[0112] Example 7. The method of example 1, wherein determining the type of the location in which the electrically powered scooter is physically located is further based on data received via a dedicated short range communication (DSRC) transceiver from at least one of another electrically powered scooter, a vehicle, or at least one infrastructure article.
[0113] Example 8. The method of example 1, wherein the infrastructure data includes data indicative of at least one infrastructure article comprising at least one of a roadway sign, a license plate, or conspicuity tape.
[0114] Example 9. The method of example 1, wherein the location is a bicycle pathway adjacent to a vehicle driving lane.
[0115] Example 10. The method of example 1, wherein determining whether the electrically powered scooter is permitted to be in the location in which the electrically powered scooter is physically located is based at least in part on a global positioning system (GPS) location of the electrically powered scooter.
[0116] Example 11. The method of example 1, wherein performing the at least one operation is based at least in part on detection of at least one of a vehicle, another electrically powered scooter, or a pedestrian. [0117] Example 12. The method of example 1, wherein performing the at least one operation comprises generating an output.
[0118] Example 13. The method of example 12, wherein generating the output is in response to determining that an amount of time the electrically powered scooter has been in the location in which the electrically powered scooter is not permitted satisfies a threshold time duration.
[0119] Example 14. The method of example 12, wherein generating the output is in response to determining that a confidence level indicative of a probability that the electrically powered scooter is not permitted to be in the location in which the electrically powered scooter is located satisfies a threshold confidence level.
[0120] Example 15. The method of example 12, wherein the output is at least one of audio, visual or haptic output.
[0121] Example 16. The method of example 1, wherein performing the at least one operation comprises sending a message to a remote computing device indicating that the electrically powered scooter is not at a physical location in which the electrically powered scooter is permitted to be physically located.
[0122] Example 17. The method of example 1, further comprising: determining, based at least in part on sensor data, a scooter-specific event that is relevant only to the particular class of vehicles configured with the set of characteristics that correspond to the electrically powered scooter; and performing at least one operation based at least in part on the determination of the scooter-specific event that is relevant only to the particular class of vehicles configured with the set of characteristics that correspond to the electrically powered scooter.
[0123] Example 18. The method of example 17, wherein the sensor is a first Dedicated Short- Range Communications transceiver, wherein determining the scooter-specific event comprises: receiving a message from a second Dedicated Short-Range Communications transceiver configured at the second electrically powered scooter; and determining the second electrically powered scooter is within the threshold distance of the first electrically powered scooter based at least in part on the message.
[0124] Example 19. The method of example 18, wherein determining the scooter-specific event comprises determining that the second electrically powered scooter is physically behind the first electrically powered scooter.
[0125] Example 20. The method of example 18, wherein determining the scooter-specific event comprises determining, based at least in part on detection of the proximity of the second electrically powered scooter and the sensor data, whether a collision between the first electrically powered scooter and the second electrically powered scooter will occur; and wherein performing at least one operation comprises generating an output, changing the operation of the motor controller that drives the at least one of the chassis-supported front wheel or rear wheel, or changing the operation of a braking apparatus.
[0126] Example 21. The method of example 17, wherein the set of characteristics is a first set of characteristics, the method further comprising: determining, based on the sensor data, that a vehicle proximate to the scooter is configured with a second set of characteristics that correspond to an automobile, wherein the first and second set of characteristics are different; wherein determining the scooter-specific event is based at least in part on the determination that the vehicle is configured with the second set of characteristics that correspond to the automobile.
[0127] Example 22. The method of example 21, wherein the at least one operation is a first operation, wherein the first operation is based at least in part on the determination that the vehicle is configured with the second set of characteristics that correspond to the automobile, wherein a second operation is based at least in part on an object configured with a third set of characteristics that do not correspond to the automobile, and wherein performing the at least one operation comprises performing the first operation based at least in part on determination that the vehicle is configured with the second set of characteristics that correspond to the automobile.
[0128] Example 23. The method of example 22, wherein the first operation is based at least in part on the determination that the vehicle is configured with the second set of characteristics that correspond to the automobile.
[0129] Example 24. The method of example 17, wherein performing the at least one operation comprises outputting at least one of audio, visual or haptic output.
[0130] Example 25. The method of example 17, wherein determining the scooter-specific event comprises determining the presence of a temporary traffic control zone in a path of the electrically powered scooter.
[0131] Example 26. The method of example 17, wherein the electrically powered scooter includes one or more portions of conspicuity tape comprising one or more characteristics within the set of characteristics that correspond to the electrically powered scooter.
[0132] Example 27. The method of example 17, wherein performing the at least one operation comprises sending a message to an automobile that indicates the presence of the electrically powered scooter.
[0133] Example 28. The method of example 17, wherein determining the scooter-specific event comprises determining whether a pavement marking is present or absent from a portion of a roadway.
[0134] Example 29. The method of example 17, wherein the set of characteristics is a first set of characteristics, the method further comprising: determining, based on the sensor data, that an object proximate to the scooter is characterized by a second set of characteristics that correspond to a person, wherein the first and second set of characteristics are different; and determining the scooter-specific event based at least in part on the determination that the object is characterized by the second set of characteristics that correspond to the person.
[0135] Example 30. The method of example 17, wherein performing the at least one operation comprises sending a message to a remote computing device indicating that the electrically powered scooter determined, based at least in part on the sensor data, the scooter-specific event.
[0136] Example 31. The method of example 30, wherein a user profile stored at the remote computing device is based at least in part on the message, and wherein the user profile is associated with a user of the electrically powered scooter and is usable to determine future use of another electrically powered scooter by the user.
[0137] Example 32. The method of example 30, wherein the message is usable by the remote computing device to determine a degree of non-compliance for a physical region based on a plurality of electrically powered scooters.
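The event-detection logic of Examples 17 through 20 can be illustrated with a short sketch. The following Python is purely illustrative and not part of the disclosure: the message fields, threshold values, and operation names (`PROXIMITY_THRESHOLD_M`, `handle_message`, and so on) are editorial assumptions, and a real Dedicated Short-Range Communications stack would supply the ranging and closing-speed data.

```python
# Illustrative sketch of the scooter-specific event logic of Examples 17-20.
# All names and thresholds below are assumptions for illustration only.
from dataclasses import dataclass

PROXIMITY_THRESHOLD_M = 5.0   # assumed threshold distance (Example 18)
TIME_TO_COLLISION_S = 2.0     # assumed time-to-collision limit (Example 20)

@dataclass
class DsrcMessage:
    sender_id: str
    distance_m: float          # range to the sending (second) scooter
    closing_speed_mps: float   # positive when the two scooters are converging
    bearing_deg: float         # 180 => sender is directly behind

def is_within_threshold(msg: DsrcMessage) -> bool:
    """Example 18: second scooter within the threshold distance."""
    return msg.distance_m <= PROXIMITY_THRESHOLD_M

def is_behind(msg: DsrcMessage) -> bool:
    """Example 19: second scooter physically behind the first."""
    return 90.0 < msg.bearing_deg < 270.0

def collision_predicted(msg: DsrcMessage) -> bool:
    """Example 20: collision predicted when the closing time is short."""
    if msg.closing_speed_mps <= 0:
        return False
    return msg.distance_m / msg.closing_speed_mps < TIME_TO_COLLISION_S

def handle_message(msg: DsrcMessage) -> str:
    """Map a received message to one operation of Example 20."""
    if not is_within_threshold(msg):
        return "no-op"
    if collision_predicted(msg):
        return "apply-brake"   # change operation of the braking apparatus
    if is_behind(msg):
        return "warn-rider"    # generate an audio/visual/haptic output
    return "no-op"

msg = DsrcMessage("scooter-2", distance_m=3.0, closing_speed_mps=2.0, bearing_deg=180.0)
print(handle_message(msg))  # 3.0 m / 2.0 m/s = 1.5 s < 2.0 s -> "apply-brake"
```

In this sketch, the choice among generating an output, changing the motor controller, or changing the braking apparatus (Example 20) collapses to a returned operation string; an on-scooter controller would dispatch on that result.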
[0138] In one or more examples, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over, as one or more instructions or code, a computer-readable medium and executed by a hardware-based processing unit. Computer-readable media may include computer-readable storage media, which corresponds to a tangible medium such as data storage media, or communication media including any medium that facilitates transfer of a computer program from one place to another, e.g., according to a communication protocol. In this manner, computer-readable media generally may correspond to (1) tangible computer-readable storage media, which is non-transitory, or (2) a communication medium such as a signal or carrier wave. Data storage media may be any available media that can be accessed by one or more computers or one or more processors to retrieve instructions, code, and/or data structures for implementation of the techniques described in this disclosure. A computer program product may include a computer-readable medium.
[0139] By way of example, and not limitation, such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage, or other magnetic storage devices, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if instructions are transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. It should be understood, however, that computer-readable storage media and data storage media do not include connections, carrier waves, signals, or other transient media, but are instead directed to non-transient, tangible storage media. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
[0140] Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term "processor," as used herein, may refer to any of the foregoing structure or any other structure suitable for implementation of the techniques described. In addition, in some aspects, the functionality described may be provided within dedicated hardware and/or software modules. Also, the techniques could be fully implemented in one or more circuits or logic elements.
[0141] The techniques of this disclosure may be implemented in a wide variety of devices or apparatuses, including a wireless handset, an integrated circuit (IC) or a set of ICs (e.g., a chip set). Various components, modules, or units are described in this disclosure to emphasize functional aspects of devices configured to perform the disclosed techniques, but do not necessarily require realization by different hardware units. Rather, as described above, various units may be combined in a hardware unit or provided by a collection of interoperative hardware units, including one or more processors as described above, in conjunction with suitable software and/or firmware.
[0142] It is to be recognized that depending on the example, certain acts or events of any of the methods described herein can be performed in a different sequence, may be added, merged, or left out altogether (e.g., not all described acts or events are necessary for the practice of the method). Moreover, in certain examples, acts or events may be performed concurrently, e.g., through multi-threaded processing, interrupt processing, or multiple processors, rather than sequentially.
[0143] In some examples, a computer-readable storage medium includes a non-transitory medium. The term "non-transitory" indicates, in some examples, that the storage medium is not embodied in a carrier wave or a propagated signal. In certain examples, a non-transitory storage medium stores data that can, over time, change (e.g., in RAM or cache).
[0144] Various examples have been described. These and other examples are within the scope of the following claims.

Claims

WHAT IS CLAIMED IS:
1. A system comprising:
an infrastructure article;
a micro-mobility device comprising a sensor configured to generate infrastructure data indicative of the infrastructure article; and
a computing device comprising a memory and one or more computer processors, wherein the memory comprises instructions that when executed by the one or more computer processors cause the one or more computer processors to:
receive, from the sensor, the infrastructure data;
determine, based at least in part on the infrastructure data, a type of a location in which the micro-mobility device is physically located; and
perform at least one operation based at least in part on the type of the location in which the micro-mobility device is physically located.
2. The system of claim 1, wherein the infrastructure article comprises at least one of a roadway sign, a license plate, conspicuity tape, or a hazard marker.
3. The system of claim 1, wherein the infrastructure article comprises at least a portion of a pavement marking.
4. The system of claim 1, wherein the type of the location in which the micro-mobility device is physically located comprises a pedestrian pathway or a driving lane of a vehicle pathway.
5. The system of claim 1, wherein the type of the location in which the micro-mobility device is physically located comprises a roadway, and wherein execution of the instructions causes the one or more computer processors to perform the at least one operation by at least causing the one or more computer processors to perform the operation further based on information indicative of a type of the roadway, wherein the information indicative of the type of the roadway comprises information indicating a road surface type, the presence of an accident, the presence of a construction zone, or types of vehicles permitted or present on the roadway.
6. An electrically powered scooter comprising:
a scooter chassis having a rear wheel mount at one end and a front wheel mount at the other end with a chassis support member extending therebetween; a chassis-supported front wheel mounted to the front wheel mount for turning steering movement with respect to the front wheel mount and a chassis-supported rear wheel;
a chassis-supported motor physically coupled to the scooter chassis and configured by a motor controller to drive at least one of the chassis-supported front wheel or chassis-supported rear wheel for powered movement over a ground surface; a sensor configured to generate infrastructure data indicative of infrastructure proximate to the electrically powered scooter;
a computing device comprising a memory and one or more computer processors, wherein the memory comprises instructions that when executed by the one or more computer processors cause the one or more computer processors to:
receive, from the sensor, the infrastructure data;
determine, based at least in part on the infrastructure data, a type of a location in which the electrically powered scooter is physically located; and perform at least one operation based at least in part on the type of the location in which the electrically powered scooter is physically located.
7. The electrically powered scooter of claim 6, wherein execution of the instructions causes the one or more computer processors to:
determine, based at least in part on the type of the location, whether the electrically powered scooter is permitted to be in the location in which the electrically powered scooter is physically located; and
perform the at least one operation in response to determining that the electrically powered scooter is not permitted to be in the location in which the electrically powered scooter is located.
8. The electrically powered scooter of claim 7, wherein a type of location in which the electrically powered scooter is not permitted to operate comprises at least one of a pedestrian pathway or a driving lane of a vehicle pathway.
9. The electrically powered scooter of claim 7,
wherein the infrastructure data includes data indicative of at least one infrastructure article comprising at least a portion of a pavement marking;
wherein to determine whether the electrically powered scooter is permitted to be in the location in which the electrically powered scooter is physically located, the memory comprises instructions that cause the one or more computer processors to determine whether the electrically powered scooter is within a threshold distance of the portion of the pavement marking.
10. The electrically powered scooter of claim 6,
wherein to perform the at least one operation, the memory comprises instructions that cause the one or more computer processors to adjust operation of the motor or adjust operation of a braking apparatus to slow the at least one of the chassis-supported front wheel or chassis-supported rear wheel.
11. The electrically powered scooter of claim 6,
wherein the sensor comprises an image sensor, and
wherein to determine the type of the location in which the electrically powered scooter is physically located, the memory comprises instructions that cause the one or more computer processors to:
determine, based at least in part on the infrastructure data received from the sensor, a distance between the electrically powered scooter and the at least one infrastructure article; and determine the type of the location based at least in part on the distance.
12. The electrically powered scooter of claim 6, wherein to determine the type of the location in which the electrically powered scooter is physically located, the memory comprises instructions that cause the one or more computer processors to:
determine the type of the location in which the electrically powered scooter is physically located further based on data received via a dedicated short range communication (DSRC) transceiver from at least one of another electrically powered scooter, a vehicle, or at least one infrastructure article.
13. The electrically powered scooter of claim 6, wherein the infrastructure data includes data indicative of at least one infrastructure article comprising at least one of a roadway sign, a license plate, or conspicuity tape.
14. The electrically powered scooter of claim 6, wherein the location is a bicycle pathway adjacent to a vehicle driving lane.
15. The electrically powered scooter of claim 6,
wherein the determination of whether the electrically powered scooter is permitted to be in the location in which the electrically powered scooter is physically located is based at least in part on a global positioning system (GPS) location of the electrically powered scooter.
16. The electrically powered scooter of claim 6, wherein the memory comprises instructions that cause the one or more computer processors to perform the at least one operation based at least in part on detection of at least one of a vehicle, another electrically powered scooter, or a pedestrian.
17. The electrically powered scooter of claim 6, wherein to perform the at least one operation, the memory comprises instructions that cause the one or more computer processors to generate an output.
18. The electrically powered scooter of claim 17, wherein the memory comprises instructions that cause the one or more computer processors to generate the output in response to determining that an amount of time the electrically powered scooter has been in the location in which the electrically powered scooter is not permitted satisfies a threshold time duration.
19. The electrically powered scooter of claim 17, wherein the memory comprises instructions that cause the one or more computer processors to generate the output in response to determining that a confidence level indicative of a probability that the electrically powered scooter is not permitted to be in the location in which the electrically powered scooter is located satisfies a threshold confidence level.
20. The electrically powered scooter of claim 17, wherein the output is at least one of audio, visual or haptic output.
21. The electrically powered scooter of claim 6, wherein to perform the at least one operation, the memory comprises instructions that cause the one or more computer processors to send a message to a remote computing device indicating that the electrically powered scooter is not at a physical location in which the electrically powered scooter is permitted to be physically located.
22. The electrically powered scooter of claim 6, wherein the memory comprises instructions that cause the one or more computer processors to:
determine, based at least in part on sensor data, a scooter-specific event that is relevant only to the particular class of vehicles configured with the set of characteristics that correspond to the electrically powered scooter; and
perform at least one operation based at least in part on the determination of the scooter- specific event that is relevant only to the particular class of vehicles configured with the set of characteristics that correspond to the electrically powered scooter.
23. The electrically powered scooter of claim 22,
wherein the sensor is a first Dedicated Short-Range Communications transceiver, wherein to determine the scooter-specific event, the memory comprises instructions that cause the one or more computer processors to:
receive a message from a second Dedicated Short-Range Communications transceiver configured at a second electrically powered scooter; and
determine the second electrically powered scooter is within a threshold distance of the first electrically powered scooter based at least in part on the message.
24. The electrically powered scooter of claim 23,
wherein to determine the scooter-specific event, the memory comprises instructions that cause the one or more computer processors to determine that the second electrically powered scooter is physically behind the first electrically powered scooter.
25. The electrically powered scooter of claim 23,
wherein to determine the scooter-specific event, the memory comprises instructions that cause the one or more computer processors to determine, based at least in part on detection of the proximity of the second electrically powered scooter and the sensor data, whether a collision between the first electrically powered scooter and the second electrically powered scooter will occur; and
wherein to perform at least one operation based at least in part on the determination of the scooter-specific event, the memory comprises instructions that cause the one or more computer processors to at least generate an output, change the operation of the motor controller that drives the at least one of the chassis-supported front wheel or rear wheel, or change the operation of a braking apparatus.
26. The electrically powered scooter of claim 22, wherein the set of characteristics is a first set of characteristics, wherein the memory comprises instructions that cause the one or more computer processors to:
determine, based on the sensor data, that a vehicle proximate to the scooter is configured with a second set of characteristics that correspond to an automobile, wherein the first and second set of characteristics are different; and
wherein to determine, based at least in part on the sensor data, the scooter-specific event, the memory comprises instructions that cause the one or more computer processors to determine the scooter-specific event based at least in part on the determination that the vehicle is configured with the second set of characteristics that correspond to the automobile.
27. The electrically powered scooter of claim 26,
wherein the at least one operation is a first operation, wherein the first operation is based at least in part on the determination that the vehicle is configured with the second set of characteristics that correspond to the automobile,
wherein a second operation is based at least in part on an object configured with a third set of characteristics that do not correspond to the automobile, and
wherein to perform at least one operation, the memory comprises instructions that cause the one or more computer processors to perform the first operation based at least in part on determination that the vehicle is configured with the second set of characteristics that correspond to the automobile.
28. The electrically powered scooter of claim 27,
wherein the first operation is based at least in part on the determination that the vehicle is configured with the second set of characteristics that correspond to the automobile.
29. The electrically powered scooter of claim 22, wherein to perform the at least one operation, the memory comprises instructions that cause the one or more computer processors to output at least one of audio, visual or haptic output.
30. The electrically powered scooter of claim 22, wherein to determine, based at least in part on the sensor data, the scooter-specific event, the memory comprises instructions that cause the one or more computer processors to determine the presence of a temporary traffic control zone in a path of the electrically powered scooter.
31. The electrically powered scooter of claim 22, further configured with one or more portions of conspicuity tape comprising one or more characteristics within the set of
characteristics that correspond to the electrically powered scooter.
32. The electrically powered scooter of claim 22, wherein to perform the at least one operation, the memory comprises instructions that cause the one or more computer processors to send a message to an automobile that indicates the presence of the electrically powered scooter.
33. The electrically powered scooter of claim 23, wherein to determine the scooter-specific event, the memory comprises instructions that cause the one or more computer processors to determine whether a pavement marking is present or absent from a portion of a roadway.
34. The electrically powered scooter of claim 22, wherein the set of characteristics is a first set of characteristics, wherein the memory comprises instructions that cause the one or more computer processors to: determine, based on the sensor data, that an object proximate to the scooter is characterized by a second set of characteristics that correspond to a person, wherein the first and second set of characteristics are different; and
wherein to determine, based at least in part on the sensor data, the scooter-specific event, the memory comprises instructions that cause the one or more computer processors to determine the scooter-specific event based at least in part on the determination that the object is characterized by the second set of characteristics that correspond to the person.
35. The electrically powered scooter of claim 22, wherein to perform the at least one operation, the memory comprises instructions that cause the one or more computer processors to send a message to a remote computing device indicating that the electrically powered scooter determined, based at least in part on the sensor data, the scooter-specific event.
36. The electrically powered scooter of claim 35, wherein a user profile stored at the remote computing device is based at least in part on the message, and wherein the user profile is associated with a user of the electrically powered scooter and is usable to determine future use of another electrically powered scooter by the user.
37. A computing device, comprising:
memory; and
one or more processors connected to the memory, wherein the memory includes instructions that, when executed by the one or more processors, cause the computing device to:
receive, from a sensor, infrastructure data indicative of infrastructure proximate to an electrically powered scooter;
determine, based at least in part on the infrastructure data, a type of a location in which the electrically powered scooter is physically located; and perform at least one operation based at least in part on the type of the location in which the electrically powered scooter is physically located.
38. The computing device of claim 37, wherein execution of the instructions causes the one or more processors to:
determine, based at least in part on the type of the location, whether the electrically powered scooter is permitted to be in the location in which the electrically powered scooter is physically located; and
perform the at least one operation in response to determining that the electrically powered scooter is not permitted to be in the location in which the electrically powered scooter is located.
39. The computing device of claim 38, wherein execution of the instructions causes the one or more processors to determine whether the electrically powered scooter is permitted to be in the location in which the electrically powered scooter is physically located by at least causing the one or more processors to:
determine, based at least in part on a map comprising a pre-defined set of delineated regions, that the electrically powered scooter has been placed by a user within a particular delineated region of the pre-defined set of delineated regions;
select, based at least in part on the determination that the scooter has been placed by a user within the particular delineated region, a reward for the user; and
store an indication of the reward for the user in association with an account of the user.
40. The computing device of claim 39, wherein the particular delineated region is a designated parking zone to return electrically powered scooters.
41. The computing device of claim 38, wherein a type of location in which the electrically powered scooter is not permitted to operate comprises at least one of a pedestrian pathway or a driving lane of a vehicle pathway.
42. The computing device of claim 38,
wherein the infrastructure data includes data indicative of at least one infrastructure article comprising at least a portion of a pavement marking;
wherein execution of the instructions causes the one or more processors to determine whether the electrically powered scooter is permitted to be in the location in which the electrically powered scooter is physically located by at least causing the one or more processors to determine whether the electrically powered scooter is within a threshold distance of the portion of the pavement marking.
43. The computing device of claim 37,
wherein execution of the instructions causes the one or more processors to perform the at least one operation by at least causing the one or more processors to adjust operation of the motor or adjust operation of a braking apparatus to slow the at least one of the chassis-supported front wheel or chassis-supported rear wheel.
44. The computing device of claim 37,
wherein the sensor comprises an image sensor, and
wherein execution of the instructions causes the one or more processors to determine the type of the location in which the electrically powered scooter is physically located by at least causing the one or more processors to:
determine, based at least in part on the infrastructure data received from the sensor, a distance between the electrically powered scooter and the at least one infrastructure article; and determine the type of the location based at least in part on the distance.
45. The computing device of claim 37, wherein execution of the instructions causes the one or more processors to determine the type of the location in which the electrically powered scooter is physically located based on data received via a dedicated short range communication (DSRC) transceiver from at least one of another electrically powered scooter, a vehicle, or at least one infrastructure article.
46. The computing device of claim 37, wherein the infrastructure data includes data indicative of at least one infrastructure article comprising at least one of a roadway sign, a license plate, or conspicuity tape.
47. The computing device of claim 37, wherein the location is a bicycle pathway adjacent to a vehicle driving lane.
48. The computing device of claim 37,
wherein execution of the instructions causes the one or more processors to determine whether the electrically powered scooter is permitted to be in the location in which the electrically powered scooter is physically located based at least in part on a global positioning system (GPS) location of the electrically powered scooter.
49. The computing device of claim 37, wherein execution of the instructions causes the one or more processors to perform the at least one operation based at least in part on detection of at least one of a vehicle, another electrically powered scooter, or a pedestrian.
50. The computing device of claim 37, wherein execution of the instructions causes the one or more processors to perform the at least one operation by at least causing the one or more processors to generate an output.
51. The computing device of claim 50, wherein execution of the instructions causes the one or more processors to generate an output in response to determining that an amount of time the electrically powered scooter has been in the location in which the electrically powered scooter is not permitted satisfies a threshold time duration.
52. The computing device of claim 50, wherein execution of the instructions causes the one or more processors to generate the output in response to determining that a confidence level indicative of a probability that the electrically powered scooter is not permitted to be in the location in which the electrically powered scooter is located satisfies a threshold confidence level.
53. The computing device of claim 50, wherein the output is at least one of audio, visual or haptic output.
54. The computing device of claim 37, wherein execution of the instructions causes the one or more processors to perform the at least one operation by at least causing the one or more processors to send a message to a remote computing device indicating that the electrically powered scooter is not at a physical location in which the electrically powered scooter is permitted to be physically located.
55. The computing device of claim 37, wherein execution of the instructions causes the one or more processors to:
determine, based at least in part on sensor data, a scooter-specific event that is relevant only to the particular class of vehicles configured with the set of characteristics that correspond to the electrically powered scooter; and
perform at least one operation based at least in part on the determination of the scooter- specific event that is relevant only to the particular class of vehicles configured with the set of characteristics that correspond to the electrically powered scooter.
56. The computing device of claim 55,
wherein the sensor is a first Dedicated Short-Range Communications transceiver, wherein execution of the instructions causes the one or more processors to determine the scooter-specific event by at least causing the one or more processors to:
receive a message from a second Dedicated Short-Range Communications transceiver configured at a second electrically powered scooter; and
determine the second electrically powered scooter is within a threshold distance of the first electrically powered scooter based at least in part on the message.
57. The computing device of claim 56,
wherein execution of the instructions causes the one or more processors to determine the scooter-specific event by at least causing the one or more processors to determine that the second electrically powered scooter is physically behind the first electrically powered scooter.
58. The computing device of claim 57,
wherein execution of the instructions causes the one or more processors to determine the scooter-specific event by at least causing the one or more processors to determine, based at least in part on detection of the proximity of the second electrically powered scooter and the sensor data, whether a collision between the first electrically powered scooter and the second electrically powered scooter will occur; and
wherein execution of the instructions causes the one or more processors to perform at least one operation by at least causing the one or more processors to at least generate an output, change the operation of the motor controller that drives the at least one of the chassis-supported front wheel or rear wheel, or change the operation of a braking apparatus.
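The proximity, relative-position, and collision determinations recited in claims 56–58 can be illustrated with a minimal sketch. This is not the patent's implementation; the `ScooterState` type, the threshold, and the constant-speed closing model are all assumptions made for illustration, with position and velocity taken along a shared direction of travel as reported, e.g., in a DSRC message.

```python
from dataclasses import dataclass


@dataclass
class ScooterState:
    """Position (m) and velocity (m/s) along the direction of travel."""
    position: float
    velocity: float


def is_within_threshold(first: ScooterState, second: ScooterState,
                        threshold_m: float = 10.0) -> bool:
    """Claim 56: the second scooter is within a threshold distance of the first."""
    return abs(first.position - second.position) <= threshold_m


def is_behind(first: ScooterState, second: ScooterState) -> bool:
    """Claim 57: the second scooter is physically behind the first."""
    return second.position < first.position


def collision_predicted(first: ScooterState, second: ScooterState,
                        horizon_s: float = 3.0) -> bool:
    """Claim 58: predict whether the trailing scooter closes the gap
    within the horizon, assuming both hold their current speeds."""
    gap = first.position - second.position
    closing_speed = second.velocity - first.velocity
    if closing_speed <= 0:
        return False  # gap is not shrinking under this model
    return gap / closing_speed <= horizon_s
```

On a positive determination, the device would then perform the recited operation, e.g., generate an output or change the motor-controller or braking behavior.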
59. The computing device of claim 55, wherein the set of characteristics is a first set of characteristics, wherein execution of the instructions causes the one or more processors to:
determine, based on the sensor data, that a vehicle proximate to the scooter is configured with a second set of characteristics that correspond to an automobile, wherein the first and second sets of characteristics are different; and
wherein execution of the instructions causes the one or more processors to determine the scooter-specific event by at least causing the one or more processors to determine the scooter-specific event based at least in part on the determination that the vehicle is configured with the second set of characteristics that correspond to the automobile.
60. The computing device of claim 55,
wherein the at least one operation is a first operation,
wherein the first operation is based at least in part on the determination that the vehicle is configured with the second set of characteristics that correspond to the automobile,
wherein a second operation is based at least in part on an object configured with a third set of characteristics that do not correspond to the automobile, and
wherein execution of the instructions causes the one or more processors to perform at least one operation by at least causing the one or more processors to perform the first operation based at least in part on determination that the vehicle is configured with the second set of characteristics that correspond to the automobile.
61. The computing device of claim 60,
wherein the first operation is based at least in part on the determination that the vehicle is configured with the second set of characteristics that correspond to the automobile.
62. The computing device of claim 55, wherein execution of the instructions causes the one or more processors to perform the at least one operation by at least causing the one or more processors to output at least one of audio, visual or haptic output.
63. The computing device of claim 55, wherein execution of the instructions causes the one or more processors to determine the scooter-specific event by at least causing the one or more processors to determine the presence of a temporary traffic control zone in a path of the electrically powered scooter.
64. The computing device of claim 55, further configured with one or more portions of conspicuity tape comprising one or more characteristics within the set of characteristics that correspond to the electrically powered scooter.
65. The computing device of claim 55, wherein execution of the instructions causes the one or more processors to perform the at least one operation by at least causing the one or more processors to send a message to an automobile that indicates the presence of the electrically powered scooter.
66. The computing device of claim 55, wherein execution of the instructions causes the one or more processors to determine the scooter-specific event by at least causing the one or more processors to determine whether a pavement marking is present or absent from a portion of a roadway.
67. The computing device of claim 55, wherein the set of characteristics is a first set of characteristics, wherein execution of the instructions causes the one or more processors to:
determine, based on the sensor data, that an object proximate to the scooter is characterized by a second set of characteristics that correspond to a person, wherein the first and second sets of characteristics are different; and
wherein execution of the instructions causes the one or more processors to determine the scooter-specific event by at least causing the one or more processors to determine the scooter-specific event based at least in part on the determination that the object is characterized by the second set of characteristics that correspond to the person.
68. The computing device of claim 55, wherein execution of the instructions causes the one or more processors to perform the at least one operation by at least causing the one or more processors to send a message to a remote computing device indicating that the electrically powered scooter determined, based at least in part on the sensor data, the scooter-specific event.
69. The computing device of claim 68, wherein a user profile stored at the remote computing device is based at least in part on the message, and wherein the user profile is associated with a user of the electrically powered scooter and is usable to determine future use of another electrically powered scooter by the user.
70. The computing device of claim 68, wherein the message is usable by the remote computing device to determine a degree of non-compliance for a physical region based on a plurality of electrically powered scooters.
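The per-region aggregation recited in claim 70 can be sketched briefly. The claim does not define "degree of non-compliance"; the sketch below assumes one plausible metric, a simple count of event reports per physical region, and the tuple shape of the incoming messages is likewise an assumption made for illustration.

```python
from collections import defaultdict


def degree_of_noncompliance(events):
    """Tally scooter-specific event reports per physical region.

    `events` is an iterable of (region_id, scooter_id) tuples, one per
    message received from an electrically powered scooter. The degree
    here is simply the report count per region -- one possible metric.
    """
    counts = defaultdict(int)
    for region_id, _scooter_id in events:
        counts[region_id] += 1
    return dict(counts)
```

A remote computing device receiving such messages from a plurality of scooters could use the resulting per-region counts to rank regions for intervention, or to update the user profiles of claim 69.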
PCT/IB2020/053372 2019-04-19 2020-04-08 Dynamically controlling electrically powered scooters based on sensed infrastructure data WO2020212808A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962836495P 2019-04-19 2019-04-19
US62/836,495 2019-04-19

Publications (1)

Publication Number Publication Date
WO2020212808A1 true WO2020212808A1 (en) 2020-10-22

Family

ID=70289829

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2020/053372 WO2020212808A1 (en) 2019-04-19 2020-04-08 Dynamically controlling electrically powered scooters based on sensed infrastructure data

Country Status (1)

Country Link
WO (1) WO2020212808A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023055803A1 (en) * 2021-09-30 2023-04-06 Snap Inc. Ar based performance modulation of a personal mobility system
US11900550B2 (en) 2021-09-30 2024-02-13 Snap Inc. AR odometry using sensor data from a personal vehicle

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102012220011A1 (en) * 2012-11-02 2014-05-08 Robert Bosch Gmbh Operating device e.g. smart phone for operating e.g. electric bicycle, has interface unit that is provided to control electromotor of electric vehicle and to limit speed of electric vehicle based on determined environmental parameter
US20170039631A1 (en) * 2015-08-04 2017-02-09 Gogoro Inc. Apparatus, method and article for electric vehicle sharing
JP2017100490A (en) * 2015-11-30 2017-06-08 パイオニア株式会社 Speed control device
CN108482372A (en) * 2018-03-21 2018-09-04 广东欧珀移动通信有限公司 Travel speed control method and device, electronic device and readable storage medium storing program for executing


Similar Documents

Publication Publication Date Title
US11854393B2 (en) Road hazard communication
US9715827B2 (en) Multi-view traffic signage
JP4967015B2 (en) Safe driving support device
US9809165B1 (en) System and method for minimizing driver distraction of a head-up display (HUD) in a vehicle
US20170032673A1 (en) Driver behavior sharing
US9601011B1 (en) Monitoring and reporting slow drivers in fast highway lanes
JP6834860B2 (en) Collision prevention device, collision prevention method, collision prevention program, recording medium
US9761134B2 (en) Monitoring and reporting slow drivers in fast highway lanes
US20210247199A1 (en) Autonomous navigation systems for temporary zones
JP7452455B2 (en) Control device, method, and program
WO2020212808A1 (en) Dynamically controlling electrically powered scooters based on sensed infrastructure data
JP5362225B2 (en) Operation recording device and operation status recording method
US20220135077A1 (en) Increasing awareness of passengers during pullovers and drop offs for autonomous vehicles
US20200377012A1 (en) Multimodal vehicle-to-pedestrian notification system
TW201814669A (en) Roadside display system, roadside unit and roadside display method thereof
JP2020530926A (en) Warning for vulnerable traffic in dangerous situations
JP2010146459A (en) Driving support device
JP2016143091A (en) Driving support system and on-vehicle information processing apparatus
US11756402B2 (en) Operator proficiency-based infrastructure articles
US20220215753A1 (en) Incentive-driven roadway condition monitoring for improved safety of micromobility device operation
US20220299630A1 (en) Radar-optical fusion article and system
US20230398866A1 (en) Systems and methods for heads-up display
JP2010198087A (en) Driving support device
WO2021076734A1 (en) Method for aligning camera and sensor data for augmented reality data visualization
CN112735105A (en) Parking lot early warning system, method and device, server and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20719503

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20719503

Country of ref document: EP

Kind code of ref document: A1