WO2022162657A1 - Automatic creation of a terrain mapping database

Automatic creation of a terrain mapping database

Info

Publication number
WO2022162657A1
WO2022162657A1 PCT/IL2022/050097 IL2022050097W WO2022162657A1 WO 2022162657 A1 WO2022162657 A1 WO 2022162657A1 IL 2022050097 W IL2022050097 W IL 2022050097W WO 2022162657 A1 WO2022162657 A1 WO 2022162657A1
Authority
WO
WIPO (PCT)
Prior art keywords
human-scale vehicle
route segments
terrain
user
Prior art date
Application number
PCT/IL2022/050097
Other languages
English (en)
Inventor
Ronen Bitan
Original Assignee
Trailze Ltd
Priority date
Filing date
Publication date
Priority claimed from US 17/160,406 (US 12,025,446 B2)
Application filed by Trailze Ltd
Publication of WO2022162657A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/29 Geographical information databases
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/42 Determining position

Definitions

  • the present invention relates to the field of navigation through a terrain and, more particularly, to navigation through a terrain using a human-scale vehicle.
  • a database for mapping off-road terrain of various characteristics in three-dimensional terms comprising: a plurality of road segment entries, each containing data pertaining to the terrain characteristics of the segment; and a plurality of elbow entries, each containing (x, y, z) coordinates of the elbow and a record for each road segment having the elbow as one of its end points, the record comprising navigation directives for vehicles entering the segment from the elbow.
  • each one of the road segment entries further comprises pointers to elbow entries of its end points.
  • the navigation directives include consideration of the vehicle type.
  • the navigation directives include consideration of the road segment that led the vehicle to the elbow.
  • the navigation directives include 3D considerations.
  • the navigation directives include consideration of user skills.
  • the terrain characteristics within each the segment are substantially homogenous and are configured to be traversed using a single set of the directives.
  • the directives are configured to be defined according to indications selected from the group consisting of: mounting slope, descending slope, sharp turn, rocky terrain, bridge over river, gravel, mud and hiking section.
  • the indications are configured to be translated into the directives depending on the type of the vehicles.
  • the directives are stored in the database along with references to the appropriate type of the vehicle.
  • the terrain characteristics further comprise at least one of sand, gravel and rock.
  • the navigation directives for the road segments are configured to be continuously derived from various sensors, cameras and microphones carried or worn by a traveler.
  • the sensors are selected from the group consisting of: motion sensors, environmental sensors, position sensors and wearable physiological monitoring sensors.
  • the motion sensors are selected from the group consisting of: accelerometers, gravity sensors, gyroscopes and rotational vector sensors.
  • the environmental sensors are selected from the group consisting of: barometers, photometers and thermometers.
  • the position sensors are selected from the group consisting of: orientation sensors, magnetometers, Global Positioning System (GPS), European Geostationary Navigation Overlay Service (EGNOS) and Global Navigation Satellite System (GLONASS).
  • the wearable physiological monitoring sensors are configured to measure physiological parameters selected from the group consisting of: electrocardiogram (ECG), heart rate, blood pressure and body temperature.
  • a method of creating and continuously updating a database for mapping off-road terrain of various characteristics in three-dimensional terms comprising: defining a plurality of road segments, each road segment defining two bounding elbows, using at least one of maps and recorded trails; for each road segment: defining preliminary terrain characteristics; for each elbow defined by the road segment: defining preliminary navigation directives for vehicles entering the segment from the elbow; and continuously updating the database using at least one of recorded trails and data from sensors carried or worn by travelers.
  • the updating comprises updating segment definitions according to recorded trails intersecting existing segments.
  • the updating comprises updating segment definitions according to sensor data indicating a change in terrain characteristics within existing road segments.
  • the sensors are selected from the group consisting of: motion sensors, environmental sensors, position sensors and wearable physiological monitoring sensors.
  • Some embodiments of the present invention may provide a method of navigating through a terrain using a human-scale vehicle, the method may include: receiving an origin point and one or more destination points in the terrain; selecting route segments of a plurality of predefined route segments to navigate a user from the origin point to the one or more destination points in the terrain using the human-scale vehicle; obtaining motion data from one or more motion sensors disposed on at least one of the user and the human-scale vehicle during traveling of the human-scale vehicle along the selected route segments; and determining, based on at least a portion of the obtained motion data, terrain characteristics of the selected route segments.
  • Some embodiments may include updating the selection of route segments based on the determined terrain characteristics.
  • Some embodiments may include updating one or more of the predefined route segments based on the terrain characteristics determined for the respective one or more selected route segments.
  • Some embodiments may include determining, based on at least a portion of the obtained motion data, that the human-scale vehicle has been driven on a sidewalk.
  • Some embodiments may include identifying locations in the terrain in which the human-scale vehicle has been driven on the sidewalk.
  • Some embodiments may include determining, based on at least a portion of the obtained motion data, that the human-scale vehicle has crossed a predefined 3D parking pattern bordering a parking zone within the terrain.
  • Some embodiments may include determining, based on geolocation data from one or more geolocation sensors disposed on at least one of the user and the human-scale vehicle, that the human-scale vehicle is within the parking zone.
  • Some embodiments may include preventing the user from locking the human-scale vehicle if the human-scale vehicle is not within the parking zone.
  • Some embodiments may include determining, based on geolocation data from one or more geolocation sensors disposed on at least one of the user and the human-scale vehicle, that the human-scale vehicle has left the parking zone.
  • Some embodiments may include preventing the user from unlocking the human-scale vehicle if the human-scale vehicle is not within the parking zone.
  • Some embodiments may include detecting one or more driving-related events based on at least a portion of the obtained motion data.
  • Some embodiments may include updating the selection of the route segments based on the one or more detected driving-related events.
  • Some embodiments may include updating one or more of the predefined route segments based on the one or more driving-related events detected in the respective one or more selected route segments.
  • Some embodiments may include at least one of selecting the route segments and updating the selection thereof based on known driving skills of the user of the human-scale vehicle.
  • Some embodiments may include updating one or more of the predefined route segments based on at least a portion of the obtained motion data and based on known driving skills of the user.
  • Some embodiments may include generating navigation instructions based on the selected route segments.
  • Some embodiments may include updating the navigation instructions based on the determined terrain characteristics.
  • Some embodiments may include at least one of generating and updating the navigation instructions based on known driving skills of the user.
  • Some embodiments may include: receiving a plurality of ride datasets for the user of the human-scale vehicle, wherein each of the ride datasets may include selected route segments and motion data obtained during traveling of the human-scale vehicle along the selected route segments; and determining driving patterns for the user based on at least a portion of the ride datasets.
  • Some embodiments may include: receiving a plurality of ride datasets for multiple users of human-scale vehicles, wherein each of the ride datasets may include selected route segments and motion data obtained during traveling of the human-scale vehicle along the selected route segments; and at least one of defining new route segments and updating the predefined route segments based on at least a portion of the ride datasets.
  • Fig. 1 is a schematic block diagram of the database according to the present invention;
  • Fig. 2 is a schematic representation of a partial trails map;
  • Figs. 3A and 3B show exemplary database entries describing the partial trails of Fig. 2;
  • Fig. 4 is a schematic block diagram showing the various sources contributing to the creation and the on-going updating of the database;
  • Fig. 5 shows an example of a segment being divided into two segments;
  • Figs. 6A and 6B show exemplary database entries describing the segment division of Fig. 5;
  • Fig. 7 is a block diagram of a system for navigating through a terrain using a human-scale vehicle, according to some embodiments of the invention;
  • Fig. 8 shows graphs of motion data generatable by one or more motion sensors during motion of the human-scale vehicle, wherein the shown motion data is indicative of terrain characteristics and driving-related events, according to some embodiments of the invention;
  • Fig. 9 shows graphs of motion data generatable by one or more motion sensors during motion of the human-scale vehicle, wherein the shown motion data is indicative of an emergency brake of the human-scale vehicle, according to some embodiments of the invention;
  • Fig. 10 shows graphs of motion data generatable by one or more motion sensors during motion of the human-scale vehicle, wherein the shown motion data is indicative of a fall of the human-scale vehicle, according to some embodiments of the invention;
  • Fig. 11 shows graphs of motion data generatable by one or more motion sensors during motion of the human-scale vehicle, wherein the shown motion data is indicative of driving on a sidewalk, according to some embodiments of the invention;
  • Fig. 12 is a flowchart of a method of navigating through a terrain using a human-scale vehicle, according to some embodiments of the invention;
  • Fig. 13 is a flowchart of a method of determining that a human-scale vehicle has been driven on a sidewalk, according to some embodiments of the invention; and
  • Fig. 14 is a flowchart of a method of determining that a human-scale vehicle has crossed a predefined three-dimensional (3D) parking pattern of a parking zone, according to some embodiments of the invention.
  • the present invention provides a novel database for mapping terrain of various characteristics in three-dimensional terms.
  • the database is constructed automatically in the system by analyzing previously recorded travelers' trails and current feedback from sensors, as will be explained in detail below.
  • Fig. 1 is a schematic block diagram of the database 100 according to the present invention, comprising road segments 110 and elbows 120.
  • Each road segment entry 110 contains data pertaining to the terrain characteristics of the segment.
  • the terrain characteristics within each segment are substantially homogeneous and may be traversed using a single set of directives.
  • Each road segment entry may also optionally point to its two bounding elbows (end points).
  • Terrain characteristics may be, for example, sand, gravel, rock, etc.
  • Each elbow entry 120 includes (x, y, z) coordinates of the elbow and a record for each road segment having the elbow as one of its end points, the record including navigation directives for vehicles (or pedestrians) entering the segment from the elbow.
  • the directives may take into consideration, for example, the type of the vehicle, the road segment that led the vehicle to the elbow, 3D aspects of the terrain and the skills of the user, as sketched below.
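  • As a non-authoritative illustration, the segment and elbow entries described above could be represented as follows; the class and field names (RoadSegment, Elbow, TrailsDatabase, etc.) are hypothetical and not part of the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class RoadSegment:
    segment_id: str
    terrain: dict        # terrain characteristics, e.g. {"surface": "gravel"}
    elbow_ids: tuple     # pointers to the two bounding elbow entries (end points)

@dataclass
class Elbow:
    elbow_id: str
    xyz: tuple           # (x, y, z) coordinates of the elbow
    # One record per road segment having this elbow as an end point:
    # segment_id -> navigation directives for vehicles entering that segment from
    # here, optionally keyed further by vehicle type and user skill level.
    records: dict = field(default_factory=dict)

@dataclass
class TrailsDatabase:
    segments: dict = field(default_factory=dict)   # segment_id -> RoadSegment
    elbows: dict = field(default_factory=dict)     # elbow_id -> Elbow

# Example entries in the spirit of Figs. 2, 3A and 3B (identifiers are illustrative only).
db = TrailsDatabase()
db.elbows["E1"] = Elbow("E1", (0.0, 0.0, 12.0),
                        records={"S1": {"mountain_bike": ["descending slope", "gravel"]}})
db.segments["S1"] = RoadSegment("S1", {"surface": "gravel"}, ("E1", "E2"))
```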
  • Fig. 2 is a schematic representation of a partial 3D trails map described by the exemplary database entries in Figs. 3A and 3B.
  • the database 100 is continuously updated, as will be explained below.
  • Fig. 4 is a schematic block diagram showing the various sources contributing to the creation and the on-going updating of the database 100.
  • Maps 410 - The basic database may be constructed using existing trails databases and/or user-generated content or previously selected trails.
  • Existing trails databases may comprise various available Digital Elevation Models (DEM), such as the NASA DEM or the Canadian Digital Elevation Model (CDEM), which are digital representations of elevations as measured at ground level, or a database of GPS data collected from users. Such elevations are calculated using a variety of methods, including stereoscopy, digitized contour data, GPS data, radar signal interpretation or other methods for extracting elevation from a given position.
  • Data sets for a region or a predefined area may be obtained using specific data extraction tools such as those found on the web site geogratis.gc.ca.
  • Recorded trails 420 - Segment and elbow definitions may be continuously updated using recorded routes taken by travelers (e.g., by replaying captured location information). For example, if a recorded trail indicates traversing an existing segment, the segment may be divided into two segments connected by a new elbow. In the example of Fig. 5, using the partial map of Fig. 2 as a base, a new recorded trail T510 intersecting segment S220 causes the creation of a new elbow E540 which divides the previous segment S220 into two new segments S520 and S530. Figs. 6A and 6B show the resulting updated database entries.
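  • A minimal sketch of such a split, kept self-contained on plain dictionaries; the segment and elbow identifiers below only loosely follow the figure labels and are assumptions:

```python
def split_segment(segments: dict, elbows: dict, segment_id: str,
                  new_elbow_id: str, xyz: tuple) -> None:
    """Divide an existing segment in two at a new elbow recorded on a trail.

    segments maps segment_id -> {"terrain": {...}, "elbows": (e_a, e_b)};
    elbows maps elbow_id -> {"xyz": (x, y, z), "records": {segment_id: directives}}.
    """
    old = segments.pop(segment_id)
    e_a, e_b = old["elbows"]

    # New elbow at the point where the recorded trail intersects the segment.
    elbows[new_elbow_id] = {"xyz": xyz, "records": {}}

    # Both halves inherit the old terrain characteristics until sensor data says otherwise.
    first_id, second_id = segment_id + "a", segment_id + "b"
    segments[first_id] = {"terrain": dict(old["terrain"]), "elbows": (e_a, new_elbow_id)}
    segments[second_id] = {"terrain": dict(old["terrain"]), "elbows": (new_elbow_id, e_b)}

    # Re-key the directive records held by the original bounding elbows.
    for elbow_id, new_id in ((e_a, first_id), (e_b, second_id)):
        records = elbows[elbow_id]["records"]
        if segment_id in records:
            records[new_id] = records.pop(segment_id)

# A new trail crossing segment "S220" creates elbow "E540" and replaces "S220" with
# two new segments (named "S220a" and "S220b" here; S520 and S530 in Fig. 5).
segments = {"S220": {"terrain": {"surface": "gravel"}, "elbows": ("E200", "E210")}}
elbows = {"E200": {"xyz": (0, 0, 0), "records": {"S220": ["descending slope"]}},
          "E210": {"xyz": (5, 0, 1), "records": {"S220": ["mounting slope"]}}}
split_segment(segments, elbows, "S220", "E540", (2.5, 0.2, 0.5))
```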
  • Sensors 430 - Navigation directives for the various road segments may be continuously derived from various sensors carried or worn by the traveler, or mounted on the vehicle, a phone or an additional device. The sensors may comprise, for example:
  • Motion sensors - that measure acceleration forces and rotational forces along three axes. This category includes accelerometers, gravity sensors, gyroscopes, and rotational vector sensors.
  • Environmental sensors - that measure various environmental parameters, such as ambient air temperature and pressure, illumination, and humidity. This category includes barometers, photometers, and thermometers.
  • Position sensors - that measure the physical position of a device. This category includes orientation sensors and magnetometers, Global Positioning System (GPS), European Geostationary Navigation Overlay Service (EGNOS), Global Navigation Satellite System (GLONASS), and others.
  • Wearable physiological monitoring sensors - that measure various physiological parameters of the wearer (traveler) such as, for example, electrocardiogram (ECG), heart rate, blood pressure, body temperature and others.
  • the data aggregated from the various sensors is analyzed to determine terrain characteristics and levels of difficulty of trails. This computed data is then translated into directives stored in the elbows database in conjunction with the relevant segments.
  • the sensor data may also serve to update segment and elbow definitions by identifying different characteristics in various parts of a segment, which may lead to automatically partitioning the segment into two or more segments according to the different terrain characteristics which require different directives.
  • Directives given to a traveler about to enter a route segment may indicate, for example, mounting slope, descending slope, sharp turn, rocky terrain, bridge over river, gravel, mud, hiking section, etc.
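  • One possible way such indications could be translated into vehicle-type-specific directives and stored alongside the relevant vehicle type, as the preceding paragraphs suggest; the mapping table and vehicle names below are invented for illustration:

```python
# Hypothetical translation table: terrain indication -> directive per vehicle type.
INDICATION_RULES = {
    "rocky terrain": {"mountain_bike": "reduce speed", "e_scooter": "dismount and walk"},
    "bridge over river": {"mountain_bike": "cross slowly", "e_scooter": "cross slowly"},
    "mud": {"mountain_bike": "lower tire pressure", "e_scooter": "avoid segment"},
}

def directives_for(indications: list, vehicle_type: str) -> list:
    """Translate the terrain indications of a segment into directives for one vehicle type."""
    return [INDICATION_RULES[ind][vehicle_type]
            for ind in indications
            if ind in INDICATION_RULES and vehicle_type in INDICATION_RULES[ind]]

# Example: a traveler on an e-scooter about to enter a muddy, rocky segment.
print(directives_for(["rocky terrain", "mud"], "e_scooter"))
# -> ['dismount and walk', 'avoid segment']
```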
  • Various embodiments of the present invention provide a system and a method for navigating through a terrain using a human-scale vehicle.
  • the terrain may include, for example, an urban area and/or an area between two or more urban areas.
  • the human-scale vehicle may be any vehicle having dimensions that enable the vehicle to drive in bicycle lanes.
  • the human-scale vehicle may be a bicycle, e-bike, scooter, electric scooter, skateboard, electric skateboard, shared bicycle, electric pedal-assisted bicycle, etc.
  • FIG. 7 is a block diagram of a system 700 for navigating through a terrain using a human-scale vehicle, according to some embodiments of the invention.
  • system 700 includes a computing device 710, a database 720 and a remote computing device 730.
  • Computing device 710 may be, for example, a portable electronic device such as a smartphone or a tablet of a user.
  • Computing device 710 may be, for example, an on-board computing device of the human-scale vehicle.
  • Computing device 710 may receive, from the user of the human-scale vehicle, an origin point and one or more destination points in the terrain.
  • Computing device 710 may select route segments of a plurality of predefined route segments to navigate the user from the origin point to the one or more destination points in the terrain using the human-scale vehicle.
  • the plurality of predefined route segments may be stored in, for example, database 720.
  • Each of the route segments may, for example, have its unique terrain characteristics that are different from terrain characteristics of other route segments.
  • the terrain characteristics may, for example, include a terrain type (e.g., roadway asphalt, sidewalk asphalt, pavement, etc.), a terrain condition (e.g., wet, dry, ice-crusted, bumpy, etc.), etc.
  • Route segments may include any section of the terrain that is suitable for driving in using the human-scale vehicle.
  • route segments may include roadway sections, pavement sections, bicycle lane sections, crosswalks, underground crossings, overhead passages, passageways, etc.
  • two or more route segments may be associated with a single pathway section in the terrain.
  • a single pathway section in the terrain may have a roadway section, a pavement section and a bicycle lane section, e.g., parallel, or substantially parallel, to each other, wherein the roadway section, the pavement section and the bicycle lane section of the same pathway section in the terrain may be associated with different route segments.
  • computing device 710 may prioritize the selection of those route segments that are more suitable for driving using human-scale vehicles. For example, if a particular pathway section in the terrain has a pavement section and a bicycle lane section, computing device 710 may select a route segment associated with the bicycle lane section for that particular pathway section.
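  • A sketch of such prioritization, expressed as a per-segment cost in a shortest-path search that favors bicycle-lane segments over pavement and roadway sections; the weights and graph layout are assumptions, not values from the disclosure:

```python
import heapq

# Hypothetical preference weights: lower weight = more suitable for human-scale vehicles.
SEGMENT_TYPE_WEIGHT = {"bicycle_lane": 1.0, "pavement": 1.5, "roadway": 2.5}

def segment_cost(length_m: float, segment_type: str) -> float:
    return length_m * SEGMENT_TYPE_WEIGHT.get(segment_type, 3.0)

def select_route(graph: dict, origin: str, destination: str) -> list:
    """Dijkstra over route segments.

    graph[node] is a list of (neighbor, length_m, segment_type) tuples.
    Returns the list of nodes from origin to destination, or [] if unreachable.
    """
    queue = [(0.0, origin, [origin])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == destination:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, length_m, seg_type in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + segment_cost(length_m, seg_type),
                                       neighbor, path + [neighbor]))
    return []

# A pathway with parallel roadway and bicycle-lane segments: the bicycle lane is preferred.
graph = {"A": [("B_road", 100, "roadway"), ("B_lane", 100, "bicycle_lane")],
         "B_road": [("C", 50, "roadway")], "B_lane": [("C", 50, "bicycle_lane")],
         "C": []}
print(select_route(graph, "A", "C"))   # -> ['A', 'B_lane', 'C']
```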
  • computing device 710 may generate navigation instructions (e.g., navigation directives) based on the selected route segments.
  • the navigation instructions may include information concerning, for example, a maximal allowable speed along the selected route segments, turning directions at cross-roads and/or between the selected route segments, etc.
  • computing device 710 may include a user interface or computing device 710 may be in a communication with a user interface.
  • Computing device 710 may generate notifications indicative of the navigation instructions using the user interface.
  • the user interface may be, for example, a display of computing device 710 (e.g., when computing device 710 is a portable electronic device such as a smartphone or a tablet of the user).
  • the user interface may be, for example, a display of the human-scale vehicle.
  • the user interface may include, for example, haptic guidance means (e.g., attachable to a handlebar of the human-scale vehicle).
  • the user interface may include, for example, voice guidance means.
  • Computing device 710 may obtain motion data from one or more motion sensors during traveling of the human-scale vehicle along the selected route segments.
  • the one or more motion sensors may include, for example, one or more accelerometers, one or more gyroscopes, etc.
  • the one or more motion sensors may be motion sensors of computing device 710.
  • the one or more motion sensors may be wearable by the user of the human-scale vehicle.
  • the one or more motion sensors may be disposed on the human-scale vehicle.
  • the one or more motion sensors may be motion sensors of the human-scale vehicle.
  • computing device 710 may be a portable electronic device, such as a smartphone or a tablet, having one or more motion sensors. Such computing device 710 may be attachable to the human-scale vehicle or may be wearable by the user during the driving along the selected route segments.
  • computing device 710 may determine, based on at least a portion of the obtained motion data, terrain characteristics of the selected route segments.
  • the determined terrain characteristics may include the terrain type (e.g., roadway asphalt, bicycle lane asphalt, pavement, etc.), the terrain condition (e.g., wet, dry, ice-crusted, etc.), etc. of the selected route segments.
  • computing device 710 may transmit the obtained motion data to remote computing device 730, and remote computing device 730 may determine the terrain characteristics of the selected route segments based on at least a portion of the obtained motion data.
  • the terrain characteristics of the selected route segments may be determined based on at least a portion of the obtained motion data and a reference motion data. For example, at least a portion of the obtained motion data may be compared to at least a portion of the reference motion data, and the terrain characteristics may be determined based on the comparison thereof.
  • the reference motion data may be, for example, stored in database 720.
  • the terrain characteristics of the selected route segments may be determined based on at least a portion of the obtained motion data using one or more artificial intelligence (AI) methods.
  • one or more AI methods may receive as an input at least a portion of the obtained motion data and output the terrain characteristics of the selected route segments.
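  • A hedged sketch of one such AI method: a classifier trained on windowed accelerometer features with labeled terrain types. The features, window size and choice of a scikit-learn random forest are assumptions; the disclosure does not name a specific algorithm:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(accel_z: np.ndarray, window: int = 128) -> np.ndarray:
    """Summarize each window of z-axis acceleration by mean, standard deviation and peak-to-peak."""
    n = len(accel_z) // window
    chunks = accel_z[:n * window].reshape(n, window)
    return np.column_stack([chunks.mean(axis=1),
                            chunks.std(axis=1),
                            np.ptp(chunks, axis=1)])

def train_terrain_classifier(accel_z: np.ndarray, window_labels: np.ndarray) -> RandomForestClassifier:
    """window_labels holds one terrain label per window, e.g. 'roadway_asphalt', 'pavement', 'bumpy'."""
    model = RandomForestClassifier(n_estimators=50, random_state=0)
    model.fit(window_features(accel_z), window_labels)
    return model

def classify_segment(model: RandomForestClassifier, accel_z: np.ndarray) -> str:
    """Predict a terrain label per window and return the majority vote for the whole segment."""
    predictions = model.predict(window_features(accel_z))
    values, counts = np.unique(predictions, return_counts=True)
    return str(values[counts.argmax()])
```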
  • computing device 710 may update the selection of route segments based on the determined terrain characteristics. For example, if the terrain characteristics determined for a particular selected route segment indicate that the particular selected route segment has poor terrain conditions (e.g., wet or bumpy terrain, etc.), computing device 710 may select a different route segment of the plurality of predefined route segments having better terrain conditions than the particular selected route segment and redirect the user to the different route segment.
  • At least one of computing device 710 and remote computing device 730 may update one or more of the predefined route segments based on the terrain characteristics determined for the respective one or more selected route segments. For example, a particular selected route segment may have a first selected route segment section having first determined terrain characteristics, and a second selected route segment section having second determined terrain characteristics. At least one of computing device 710 and remote computing device 730 may, for example, split that particular selected route segment into two new route segments, wherein a first new route segment includes the first route segment section having first determined terrain characteristics, and a second new route segment includes the second route segment section having second determined terrain characteristics.
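  • A sketch of how such a split point could be located from the ride data itself; the change-point rule (a marked jump in windowed z-acceleration variance) and its threshold are assumptions:

```python
import numpy as np

def find_characteristic_change(accel_z: np.ndarray, window: int = 128,
                               ratio_threshold: float = 3.0):
    """Return the first window index where ride roughness changes markedly, or None.

    Roughness is approximated by the variance of z-axis acceleration per window;
    a large ratio between the mean variance before and after a candidate index
    suggests two sections with different terrain characteristics.
    """
    n = len(accel_z) // window
    variances = accel_z[:n * window].reshape(n, window).var(axis=1)
    for i in range(1, n):
        before, after = variances[:i].mean(), variances[i:].mean()
        if before > 0 and after > 0 and max(after / before, before / after) > ratio_threshold:
            return i
    return None

# If a change point is found, the selected route segment can be split at the geolocation
# sample matching that window, e.g. with a split_segment-style update as sketched earlier.
```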
  • the update of one or more of the predefined route segments may be performed in real-time, or substantially in real-time, based on at least a portion of the motion data being obtained by the one or more motion sensors during the actual ride along the respective one or more selected route segments.
  • computing device 710 may generate and/or update the navigation instructions based on the determined terrain characteristics of the selected route segments. For example, if the computing device 710 determines that a particular selected route segment has poor terrain conditions (e.g., bumpy or wet terrain, etc.), computing device 710 may generate and/or update the navigation instructions so as to instruct the user of the human-scale vehicle, via the user interface, to slow down when driving along that particular selected route segment.
  • At least one of computing device 710 and remote computing device 730 may determine, based on at least a portion of the obtained motion data, that the human-scale vehicle has been driven on a sidewalk (e.g., as shown in Fig. 11). In some embodiments, at least one of computing device 710 and remote computing device 730 may identify locations in the terrain in which the human-scale vehicle has been driven on the sidewalk. In some embodiments, at least one of computing device 710 and remote computing device 730 may issue a notification that the human-scale vehicle has been driven on the sidewalk. For example, the notification may be issued to the user (e.g., using the user interface, as described hereinabove) or to an authorized third party (e.g., a municipal authority, etc.).
  • Computing device 710 and/or remote computing device 730 may determine that the human-scale vehicle has been driven on the sidewalk by, for example, comparing the obtained motion data to reference motion data and/or using one or more AI methods (e.g., as described above with respect to determination of terrain characteristics).
  • At least one of computing device 710 and remote computing device 730 may determine that the human-scale vehicle has been driven on a sidewalk further based on a geolocation data from one or more geolocation sensors (e.g., GPS sensors, etc.).
  • the geolocation sensor(s) may be disposed on the user and/or the human-scale vehicle (e.g., as described above with respect to motion sensor(s)). For example, geolocation of the sidewalks within the terrain may be known and may be taken into account when determining that the human-scale vehicle has been driven on sidewalks.
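  • A sketch combining the two signals mentioned above: a motion-based sidewalk score (e.g., from a classifier such as the one sketched earlier) and proximity of the reported geolocation to known sidewalk coordinates. The proximity test and thresholds are simplifications and assumptions:

```python
def near_known_sidewalk(lat: float, lon: float, sidewalk_points: list,
                        max_deg: float = 0.0001) -> bool:
    """Crude proximity test against known sidewalk geolocations (degrees of latitude/longitude)."""
    return any(abs(lat - p_lat) <= max_deg and abs(lon - p_lon) <= max_deg
               for p_lat, p_lon in sidewalk_points)

def driven_on_sidewalk(motion_score: float, lat: float, lon: float,
                       sidewalk_points: list, score_threshold: float = 0.8) -> bool:
    """Flag sidewalk driving only when the motion-based score is confident
    and the reported geolocation falls near mapped sidewalks."""
    return motion_score >= score_threshold and near_known_sidewalk(lat, lon, sidewalk_points)

# Example: a confident motion score next to a mapped sidewalk point.
print(driven_on_sidewalk(0.93, 32.0853, 34.7818, [(32.0853, 34.7819)]))   # -> True
```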
  • perimeters of parking zones for human-scale vehicles within the terrain may be bordered using a predefined three-dimensional (3D) parking pattern.
  • computing device 710 may determine, based on at least a portion of the obtained motion data, that the human-scale vehicle has crossed the predefined 3D parking pattern of a parking zone.
  • Computing device 710 may determine that the human-scale vehicle has crossed the predefined 3D parking pattern of a parking zone by, for example, comparing the obtained motion data to reference motion data and/or using one or more AI methods (e.g., as described above with respect to determination of terrain characteristics).
  • computing device 710 may determine, based on at least a portion of the obtained motion data, that the human-scale vehicle has crossed the predefined 3D parking pattern of a parking zone, and may further determine, based on geolocation data from one or more geolocation sensors, that the human-scale vehicle is within the parking zone. In some embodiments, computing device 710 may prevent the user from locking the human-scale vehicle if the human-scale vehicle is not within a parking zone.
  • computing device 710 may determine, based on at least a portion of the obtained motion data, that the human-scale vehicle has crossed the predefined 3D parking pattern of a parking zone, and may further determine, based on geolocation data from one or more geolocation sensors, that the human-scale vehicle has left the parking zone. In some embodiments, computing device 710 may prevent the user from unlocking the human-scale vehicle if the human-scale vehicle is not within a parking zone.
  • Bordering perimeters of parking zones within the terrain using predefined 3D parking patterns and determining that human-scale vehicles are within parking zones based on the obtained motion data and geolocation data may, for example, eliminate the need to install dedicated hardware within the parking zones.
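  • A sketch of such lock/unlock gating. The pattern-crossing flag is assumed to come from the motion-data comparison described above, and the parking-zone perimeter is treated as a simple polygon; the function names and the ray-casting test are illustrative:

```python
def point_in_polygon(x: float, y: float, polygon: list) -> bool:
    """Ray-casting point-in-polygon test; polygon is a list of (x, y) vertices."""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

def may_lock(crossed_parking_pattern: bool, position: tuple, parking_zone: list) -> bool:
    """Allow locking only after the 3D parking pattern was crossed and the
    vehicle's geolocation lies inside the parking-zone perimeter."""
    return crossed_parking_pattern and point_in_polygon(position[0], position[1], parking_zone)

def may_unlock(position: tuple, parking_zone: list) -> bool:
    """Refuse unlocking when the vehicle is reported outside the parking zone."""
    return point_in_polygon(position[0], position[1], parking_zone)

# Example with a square parking zone.
zone = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0)]
print(may_lock(True, (5.0, 5.0), zone))    # -> True
print(may_unlock((15.0, 5.0), zone))       # -> False
```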
  • At least one of computing device 710 and remote computing device 730 may detect one or more driving-related events based on at least a portion of the obtained motion data.
  • the one or more driving-related events may be detected based on at least a portion of the obtained motion data, and based on the reference motion data and/or using one or more AI methods (e.g., as described hereinabove).
  • the driving-related events may, for example, include an acceleration of the human-scale vehicle (e.g., as shown in Fig. 8), a deceleration of the human-scale vehicle (e.g., as shown in Fig. 8), an emergency brake of the human-scale vehicle (e.g., as shown in Fig. 9), a fall of the human-scale vehicle (e.g., as shown in Fig. 10), a fall type of the human-scale vehicle (e.g., slipping, falling from a height, falling into a pit, etc.), etc.
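  • A simplified, threshold-based sketch of detecting some of these events from raw acceleration samples; the axis convention (Y as the driving direction) and all thresholds are assumptions rather than values taken from the figures:

```python
import numpy as np

def detect_events(accel_xyz: np.ndarray) -> list:
    """accel_xyz is an (N, 3) array of x/y/z acceleration in m/s^2.

    Returns a list of (sample_index, event_name) tuples for coarse driving-related events.
    """
    events = []
    forward = accel_xyz[:, 1]                       # assumed driving direction
    magnitude = np.linalg.norm(accel_xyz, axis=1)   # overall acceleration magnitude
    for i in range(len(accel_xyz)):
        if forward[i] < -6.0:
            events.append((i, "emergency_brake"))   # strong deceleration spike
        elif magnitude[i] < 2.0:
            events.append((i, "possible_fall"))     # near free-fall reading
        elif forward[i] > 3.0:
            events.append((i, "acceleration"))
        elif forward[i] < -1.5:
            events.append((i, "deceleration"))
    return events
```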
  • computing device 710 may update the selection of the route segments based on the one or more detected driving-related events. For example, frequent deceleration events of the human-scale vehicle may be indicative of poor terrain characteristics of a particular route segment. In this example, computing device 710 may select a different route segment of the plurality of predefined route segments and redirect the user to the different route segment.
  • At least one of computing device 710 and remote computing device 730 may update one or more of the predefined route segments based on the one or more detected driving-related events. For example, if the user has fallen in a particular selected route segment, the predefined route segment that corresponds to that particular selected route segment may be marked as dangerous and, for example, assigned a low priority so that this particular route has a low chance of subsequently being selected by a computing device of another user.
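  • A sketch of such a priority update; the record fields and the penalty factor are assumptions:

```python
def mark_segment_after_fall(segment_record: dict, penalty: float = 0.2) -> dict:
    """Flag a predefined route segment as dangerous and lower its selection priority,
    so that route selection for other users is less likely to pick it."""
    updated = dict(segment_record)
    updated["dangerous"] = True
    updated["priority"] = updated.get("priority", 1.0) * penalty
    return updated

# Example: a fall detected on segment "S7" demotes it for subsequent route selection.
print(mark_segment_after_fall({"segment_id": "S7", "priority": 1.0}))
```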
  • the update of the one or more of the predefined route segments is performed in real-time, or substantially in real-time, based on the one or more driving-related events detected based on at least a portion of the motion data being obtained by the one or more motion sensors during the actual ride.
  • the selection of the route segments and/or the update of the selection thereof may be based on known driving skills of the user of the human-scale vehicle. For example, if a particular user is skilled in driving in a terrain having known terrain characteristics (e.g., ice-crusted or wet terrain, etc.), computing device 710 may select route segments having those known terrain characteristics to navigate the user therethrough, and select other route segments for less skilled users.
  • remote computing device 730 may update the predefined route segments based on at least a portion of the obtained motion data and based on known driving skills of the user. For example, motion data obtained by users having high driving skills may be considered as robust data and may be used to update the predefined route segments, while motion data obtained by less skilled users may be ignored or may be marked as requiring confirmation prior to updating the predefined route segments based thereon.
  • computing device 710 may generate and/or update the navigation instructions based on known driving skills of the user of the human-scale vehicle. For example, if a particular user is not skilled in driving in a terrain having known terrain characteristics (e.g., ice-crusted or wet terrain, etc.), computing device 710 may instruct that user to slow down when the user drives along a route segment having those terrain characteristics.
  • At least one of the selection of the route segments, the update of the selection of route segments, the generation of the navigation instructions and the update of the navigation instructions may be based on data from one or more geolocation sensors disposed on at least one of the human-scale vehicle and the user thereof.
  • At least one of the selection of the route segments, the update of the selection of route segments, the generation of the navigation instructions and the update of the navigation instructions may be based on data from one or more physiological monitoring sensors disposed on the user of the human-scale vehicle. For example, an elevated heart rate of the user may be indicative of a dangerous situation, e.g., due to an infrastructure failure, etc.
  • At least one of the selection of the route segments, the update of the selection of route segments, the generation of the navigation instructions and the update of the navigation instructions may be based on data from one or more cameras positioned on at least one of the human-scale vehicle and the user thereof.
  • at least one of computing device 710 and remote computing device 730 may detect and/or classify obstacles in the images from the one or more cameras (e.g., using AI methods, etc.).
  • At least one of the selection of the route segments, the update of the selection of route segments, the generation of the navigation instructions and the update of the navigation instructions may be based on data from one or more microphones positioned on at least one of the human-scale vehicle and the user thereof.
  • at least one of computing device 710 and remote computing device 730 may detect and/or classify obstacles based on acoustic signals from the one or more microphones (e.g., using AI methods, etc.).
  • At least one of the selection of the route segments, the update of the selection of route segments, the generation of the navigation instructions and the update of the navigation instructions may be based on data from one or more environmental sensors positioned on at least one of the human-scale vehicle and the user thereof.
  • At least one of the selection of the route segments, the update of the selection of route segments, the generation of the navigation instructions and the update of the navigation instructions may be based on a type of the human-scale vehicle.
  • different human-scale vehicles may have different driving capabilities and/or different capabilities of overcoming different obstacles (e.g., depending on the vehicle's dimensions, number of wheels, wheel type, etc.).
  • remote computing device 730 may receive a plurality of ride datasets for multiple users of human-scale vehicles.
  • Each of the ride datasets may include route segments that have been actually used by the users accompanied with at least one of obtained motion data, obtained geolocation data, obtained physiological data, obtained camera data, obtained microphone data, obtained environmental data and human-scale vehicle data.
  • Remote computing device 730 may define route segments and/or update the predefined route segments based on the plurality of the received ride datasets.
  • remote computing device 730 may receive a plurality of ride datasets for the user of the human-scale vehicle, wherein each of the ride datasets comprises selected route segments and motion data obtained during traveling of the human-scale vehicle along the selected route segments, and may determine driving patterns for the user based on at least a portion of the ride datasets.
  • the driving patterns determined for the user may be indicative of, for example, driving habits of the user, number of accidents the user has been involved in, the severity of those accidents, etc.
  • the determined driving patterns may be used by, for example, one or more third parties.
  • the driving patterns determined for a user may be used by an insurance company to determine insurance quotes for the user.
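  • A sketch of aggregating a user's ride datasets into a simple driving-pattern summary; the dataset fields (event lists, average speed) and the summary metrics are illustrative assumptions:

```python
from statistics import mean

def driving_patterns(ride_datasets: list) -> dict:
    """Each ride dataset is assumed to be a dict with an 'events' list
    (e.g. 'possible_fall', 'emergency_brake') and an 'avg_speed_kmh' value."""
    rides = len(ride_datasets)
    falls = sum(ride["events"].count("possible_fall") for ride in ride_datasets)
    brakes = sum(ride["events"].count("emergency_brake") for ride in ride_datasets)
    return {
        "rides": rides,
        "falls": falls,
        "emergency_brakes_per_ride": brakes / rides if rides else 0.0,
        "mean_speed_kmh": mean(r["avg_speed_kmh"] for r in ride_datasets) if rides else 0.0,
    }

# Example with two recorded rides.
print(driving_patterns([
    {"events": ["acceleration", "emergency_brake"], "avg_speed_kmh": 17.0},
    {"events": ["deceleration"], "avg_speed_kmh": 14.5},
]))
```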
  • Computing device 710 and remote computing device 730 may each include a non-transitory computer readable medium storing one or more subsets of instructions that, when executed, cause a processor of the respective computing device to perform functions as described herein. As would be apparent to one skilled in the art, at least some of the functions described above as being performed by computing device 710 may be performed by remote computing device 730, and at least some of the functions described above as being performed by remote computing device 730 may be performed by computing device 710.
  • FIG. 8 shows graphs 800a, 800b of motion data generatable by one or more motion sensors during motion of the human-scale vehicle, wherein the shown motion data is indicative of terrain characteristics and driving-related events, according to some embodiments of the invention.
  • Graphs 800a, 800b show variation with time of acceleration of the human-scale vehicle in Z-axis and Y-axis, respectively, as measured by one or more motion sensors disposed on the human-scale vehicle or on the user thereof.
  • Section 800aa in graph 800a may be indicative of a bumpy terrain, for example due to a “noisy” z-acceleration signal in that section.
  • Section 800ba in graph 800b may be indicative of an acceleration of the human-scale vehicle, for example due to a positive y-acceleration signal in that section.
  • Section 800bb in graph 800b may be indicative of a constant velocity drive of the human-scale vehicle, for example due to a zero y-acceleration signal in that section.
  • Section 800bc in graph 800b may be indicative of a deceleration of the human-scale vehicle, for example due to negative y-acceleration signal in that section.
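  • A sketch that labels windows of the y- and z-axis traces in the spirit of the graph sections described above; the window size and thresholds are assumptions:

```python
import numpy as np

def label_windows(accel_y: np.ndarray, accel_z: np.ndarray, window: int = 64) -> list:
    """Return one coarse label per window: bumpy terrain, acceleration,
    deceleration or constant-velocity driving."""
    labels = []
    n = min(len(accel_y), len(accel_z)) // window
    for i in range(n):
        y = accel_y[i * window:(i + 1) * window]
        z = accel_z[i * window:(i + 1) * window]
        if z.std() > 3.0:              # "noisy" z-axis signal, as in section 800aa
            labels.append("bumpy")
        elif y.mean() > 0.5:           # positive y-acceleration, as in section 800ba
            labels.append("accelerating")
        elif y.mean() < -0.5:          # negative y-acceleration, as in section 800bc
            labels.append("decelerating")
        else:                          # near-zero y-acceleration, as in section 800bb
            labels.append("constant_velocity")
    return labels
```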
  • FIG. 9 shows graphs 900a, 900b, 900c of motion data generatable by one or more motion sensors during motion of the human-scale vehicle, wherein the shown motion data is indicative of an emergency brake of the human-scale vehicle, according to some embodiments of the invention.
  • Graphs 900a, 900b, 900c show variation with time of acceleration of the human-scale vehicle in X-axis, Y-axis and Z-axis, respectively, as measured by one or more motion sensors disposed on the human-scale vehicle or on the user thereof. Motion data shown in graphs 900a, 900b, 900c is indicative of an emergency brake 900d of the human-scale vehicle.
  • Fig. 10 shows graphs 1000a, 1000b, 1000c of motion data generatable by one or more motion sensors during motion of the human-scale vehicle, wherein the shown motion data is indicative of a fall of the human-scale vehicle, according to some embodiments of the invention.
  • Graphs 1000a, 1000b, 1000c show variation with time of acceleration (lines 1000d) and velocity (lines 1000e) of the human-scale vehicle in X-axis, Y-axis and Z-axis, respectively, as measured by one or more motion sensors disposed on the human-scale vehicle or on the user thereof. Motion data shown in graphs 1000a, 1000b, 1000c is indicative of a fall 1000f of the human-scale vehicle.
  • FIG. 11 shows graphs 1100a, 1100b, 1100c of motion data generatable by one or more motion sensors during motion of the human-scale vehicle, wherein the shown motion data is indicative of driving on a sidewalk, according to some embodiments of the invention.
  • Graphs 1100a, 1100b, 1100c show variation with time of acceleration of the human-scale vehicle in X-axis, Y-axis and Z-axis, respectively, as measured by one or more motion sensors disposed on the human-scale vehicle or on the user thereof. Motion data shown in graphs 1100a, 1100b, 1100c is indicative of driving on a sidewalk (sections 1100d).
  • FIG. 12 is a flowchart of a method of navigating through a terrain using a human-scale vehicle, according to some embodiments of the invention.
  • Some embodiments may include receiving 1202 an origin point and one or more destination points in the terrain (e.g., by computing device 710 as described above with respect to Fig. 7).
  • Some embodiments may include selecting 1204 route segments of a plurality of predefined route segments to navigate the user from the origin point to the one or more destination points in the terrain using the human-scale vehicle (e.g., by computing device 710 as described above with respect to Fig. 7).
  • Each of the route segments may, for example, have its unique terrain characteristics that are different from terrain characteristics of other route segments.
  • the terrain characteristics may, for example, include a terrain type (e.g., roadway asphalt, sidewalk asphalt, pavement, etc.), a terrain condition (e.g., wet, dry, ice-crusted, bumpy, etc.), etc.
  • Route segments may include any section of the terrain that is suitable for driving in using the human-scale vehicle.
  • route segments may include roadway sections, pavement sections, bicycle lane sections, crosswalks, underground crossings, overhead passages, passageways, etc.
  • two or more route segments may be associated with a single pathway section in the terrain.
  • a single pathway section in the terrain may have a roadway section, a pavement section and a bicycle lane section, e.g., parallel, or substantially parallel, to each other, wherein the roadway section, the pavement section and the bicycle lane section of the same pathway section in the terrain may be associated with different route segments.
  • those route segments that are more suitable for driving using human-scale vehicles may be prioritized. For example, if a particular pathway section in the terrain has a pavement section and a bicycle lane section, a route segment associated with the bicycle lane section for that particular pathway section may be selected.
  • Some embodiments may include generating navigation instructions (e.g., navigation directives) based on the selected route segments (e.g., as described above with respect to Fig. 7).
  • the navigation instructions may include information concerning, for example, a maximal allowable speed along the selected route segments, turning directions at cross-roads and/or between the selected route segments, etc.
  • Some embodiments may include generating notifications indicative of the navigation instructions (e.g., as described above with respect to Fig. 7).
  • Some embodiments may include obtaining 1206 motion data from one or more motion sensors during traveling of the human-scale vehicle along the selected route segments (e.g., as described above with respect to Fig. 7).
  • the one or more motion sensors may include, for example, one or more accelerometers, one or more gyroscopes, etc.
  • the one or more motion sensors may be motion sensors of the computing device.
  • the one or more motion sensors may be wearable by the user of the human-scale vehicle.
  • the one or more motion sensors may be disposed on the human-scale vehicle.
  • the one or more motion sensors may be motion sensors of the human-scale vehicle.
  • Some embodiments may include determining 1208, based on at least a portion of the obtained motion data, terrain characteristics of the selected route segments (e.g., by computing device 710 as described above with respect to Fig. 7).
  • the determined terrain characteristics may include the terrain type (e.g., roadway asphalt, bicycle lane asphalt, pavement, etc.), the terrain condition (e.g., wet, dry, ice-crusted, etc.), etc. of the selected route segments.
  • the terrain characteristics of the selected route segments may be determined based on at least a portion of the obtained motion data and a reference motion data. For example, at least a portion of the obtained motion data may be compared to at least a portion of the reference motion data and the terrain characteristics may be determined based on the comparison thereof.
  • the terrain characteristics of the selected route segments may be determined based on at least a portion of the obtained motion data using one or more artificial intelligence (AI) methods.
  • one or more AI methods may receive as an input at least a portion of the obtained motion data and output the terrain characteristics of the selected route segments.
  • Some embodiments may include updating the selection of route segments based on the determined terrain characteristics (e.g., as described above with respect to Fig. 7). For example, if the terrain characteristics determined for a particular selected route segment indicate that the particular selected route segment has poor terrain conditions (e.g., wet or bumpy terrain, etc.), a different route segment of the plurality of predefined route segments having better terrain conditions than the particular selected route segment may be selected and the user may be redirected to the different route segment.
  • Some embodiments may include updating one or more of the predefined route segments based on the terrain characteristics determined for the respective one or more selected route segments (e.g., as described above with respect to Fig. 7). For example, a particular selected route segment may have a first selected route segment section having first determined terrain characteristics and a second selected route segment section having second determined terrain characteristics. That particular selected route segment may be split into two new route segments, wherein a first new route segment includes the first route segment section having first determined terrain characteristics and a second new route segment includes the second route segment section having second determined terrain characteristics. Some embodiments may include updating the one or more of the predefined route segments in real-time, or substantially in real-time, based on at least a portion of the motion data being obtained by the one or more motion sensors during the actual ride along the respective one or more selected route segments.
  • Various embodiments may include generating and/or updating the navigation instructions based on the determined terrain characteristics of the selected route segments (e.g., as described above with respect to Fig. 7). For example, if a particular selected route segment has poor terrain conditions (e.g., bumpy or wet terrain, etc.), the navigation instructions may be generated and/or updated so as to instruct the user of the human-scale vehicle to slow down when driving along that particular selected route segment.
  • Some embodiments may include determining, based on at least a portion of the obtained motion data, that the human-scale vehicle has been driven on a sidewalk (e.g., by computing device 710 and/or remote computing device 730 as described above with respect to Figs. 7 and 11).
  • Some embodiments may include determining that the human-scale vehicle has been driven on the sidewalk further based on geolocation data from one or more geolocation sensors.
  • the one or more geolocation sensors may be disposed on at least one of the user and the human-scale vehicle (e.g., as described above with respect to Fig. 7).
  • Some embodiments may include identifying locations in the terrain in which the human-scale vehicle has been driven on the sidewalk (e.g., as described above with respect to Fig. 7). Some embodiments may include issuing a notification that the human-scale vehicle has been driven on the sidewalk (e.g., as described above with respect to Fig. 7). For example, the notification may be issued to the user (e.g., using the user interface, as described hereinabove) or to an authorized third party (e.g., a municipal authority, etc.).
  • perimeters of parking zones for human-scale vehicles within the terrain may be bordered using a predefined 3D parking pattern.
  • Some embodiments may include determining, based on at least a portion of the obtained motion data, that the human-scale vehicle has crossed the predefined 3D parking pattern of a parking zone (e.g., by comparing the obtained motion data to the reference motion data and/or using one or more AI methods as described above with respect to Fig. 7).
  • Some embodiments may include determining, based on at least a portion of the obtained motion data, that the human-scale vehicle has crossed the predefined 3D parking pattern of a parking zone, and further determining, based on geolocation data from one or more geolocation sensors, that the human-scale vehicle is within the parking zone. Some embodiments may include preventing the user from locking the human-scale vehicle if the human-scale vehicle is not within the parking zone. For example, as described above with respect to Fig. 7.
  • Some embodiments may include determining, based on at least a portion of the obtained motion data, that the human-scale vehicle has crossed the predefined 3D parking pattern of a parking zone, and further determining, based on geolocation data from one or more geolocation sensors, that the human-scale vehicle has left the parking zone. Some embodiments may include preventing the user from unlocking the human-scale vehicle if the human-scale vehicle is not within the parking zone. For example, as described above with respect to Fig. 7.
  • Some embodiments may include detecting one or more driving-related events based on at least a portion of the obtained motion data (e.g., as described above with respect to Fig. 7).
  • the one or more driving-related events may be detected based on at least a portion of the obtained motion data, and based on the reference motion data and/or using one or more AI methods (e.g., as described hereinabove).
  • the driving-related events may, for example, include an acceleration of the human-scale vehicle (e.g., as described above with respect to Figs. 7 and 8), a deceleration of the human-scale vehicle (e.g., as described above with respect to Figs. 7 and 8), an emergency brake of the human-scale vehicle (e.g., as described above with respect to Figs. 7 and 9), a fall of the human-scale vehicle (e.g., as described above with respect to Figs. 7 and 10), a fall type of the human-scale vehicle (e.g., slipping, falling from a height, falling into a pit, etc.), etc.
  • Some embodiments may include updating the selection of the route segments based on the one or more detected driving-related events (e.g., as described above with respect to Fig. 7). For example, frequent deceleration events of the human-scale vehicle may be indicative of poor terrain characteristics of a particular route segment. In this example, a different route segment of the plurality of predefined route segments may be selected and the user may be redirected to the different route segment.
  • Some embodiments may include updating one or more of the predefined route segments based on the one or more detected driving-related events (e.g., as described above with respect to Fig. 7). For example, if the user has fallen in a particular selected route segment, the predefined route segment that corresponds to that particular selected route segment may be marked as dangerous and, for example, assigned a low priority so that this particular route has a low chance of being selected by a computing device of another user. Some embodiments may include updating the one or more of the predefined route segments in real-time, or substantially in real-time, based on the one or more driving-related events detected based on at least a portion of the motion data being obtained by the one or more motion sensors during the actual ride.
  • Various embodiments may include selecting the route segments and/or updating the selection thereof based on known driving skills of the user of the human-scale vehicle (e.g., as described above with respect to Fig. 7). For example, if a particular user is skilled in driving in a terrain having known terrain characteristics (e.g., ice-crusted or wet terrain, etc.), route segments having those known terrain characteristics may be selected to navigate the user therethrough, and other route segments may be selected for less skilled users.
  • Some embodiments may include updating the predefined route segments based on at least a portion of the obtained motion data and based on known driving skills of the user (e.g., as described above with respect to Fig. 7). For example, motion data obtained by users having high driving skills may be considered as robust data and may be used to update the predefined route segments, while motion data obtained by less skilled users may be ignored or may be marked as requiring confirmation prior to updating the predefined route segments based thereon.
  • Various embodiments may include generating and/or updating the navigation instructions based on known driving skills of the user of the human-scale vehicle (e.g., as described above with respect to Fig. 7). For example, if a particular user is not skilled in driving in a terrain having known terrain characteristics (e.g., ice-crusted or wet terrain, etc.), that user may be instructed to slow down when the user drives along a route segment having those terrain characteristics.
  • Some embodiments may include at least one of selecting the route segments, updating the selection of route segments, generating the navigation instructions and updating the navigation instructions based on data from one or more geolocation sensors disposed on at least one of the human-scale vehicle and the user thereof (e.g., as described above with respect to Fig. 7).
  • Some embodiments may include at least one of selecting the route segments, updating the selection of route segments, generating the navigation instructions and updating the navigation instructions based on data from one or more physiological monitoring sensors disposed on the user of the human-scale vehicle (e.g., as described above with respect to Fig. 7). For example, an elevated heart rate of the user may be indicative of a dangerous situation, e.g., due to an infrastructure failure, etc.
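Purely as an illustration of how physiological data could feed this loop, the snippet below flags a sustained heart rate well above the user's baseline; the baseline, the multiplier, and the window length are assumed values, not parameters from the disclosure.

```python
def heart_rate_alert(samples_bpm, baseline_bpm, factor=1.5, window=10):
    """Return True if the last `window` heart-rate samples all exceed
    `factor` times the user's baseline, which may indicate a dangerous
    situation worth re-routing around or notifying about."""
    if len(samples_bpm) < window:
        return False
    recent = samples_bpm[-window:]
    return all(bpm > factor * baseline_bpm for bpm in recent)
```

A positive result could then trigger a re-selection of route segments or a notification, as described above.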
  • Some embodiments may include at least one of selecting the route segments, updating the selection of route segments, generating the navigation instructions and updating the navigation instructions based on data from one or more cameras positioned on at least one of the human-scale vehicle and the user thereof (e.g., as described above with respect to Fig. 7).
  • Some embodiments may include at least one of selecting the route segments, updating the selection of route segments, generating the navigation instructions and updating the navigation instructions based on data from one or more microphones positioned on at least one of the human-scale vehicle and the user thereof (e.g., as described above with respect to Fig. 7).
  • Some embodiments may include at least one of selecting the route segments, updating the selection of route segments, generating the navigation instructions and updating the navigation instructions based on data from one or more environmental sensors positioned on at least one of the human-scale vehicle and the user thereof (e.g., as described above with respect to Fig. 7).
  • Some embodiments may include at least one of selecting the route segments, updating the selection of route segments, generating the navigation instructions and updating the navigation instructions based on a type of the human-scale vehicle (e.g., as described above with respect to Fig. 7).
  • Some embodiments may include receiving a plurality of ride datasets for multiple users of human-scale vehicles (e.g., as described above with respect to Fig. 7).
  • Each of the ride datasets may include route segments that have been actually used by the users accompanied with at least one of obtained motion data, obtained geolocation data, obtained physiological data, obtained camera data, obtained microphone data, obtained environmental data and human-scale vehicle data.
  • Various embodiments may include defining route segments and/or updating the predefined route segments based on the plurality of the received ride datasets.
  • Some embodiments may include receiving a plurality of ride datasets for the user of the human-scale vehicle, each of the ride datasets comprising selected route segments and motion data obtained during traveling of the human-scale vehicle along the selected route segments, and determining driving patterns for the user based on at least a portion of the ride datasets (e.g., as described above with respect to Fig. 7).
  • the driving patterns determined for the user may be indicative of, for example, driving habits of the user, the number of accidents the user has been involved in, the severity of those accidents, etc.
  • the determined driving patterns may be used by, for example, one or more third parties.
  • the driving patterns determined for a user may be used by an insurance company to determine insurance quotes for the user.
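One simple way such driving patterns could be derived is by aggregating event counts and distances over the user's ride datasets; the field names and the summary keys below are assumptions introduced for this sketch, not a format defined by the disclosure.

```python
from collections import Counter

def summarize_driving_patterns(ride_datasets):
    """Aggregate a user's ride datasets into a simple driving-pattern summary.

    ride_datasets: list of dicts such as
        {"distance_km": 4.2, "events": ["deceleration", "fall", ...]}
    """
    totals = Counter()
    distance = 0.0
    for ride in ride_datasets:
        distance += ride.get("distance_km", 0.0)
        totals.update(ride.get("events", []))
    return {
        "total_distance_km": distance,
        "falls": totals["fall"],
        "emergency_brakes": totals["emergency_brake"],
        "events_per_km": (sum(totals.values()) / distance) if distance else 0.0,
    }
```

A summary of this kind could be exposed to a third party (e.g., an insurer) instead of the raw sensor data.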
  • Fig. 13 is a flowchart of a method of determining that a human-scale vehicle has been driven on a sidewalk, according to some embodiments of the invention.
  • Some embodiments may include receiving 1302 an origin point and one or more destination points in the terrain (e.g., by computing device 710 as described above with respect to Fig. 7).
  • Some embodiments may include selecting 1304 route segments of a plurality of predefined route segments to navigate the user from the origin point to the one or more destination points in the terrain using the human-scale vehicle (e.g., by computing device 710 as described above with respect to Fig. 7).
  • Some embodiments may include obtaining 1306 motion data from one or more motion sensors during traveling of the human-scale vehicle along the selected route segments (e.g., as described above with respect to Fig. 7).
  • the one or more motion sensors may include, for example, one or more accelerometers, one or more gyroscopes, etc.
  • the one or more motion sensors may be motion sensors of the computing device.
  • the one or more motion sensors may be wearable by the user of the human-scale vehicle.
  • the one or more motion sensors may be disposed on the human-scale vehicle.
  • the one or more motion sensors may be motion sensors of the human-scale vehicle.
  • Some embodiments may include determining 1308, based on at least a portion of the obtained motion data, that the human-scale vehicle has been driven on a sidewalk (e.g., by computing device 710 and/or by remote computing device 730 as described above with respect to Figs. 7 and 11).
  • Some embodiments may include determining that the human-scale vehicle has been driven on the sidewalk further based on geolocation data from one or more geolocation sensors.
  • the one or more geolocation sensors may be disposed on at least one of the user and the human-scale vehicle (e.g., as described above with respect to Fig. 7).
  • Some embodiments may include identifying locations in the terrain in which the human-scale vehicle has been driven on the sidewalk (e.g., as described above with respect to Fig. 7).
  • Some embodiments may include issuing a notification that the human-scale vehicle has been driven on the sidewalk (e.g., as described above with respect to Fig. 7).
  • the notification may be issued to the user (e.g., using the user interface, as described hereinabove) or to a third authorized party (e.g., municipal authority, etc.).
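A heuristic sidewalk check of the kind described in Fig. 13 might combine the vibration signature of the ride with map-based geolocation. In the sketch below, the bump threshold, the minimum bump count, and the near_sidewalk callable are all assumptions; the disclosure does not specify these details.

```python
def detect_sidewalk_riding(vertical_accel, geo_points, near_sidewalk,
                           bump_threshold=1.5, min_bumps=8):
    """Heuristically decide whether the vehicle was ridden on a sidewalk.

    vertical_accel: list of vertical-acceleration samples (m/s^2, gravity removed)
    geo_points: list of (lat, lon) samples aligned with the acceleration samples
    near_sidewalk: callable (lat, lon) -> bool backed by map data (assumed helper)
    Returns the locations attributed to sidewalk riding (empty list if none).
    """
    # Count pronounced bumps, e.g. from pavement joints and curb cuts.
    bumps = sum(1 for a in vertical_accel if abs(a) > bump_threshold)
    if bumps < min_bumps:
        return []
    # Keep only the locations that map data places on or next to a sidewalk.
    return [(lat, lon) for lat, lon in geo_points if near_sidewalk(lat, lon)]
```

The returned locations could then feed the identification and notification steps described above.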
  • Fig. 14 is a flowchart of a method of determining that a human-scale vehicle has crossed a predefined three-dimensional (3D) parking pattern of a parking zone, according to some embodiments of the invention.
  • Some embodiments may include receiving 1402 an origin point and one or more destination points in the terrain (e.g., by computing device 710 as described above with respect to Fig. 7).
  • Some embodiments may include selecting 1404 route segments of a plurality of predefined route segments to navigate the user from the origin point to the one or more destination points in the terrain using the human-scale vehicle (e.g., by computing device 710 as described above with respect to Fig. 7).
  • Some embodiments may include obtaining 1406 motion data from one or more motion sensors during traveling of the human-scale vehicle along the selected route segments (e.g., as described above with respect to Fig. 7).
  • the one or more motion sensors may include, for example, one or more accelerometers, one or more gyroscopes, etc.
  • the one or more motion sensors may be motion sensors of the computing device.
  • the one or more motion sensors may be wearable by the user of the human-scale vehicle.
  • the one or more motion sensors may be disposed on the human-scale vehicle.
  • the one or more motion sensors may be motion sensors of the human-scale vehicle.
  • Some embodiments may include determining 1408, based on at least a portion of the obtained motion data, that the human-scale vehicle has crossed a predefined 3D parking pattern of a parking zone (e.g., by computing device 710 as described above with respect to Fig. 7).
  • Some embodiments may include determining, based on at least a portion of the obtained motion data, that the human-scale vehicle has crossed the predefined 3D parking pattern of a parking zone, and further determining, based on geolocation data from one or more geolocation sensors, that the human-scale vehicle is within the parking zone. Some embodiments may include preventing the user from locking the human-scale vehicle if the human-scale vehicle is not within the parking zone (e.g., as described above with respect to Fig. 7).
  • Some embodiments may include determining, based on at least a portion of the obtained motion data, that the human-scale vehicle has crossed the predefined 3D parking pattern of a parking zone, and further determining, based on geolocation data from one or more geolocation sensors, that the human-scale vehicle has left the parking zone. Some embodiments may include preventing the user from unlocking the human-scale vehicle if the human-scale vehicle is not within the parking zone (e.g., as described above with respect to Fig. 7).
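The lock/unlock behavior in the two bullets above can be summarized as a pair of checks combining the pattern-crossing result with a geofence test; crossed_parking_pattern and inside_parking_zone are assumed inputs produced elsewhere (e.g., from motion and geolocation data), not functions named in the disclosure.

```python
def may_lock(crossed_parking_pattern: bool, inside_parking_zone: bool) -> bool:
    """Allow locking only if the 3D parking pattern was crossed and the
    vehicle's geolocation places it inside the parking zone."""
    return crossed_parking_pattern and inside_parking_zone

def may_unlock(inside_parking_zone: bool) -> bool:
    """Prevent unlocking while the vehicle is reported outside the parking zone."""
    return inside_parking_zone
```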
  • the disclosed system and method may navigate a user in a terrain (e.g., urban area and/or areas between two or more urban areas) through selected route segments that are most suitable for driving using human-scale vehicles.
  • route segments may include, for example, roadway sections, pavement sections, bicycle lane sections, crosswalks, underground crossings, overhead passages, passageways, etc.
  • the route segments may be selected from a plurality of predefined route segments.
  • the predefined route segments may be predefined based on analysis of a plurality of ride datasets received from users of human-scale vehicles.
  • the route segments may be continuously updated based on, for example, motion data obtained by motion sensors of the user during the actual ride along the selected route segments.
  • the disclosed system and method may provide safer and faster navigation in the terrain using the human-scale vehicle as compared to current navigation systems and methods that typically treat human-scale vehicles as small automotive vehicles or pedestrians.
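As a concluding illustration, selecting route segments from the plurality of predefined segments summarized above can be viewed as a cheapest-path search over a segment graph whose edge costs already fold in length, terrain characteristics, and priority. The graph representation and the uniform cost model below are assumptions for this sketch, not the claimed selection logic.

```python
import heapq

def select_route(graph, origin, destination):
    """Pick route segments from origin to destination with minimum total cost.

    graph: dict mapping node -> list of (neighbor, segment_id, cost), where the
           cost of a predefined segment is assumed to reflect its length,
           terrain characteristics, and current priority.
    Returns the list of segment ids along the cheapest route, or None.
    """
    queue = [(0.0, origin, [])]
    visited = set()
    while queue:
        cost, node, segments = heapq.heappop(queue)
        if node == destination:
            return segments
        if node in visited:
            continue
        visited.add(node)
        for neighbor, segment_id, edge_cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + edge_cost, neighbor, segments + [segment_id]))
    return None
```

In practice the edge costs could be recomputed as segments are updated (e.g., after a fall is reported), so that re-running the search redirects users away from low-priority segments.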
  • These computer program instructions can also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or portion diagram portion or portions thereof.
  • the computer program instructions can also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or portion diagram portion or portions thereof.
  • each portion in the flowchart or portion diagrams can represent a module, segment, or portion of code, which includes one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the portion can occur out of the order noted in the figures. For example, two portions shown in succession can, in fact, be executed substantially concurrently, or the portions can sometimes be executed in the reverse order, depending upon the functionality involved.
  • each portion of the portion diagrams and/or flowchart illustration, and combinations of portions in the portion diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
  • an embodiment is an example or implementation of the invention.
  • the various appearances of "one embodiment”, “an embodiment”, “certain embodiments” or “some embodiments” do not necessarily all refer to the same embodiments.
  • Although various features of the invention can be described in the context of a single embodiment, the features can also be provided separately or in any suitable combination.
  • Conversely, although the invention can be described in the context of separate embodiments, the invention can also be implemented in a single embodiment.
  • Certain embodiments of the invention can include features from different embodiments disclosed above, and certain embodiments can incorporate elements from other embodiments disclosed above.
  • the disclosure of elements of the invention in the context of a specific embodiment is not to be taken as limiting their use in the specific embodiment alone.
  • The invention can be carried out or practiced in various ways, and the invention can be implemented in certain embodiments other than the ones outlined in the description above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Navigation (AREA)

Abstract

According to some embodiments, the present invention relates to a method of navigating through a terrain using a human-scale vehicle, the method comprising: receiving an origin point and one or more destination points in the terrain; selecting route segments of a plurality of predefined route segments to navigate a user from the origin point to the one or more destination points in the terrain using the human-scale vehicle; obtaining motion data from one or more motion sensors disposed on at least one of the user and the human-scale vehicle during traveling of the human-scale vehicle along the selected route segments; and determining, based on at least a portion of the obtained motion data, terrain characteristics of the selected route segments.
PCT/IL2022/050097 2021-01-28 2022-01-24 Création automatique d'une base de données de cartographie de terrains WO2022162657A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17/160,406 US12025446B2 (en) 2021-01-28 Automatically creating a terrain mapping database
US17/160,406 2021-01-28

Publications (1)

Publication Number Publication Date
WO2022162657A1 true WO2022162657A1 (fr) 2022-08-04

Family

ID=82653974

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2022/050097 WO2022162657A1 (fr) 2021-01-28 2022-01-24 Création automatique d'une base de données de cartographie de terrains

Country Status (1)

Country Link
WO (1) WO2022162657A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180058858A1 (en) * 2015-03-11 2018-03-01 Trailze, Ltd. Automatically creating a terrain mapping database
WO2018106763A1 (fr) * 2016-12-06 2018-06-14 Nissan North America, Inc. Interfaces de recouvrement de trajet de solution pour véhicules autonomes
US20200124430A1 (en) * 2018-10-19 2020-04-23 Neutron Holdings, Inc. Detecting types of travel corridors on which personal mobility vehicles travel

Similar Documents

Publication Publication Date Title
US20200348139A1 (en) Automatically creating a terrain mapping database
JP6959450B2 (ja) 車両ルーティングのシーン難易度予測モデルの使用
US12018953B2 (en) Detecting types of travel corridors on which personal mobility vehicles travel
CA3033864C (fr) Procede et systeme de determination et de mise a jour dynamique d'un itineraire et d'un style de conduite pour le confort des passagers
JP3328939B2 (ja) 車両用ナビゲーション装置及びこれに使用する道路形状データの作成
US10083613B2 (en) Driving support
WO2010134824A1 (fr) Dispositif d'assistance à la conduite et système de véhicule
US11788859B2 (en) Method, apparatus, and computer program product for road noise mapping
CN112955362A (zh) 评估自主车辆的乘坐质量
JP2011209809A (ja) 災害時の車両の迂回路支援システム
Corno et al. Road slope estimation in bicycles without torque measurements
US12025446B2 (en) Automatically creating a terrain mapping database
JP2004110590A (ja) 自転車の走行路面状況の評価モデルを生成する方法、自転車の走行路面環境評価方法及び評価システム
US20210148708A1 (en) Automatically creating a terrain mapping database
Wage et al. Joint estimation of road roughness from crowd-sourced bicycle acceleration measurements
CN110956809A (zh) 监测车辆通过路口的系统及方法
WO2022162657A1 (fr) Création automatique d'une base de données de cartographie de terrains
US11449543B2 (en) Method, apparatus, and computer program product for vehicle localization via amplitude audio features
WO2022208765A1 (fr) Système de navigation, dispositif serveur, dispositif de navigation et véhicule
US11302345B2 (en) Method, apparatus, and computer program product for vehicle localization via frequency audio features
WO2007119348A1 (fr) appareil d'obtention d'informations, procédé d'obtention d'informations, programme d'obtention d'informations et support d'enregistrement
US20220198262A1 (en) Method, apparatus, and computer program product for surveillance of road environments via deep learning
Jiang et al. Dual Stream Meta Learning for Road Surface Classification and Riding Event Detection on Shared Bikes
Prabu et al. Risk assessment and mitigation of e-scooter crashes with naturalistic driving data
EP4394323A1 (fr) Procédé, appareil et produit programme d'ordinateur pour placement d'intervalle intelligent en temps réel au moins approximatif dans des données de mobilité

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22745502

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22745502

Country of ref document: EP

Kind code of ref document: A1