WO2018119907A1 - Method, apparatus and computer program product for determining a stopping position of a vehicle - Google Patents


Info

Publication number
WO2018119907A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
destination
stopping position
information
determining
Prior art date
Application number
PCT/CN2016/113099
Other languages
English (en)
Inventor
Yue Chen
Wanzheng ZHU
Weidong MENG
Original Assignee
Nokia Technologies Oy
Nokia Technologies (Beijing) Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Technologies Oy and Nokia Technologies (Beijing) Co., Ltd.
Priority to PCT/CN2016/113099
Publication of WO2018119907A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3605 Destination input or retrieval
    • G01C21/3617 Destination input or retrieval using user history, behaviour, conditions or preferences, e.g. predicted or inferred from previous use or current movement
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3679 Retrieval, searching and output of POI information, e.g. hotels, restaurants, shops, filling stations, parking facilities
    • G01C21/3685 Retrieval, searching and output of POI information, e.g. hotels, restaurants, shops, filling stations, parking facilities, the POI's being parking facilities
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/165 Anti-collision systems for passive traffic, e.g. including static obstacles, trees
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes

Definitions

  • An example embodiment of the present invention relates generally to vehicles, and more particularly, to a method, apparatus and computer program product for determining a stopping position of a vehicle.
  • An autonomous vehicle may comprise a navigation system which determines a route to a destination.
  • Passengers may use mobile devices or other means to request vehicles such as taxis, or ride-sharing cars, and may provide an input such as the pick-up location and destination.
  • a vehicle may maneuver itself to the pick-up point to pick up passengers and transport them to a destination.
  • Autonomous vehicles may also be used in industrial settings, for example, to route machine parts and/or equipment within a warehouse.
  • the retail industry may additionally use autonomous vehicles to pick up and deliver packages to an intended recipient.
  • the destination of an autonomous vehicle may include an address, point of interest, coordinates, and/or the like.
  • a navigation system may plan a route to the destination to perform a pickup, drop-off, delivery and/or other task. The system may then control the autonomous vehicle so that the vehicle drives to the destination with little or no additional user control.
  • the autonomous vehicle arrives at the destination and stops or parks based only on a detected arrival at the specified address or point of interest, as determined by a navigational system and/or global positioning system (GPS).
  • the address, point of interest, or other specified destination may only be a general location to which an autonomous vehicle navigates.
  • a method, apparatus, and computer program product are therefore provided for determining a stopping position of a vehicle.
  • the vehicle may be a self- driving or autonomous vehicle, or any other machine configured to travel to a specified destination.
  • the destination may be indicated by a user, and/or provided by a system or computing device.
  • a vehicle may arrive at a destination and stop or park at the nearest available parking space with little or no regard to the vehicle’s surroundings. For example, the vehicle may stop in a position such that passengers may step into a puddle while unloading. As another example, a vehicle may stop near an uncovered manhole or other obstacle, posing hazards and/or inconveniences to passengers of the vehicle, recipients of deliveries by the vehicle, and/or the like. In some examples, such obstacles could also cause damage to packages, equipment, and/or the like. As another example, a vehicle arriving for a requested passenger or package pickup may park in an area where an obstacle obstructs the passenger or object from entering the vehicle.
  • a method including receiving an indication of a destination of a vehicle.
  • the method includes determining a stopping position corresponding to the destination based on at least one of historical data or an ambient condition, wherein the ambient condition is determined based on sensor data associated with at least the vehicle or the destination.
  • the method may further include identifying a position of an object in at least one of the environment of the vehicle or the environment of the destination, wherein the stopping position is determined further based on the position of the object.
  • the method may include causing information relating to the determination of the stopping position to be stored in an information management system.
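The determining operation summarized in the bullets above (receive a destination, then select a stopping position using historical data and/or an ambient condition) can be pictured with a minimal sketch. Every name and data structure below (choose_stopping_position, the candidate list, the history mapping) is invented for illustration and is not the claimed method.

```python
# Hypothetical sketch, not the claimed method: pick a stopping position near a
# destination using historical obstacle records and a current ambient condition.

def choose_stopping_position(candidates, historical_obstacles, ambient_condition):
    """Return the first candidate position not blocked by a known obstacle.

    candidates: (x, y) positions near the destination, in preference order.
    historical_obstacles: maps (x, y) -> set of ambient conditions under which
        an obstacle (e.g., a puddle) was previously observed at that position.
    ambient_condition: a label such as "rain" or "dry".
    """
    for position in candidates:
        observed_conditions = historical_obstacles.get(position, set())
        if ambient_condition not in observed_conditions:
            return position
    return candidates[-1]  # fall back to the least-preferred candidate

# Under rain, skip the spot where a puddle was previously recorded.
history = {(0, 0): {"rain"}}
print(choose_stopping_position([(0, 0), (5, 0)], history, "rain"))  # (5, 0)
print(choose_stopping_position([(0, 0), (5, 0)], history, "dry"))   # (0, 0)
```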
  • the apparatus comprises at least one processor and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the processor, cause the apparatus to at least receive an indication of a destination of a vehicle.
  • the at least one memory and the computer program code are further configured to cause the apparatus to determine a stopping position corresponding to the destination based on at least one of historical data or an ambient condition, wherein the ambient condition is determined based on sensor data associated with at least the vehicle or the destination.
  • the at least one memory and the computer program code are further configured to identify a position of an object in at least one of the environment of the vehicle or the environment of the destination, wherein the stopping position is determined further based on the position of the object.
  • the at least one memory and the computer program code are further configured to cause information relating to the determination of the stopping position to be stored in an information management system.
  • a computer program product comprises at least one non-transitory computer-readable storage medium having computer-executable program code instructions stored therein, the computer-executable program code instructions comprising program code instructions to receive an indication of a destination of a vehicle.
  • the computer-executable program code instructions may further comprise program code instructions to determine a stopping position corresponding to the destination based on at least one of historical data or an ambient condition, wherein the ambient condition is determined based on sensor data associated with at least the vehicle or the destination.
  • the computer-executable program code instructions may further comprise program code instructions to identify a position of an object in at least one of the environment of the vehicle or the environment of the destination, wherein the stopping position is determined further based on the position of the object.
  • the computer-executable program code instructions may further comprise program code instructions to cause information relating to the determination of the stopping position to be stored in an information management system.
  • An apparatus is also provided with means for receiving an indication of a destination of a vehicle.
  • the apparatus may further comprise means for determining a stopping position corresponding to the destination based on at least one of historical data or an ambient condition, wherein the ambient condition is determined based on sensor data associated with at least the vehicle or the destination.
  • the apparatus further includes means for identifying a position of an object in at least one of the environment of the vehicle or the environment of the destination, wherein the stopping position is determined further based on the position of the object.
  • the apparatus further comprises means for causing information relating to the determination of the stopping position to be stored in an information management system.
  • the stopping position is a more precisely indicated position relative to the destination.
  • the stopping position is further based on vehicle information, passenger information and/or delivery object information.
  • the sensor data may be determined based on light detection and ranging (LIDAR) .
  • the destination may be received via a global navigation satellite system (GNSS) .
  • Figure 1 is a block diagram of an apparatus that may be configured to implement an example embodiment of the present disclosure
  • Figure 2 is a flowchart illustrating operations performed in accordance with an example embodiment of the present disclosure.
  • Figures 3-9 are example environments in which an example embodiment of the present disclosure may be utilized.
  • circuitry refers to (a) hardware-only circuit implementations (e.g., implementations in analog circuitry and/or digital circuitry); (b) combinations of circuits and computer program product(s) comprising software and/or firmware instructions stored on one or more computer readable memories that work together to cause an apparatus to perform one or more functions described herein; and (c) circuits, such as, for example, a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation even if the software or firmware is not physically present.
  • This definition of ‘circuitry’ applies to all uses of this term herein, including in any claims.
  • ‘circuitry’ also includes an implementation comprising one or more processors and/or portion(s) thereof and accompanying software and/or firmware.
  • ‘circuitry’ as used herein also includes, for example, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, other network device, and/or other computing device.
  • a “computer-readable storage medium,” which refers to a physical storage medium (e.g., volatile or non-volatile memory device), may be differentiated from a “computer-readable transmission medium,” which refers to an electromagnetic signal.
  • a method, apparatus and computer program product are provided for determining a stopping position of a vehicle corresponding to a destination of a vehicle.
  • the stopping position may be considered a more precise position of where the vehicle will stop relative to the destination which may be a general location.
  • the destination may be an address, point of interest, intersection and/or the like.
  • the destination may include any other practical information a user may enter as a destination for a vehicle, such as “home, ” “work, ” and/or the like.
  • the stopping position may indicate a position in which the vehicle is configured to or signaled to stop, such as a distance from a curb, intersection or other landmark.
  • the stopping position may include GPS coordinates, and/or the like.
  • the stopping position may include further details regarding the angle at which the vehicle should stop and/or a direction which the vehicle should face when stopped. The stopping position may therefore indicate a more specific and/or precise location than the corresponding destination.
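As a rough illustration of a stopping position that is more specific than a general destination, one might carry coordinates together with a curb offset and a heading. The field names and values below are assumptions for illustration, not a format defined by the patent.

```python
# Illustrative representation (field names invented) of a stopping position
# that is more precise than a general destination.
from dataclasses import dataclass

@dataclass
class StoppingPosition:
    latitude: float
    longitude: float
    curb_offset_m: float  # lateral distance from the curb, in meters
    heading_deg: float    # direction the vehicle should face when stopped

stop = StoppingPosition(39.9042, 116.4074, curb_offset_m=0.3, heading_deg=90.0)
print(stop.curb_offset_m)  # 0.3
```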
  • apparatus 25 may include or otherwise be in communication with a processor 20, communication interface 24, and memory device 26. As described below and as indicated by the dashed lines in Figure 1, in some embodiments, the apparatus 25 may also optionally include a user interface 22.
  • Apparatus 25 may be implemented as a server or distributed system, such as a server or centralized system for controlling or communicating with vehicles.
  • apparatus 25 may be configured to receive sensor data, navigational information, a destination, and/or the like from a vehicle or navigation system, process the data, and transmit a response to the vehicle or navigation system, such as a determined stopping position.
  • apparatus 25 need not necessarily be embodied by a server, and may be embodied by a wide variety of devices including personal computers, work stations, or mobile terminals, such as laptop computers, tablet computers, global positioning system (GPS) devices, navigation devices, or any combination of the aforementioned, and other types of voice and text communications systems.
  • apparatus 25 may be embodied within a vehicle, such as but not limited to a self-driving or autonomous vehicle.
  • the processor 20 (and/or co-processors or any other processing circuitry assisting or otherwise associated with the processor 20) may be in communication with the memory device 26 via a bus for passing information among components of the apparatus 25.
  • the memory device 26 may include, for example, one or more volatile and/or non-volatile memories.
  • the memory device 26 may be an electronic storage device (e.g., a computer readable storage medium) comprising gates configured to store data (e.g., bits) that may be retrievable by a machine (e.g., a computing device like the processor 20) .
  • the memory device 26 may be configured to store information, data, content, applications, instructions, or the like for enabling the apparatus to carry out various functions in accordance with an example embodiment of the present invention.
  • the memory device 26 could be configured to buffer input data for processing by the processor 20.
  • the memory device 26 could be configured to store instructions for execution by the processor 20.
  • memory device 26 may comprise an information management system configured to provide historical data regarding obstacles corresponding to vehicle destinations, and/or ambient conditions in an environment.
  • the apparatus 25 may, in some embodiments, be embodied in various devices as described above. However, in some embodiments, the apparatus 25 may be embodied as a chip or chip set. In other words, the apparatus 25 may comprise one or more physical packages (e.g., chips) including materials, components and/or wires on a structural assembly (e.g., a baseboard) . The structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon. The apparatus 25 may therefore, in some cases, be configured to implement an embodiment of the present invention on a single chip or as a single “system on a chip. ” As such, in some cases, a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
  • the processor 20 may be embodied in a number of different ways.
  • the processor 20 may be embodied as one or more of various hardware processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP) , a processing element with or without an accompanying DSP, or various other processing circuitry including integrated circuits such as, for example, an ASIC (application specific integrated circuit) , an FPGA (field programmable gate array) , a microcontroller unit (MCU) , a hardware accelerator, a special-purpose computer chip, or the like.
  • the processor 20 may include one or more processing cores configured to perform independently.
  • a multi-core processor may enable multiprocessing within a single physical package.
  • the processor 20 may include one or more processors configured in tandem via the bus to enable independent execution of instructions, pipelining and/or multithreading.
  • the processor 20 may be configured to execute instructions stored in the memory device 26 or otherwise accessible to the processor 20. Alternatively or additionally, the processor 20 may be configured to execute hard coded functionality. As such, whether configured by hardware or software methods, or by a combination thereof, the processor 20 may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to an embodiment of the present invention while configured accordingly. Thus, for example, when the processor 20 is embodied as an ASIC, FPGA or the like, the processor 20 may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, when the processor 20 is embodied as an executor of software instructions, the instructions may specifically configure the processor 20 to perform the algorithms and/or operations described herein when the instructions are executed.
  • the processor 20 may be a processor of a specific device (e.g., a mobile terminal or network entity) configured to employ an embodiment of the present invention by further configuration of the processor 20 by instructions for performing the algorithms and/or operations described herein.
  • the processor 20 may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor 20.
  • the communication interface 24 may be any means such as a device or circuitry embodied in either hardware or a combination of hardware and software that is configured to receive and/or transmit data from/to a network and/or any other device or module in communication with the apparatus 25.
  • the communication interface 24 may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications with a wireless communication network.
  • the communication interface 24 may support communications via the Internet, GSM (Global System for Mobile communication), any local area network (LAN), and/or the like.
  • the communication interface 24 may include the circuitry for interacting with the antenna (s) to cause transmission of signals via the antenna (s) or to handle receipt of signals received via the antenna (s) .
  • the communication interface 24 may alternatively or also support wired communication.
  • the communication interface 24 may include a communication modem and/or other hardware/software for supporting communication via cable, digital subscriber line (DSL) , universal serial bus (USB) or other mechanisms.
  • the apparatus 25 may include a user interface 22 that may, in turn, be in communication with the processor 20 to receive an indication of a user input and/or to cause provision of an audible, visual, mechanical or other output to the user.
  • the user interface 22 may include, for example, a keyboard, a mouse, a joystick, a display, a touch screen (s) , touch areas, soft keys, a microphone, a speaker, or other input/output mechanisms.
  • the processor 20 may comprise user interface circuitry configured to control at least some functions of one or more user interface elements such as, for example, a speaker, ringer, microphone, display, and/or the like.
  • the processor 20 and/or user interface circuitry comprising the processor 20 may be configured to control one or more functions of one or more user interface elements through computer program instructions (e.g., software and/or firmware) stored on a memory accessible to the processor 20 (e.g., memory device 26, and/or the like) .
  • Referring to Figure 2, the operations for determining a stopping position of a vehicle are outlined in accordance with an example embodiment. In this regard and as described below, the operations of Figure 2 may be performed by an apparatus 25.
  • the apparatus 25 may include means, such as the processor 20, the user interface 22, the communication interface 24 or the like, for receiving an indication of a destination of a vehicle.
  • the destination may be considered any location information, such as information indicative of an address, GPS coordinates, point of interest, and/or the like.
  • the destination may be one to which a vehicle, such as an autonomous vehicle, navigates or will subsequently navigate.
  • the indication of the destination may be received over a communication interface 24 from the vehicle, corresponding navigation system, and/or the like.
  • the destination may be a pickup, drop-off and/or delivery location such as that of a passenger, package, luggage, and/or the like.
  • the indication of the destination may be received via a user interface, such as user interface 22.
  • the indication of the destination may be generated based on a vehicle arriving in close proximity to the destination, or within a predefined threshold of the destination (e.g., 50 feet).
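The arrival check above (generating the indication once the vehicle is within a predefined threshold, e.g. 50 feet) could be sketched with a great-circle distance computation. The function names, and the choice of a haversine-style formula, are illustrative assumptions rather than details from the patent.

```python
# Hedged sketch: report "arrived" once the vehicle is within a predefined
# threshold (50 feet here, matching the example above) of the destination.
import math

THRESHOLD_FEET = 50.0
EARTH_RADIUS_FEET = 20_902_231.0  # mean Earth radius expressed in feet

def distance_feet(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two lat/lon points, in feet."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return EARTH_RADIUS_FEET * 2 * math.asin(math.sqrt(a))

def near_destination(vehicle_pos, destination):
    return distance_feet(*vehicle_pos, *destination) <= THRESHOLD_FEET

print(near_destination((40.0, -75.0), (40.0, -75.0)))  # True
print(near_destination((40.0, -75.0), (40.1, -75.0)))  # False (miles away)
```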
  • the apparatus 25 may receive a GPS signal, global navigation satellite system (GNSS) signal, or other location information indicative of the destination and/or a vehicle traveling along a planned route.
  • the destination therefore, may be received from a wireless telecommunication system to which a vehicle is communicatively connected.
  • the indication of the destination need not provide an exact or precise stopping position of the vehicle, but that the destination indicates a general location to which the vehicle travels, for passenger or package pickup, delivery, and/or the like.
  • the destination may include or may be proximate to a large parking lot in which the precise stopping position is ambiguous.
  • the destination may be in a pedestrian-only area, a forbidden area, or a no-loading zone in which the vehicle cannot stop.
  • the apparatus 25 may include means, such as the processor 20, the communication interface 24 or the like, for receiving sensor data detected from sensors of the vehicle.
  • the sensor data may include any data or information regarding a position of the vehicle, ambient conditions in the environment of the vehicle or destination, and/or the like.
  • the sensor data may be detected from sensors operative in or on the vehicle, for example, or may be received by the apparatus 25 via a system communicatively connected with the vehicle and/or apparatus 25, such as a GNSS or information management system.
  • a GNSS may provide information regarding the location of vehicles, such as autonomous vehicles, which the system may track.
  • Various information management systems may provide sensor data collected in the vicinity of the destination, such as data describing ambient conditions (e.g., weather) .
  • the sensor data may be detected from any number of various sensors or sensor types.
  • a vehicle may be equipped with a global positioning system (GPS) , for detecting a real-time location of the vehicle and/or for the apparatus 25 to track the progress of the vehicle.
  • apparatus 25 may be configured to track the progress of various vehicles to their respective destinations, and manage the routing and dispatching of vehicles and delivery and/or drop-off requests accordingly. This may be performed with a GNSS operative in various vehicles, other automated tracking systems, and/or the like.
  • Sensor data may be detected by the vehicle via any number of sensors, including but not limited to sensors configured to enable autonomous control, which may allow the vehicle to travel along an intended path without direct control from a user.
  • the vehicle may include a sensor array or a plurality of sensors.
  • the plurality of sensors may comprise any devices that may be configured to detect a physical parameter relating to the environment of the vehicle and/or destination, and provide an electrical signal indicative of the physical parameter.
  • one or more of the sensors may be configured to detect the distance between a vehicle and other objects.
  • the information obtained from such sensors may be used to determine and/or modify the intended path of the vehicle, to prevent the vehicle from colliding with other objects, and/or to identify objects or obstacles in proximity to the destination that may inhibit the delivery or pick-up of passengers, objects, and/or the like.
  • the vehicle may be equipped with any number of cameras and/or other components configured to provide computer vision capabilities.
  • the cameras or other such components may be configured for photographing or recording video imagery of the surrounding environments and/or the vehicle. Any number of cameras or components may be mounted to, affixed to, or built-in to the body of the vehicle. Any such photographs may include a portion of or the entire vehicle for reference.
  • the photographs may include a front view of the vehicle, rear view, left view, and/or side view of or from the vehicle and may be taken from any angle of the vehicle.
  • the photographs or video imagery may be analyzed to identify the presence of objects near the vehicle, such as puddles, bumps, manholes, and/or the like.
  • the photographs, video imagery, and/or data interpreted therefrom may therefore be considered sensor data.
  • the sensor data may further include data detected from a radar detection device such as one implemented on a vehicle. Radar signals may be emitted and received so as to detect objects in the vicinity of the vehicle.
  • the sensor data may further include data detected from sensors, devices and/or systems such as those described in further detail by Tsai-Hong Hong et al., Fusing Ladar and Color Image Information for Mobile Robot Feature Detection and Tracking (2002).
  • sensors utilizing light detection and ranging may be used to generate or provide sensor data.
  • a laser beam may be emitted such that the sensor detects reflected light.
  • depending on the reflecting surface, qualities of the reflected light may differ.
  • in some instances, no light may be reflected at all, which may further indicate the type of object, or the depth of a puddle, for example, in an environment.
  • the sensor data may include LIDAR images generated based on sensor data detected from sensors in the environment.
  • the sensors performing LIDAR may be affixed to a vehicle or to an object in the vicinity of the destination.
  • the LIDAR images indicate varying levels of reflected light or voids in the reflected light based on the detected reflections of the laser. Voids in LIDAR images may correspond to the lack of reflected light associated with those areas.
  • apparatus 25 and/or processor 20 may process the data to identify and/or distinguish objects such as puddles.
  • Apparatus 25, such as with processor 20, can process a LIDAR image iteratively to distinguish pixels based on variances in the detected light reflectivity. This iterative processing of the LIDAR image enables the apparatus 25 and/or processor 20 to distinguish variances in the reflected light such that the apparatus 25 identifies puddles or other objects in the environment.
  • Small variances in light reflectivity enable apparatus 25 to estimate or determine a location, size and depth of an object such as a puddle.
  • in some instances, such as when tree limbs overhang a road, the apparatus 25 and/or processor 20 may not distinguish the limbs from the background by using LIDAR alone, due to the physical qualities and position of the limb and tree relative to a sensor. Additional object detection methods may therefore be used instead of, or in combination with, LIDAR.
  • Any of the above examples of data detected with LIDAR, or determined based on the subsequent processing of data collected with LIDAR, may be considered sensor data utilized according to an example embodiment.
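The iterative processing of a LIDAR image described above can be pictured as segmenting a grid of per-cell reflectivity values, where voids (no reflected light) and low-reflectivity cells are grouped into candidate puddle regions. This is only a hedged stand-in sketch under that reading, not the patented algorithm; real LIDAR processing is considerably more involved.

```python
# Illustrative sketch only: group connected cells whose reflectivity falls
# below a threshold (0.0 marks a void, i.e., no return) into candidate
# "puddle" regions within a 2D reflectivity grid.

def find_low_reflectivity_regions(grid, threshold=0.2):
    """Return connected regions (sets of (row, col)) of low-reflectivity cells."""
    rows, cols = len(grid), len(grid[0])
    seen, regions = set(), []
    for r in range(rows):
        for c in range(cols):
            if (r, c) in seen or grid[r][c] > threshold:
                continue
            region, stack = set(), [(r, c)]
            while stack:  # flood-fill one connected region
                rr, cc = stack.pop()
                if (rr, cc) in seen or not (0 <= rr < rows and 0 <= cc < cols):
                    continue
                if grid[rr][cc] > threshold:
                    continue
                seen.add((rr, cc))
                region.add((rr, cc))
                stack.extend([(rr + 1, cc), (rr - 1, cc), (rr, cc + 1), (rr, cc - 1)])
            regions.append(region)
    return regions

grid = [
    [0.9, 0.9, 0.9],
    [0.9, 0.0, 0.1],  # a two-cell low-reflectivity patch: a candidate puddle
    [0.9, 0.9, 0.9],
]
print([sorted(region) for region in find_low_reflectivity_regions(grid)])
# [[(1, 1), (1, 2)]]
```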
  • sensor data may be stored as it is detected, and/or stored for subsequent use.
  • the sensor data may be stored for use by the apparatus 25, such as on a local memory (e.g., memory 26) of an in-vehicle navigation system or other system.
  • an autonomous vehicle may create a map of its surroundings based on detected sensor data, allowing apparatus 25, or other navigation system to keep track of its position even when conditions change or when it enters uncharted environments.
  • the sensor data may therefore be tracked relative to the vehicle’s position even as the vehicle moves.
  • the sensor data may be stored for long term, repeated, and/or future use, such as on a memory 26 of apparatus 25, or other computing system.
  • sensor data may be stored in an information management system to track locations of obstacles, which the apparatus 25, such as with processor 20, may determine as semi-permanent or permanent objects (e.g., manholes, etc. ) .
  • the objects may be associated with ambient conditions, such as puddles occurring in rainy conditions.
  • the sensor data may include historical data. Additional information regarding storage of information indicating ambient conditions and historical data is described in further detail below with respect to operations 204, 212, Figures 8 and 9, and Tables 1 and 2.
  • Additional sensors present on the vehicle may detect moisture or water, such as puddles or ground water in the vicinity of the destination or near the vehicle.
  • a sensor configured for detecting moisture or water, such as a hydrometer or aerometer, may be implemented in or on a vehicle.
  • the corresponding sensor data may be provided to apparatus 25, such as via communication interface 24.
  • the apparatus 25 may include means, such as the processor 20, communication interface 24, memory 26, or the like, for determining an ambient condition based on the sensor data.
  • the ambient condition may be considered an ambient condition of the vehicle and/or the destination.
  • the ambient condition may include weather conditions, such as any form of precipitation such as snow, sleet, rain, and/or the like.
  • the ambient condition may be determined based on any of the aforementioned sensor data (e.g., sensor data relating to moisture or water) .
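As a toy illustration of determining an ambient condition from such sensor data, one might map a normalized moisture reading and a temperature to a condition label. The thresholds and labels here are invented for this sketch, not taken from the patent.

```python
# Invented thresholds for illustration: classify an ambient condition from a
# normalized moisture reading (0..1) and an air temperature in Celsius.

def ambient_condition(moisture_reading, temperature_c):
    if moisture_reading < 0.2:
        return "dry"
    if temperature_c <= 0.0:
        return "snow"
    return "rain"

print(ambient_condition(0.05, 20.0))  # dry
print(ambient_condition(0.8, -3.0))   # snow
print(ambient_condition(0.8, 15.0))   # rain
```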
  • apparatus 25 may include means, such as the processor 20, communication interface 24, memory 26, or the like, for determining at least one of vehicle information, passenger type and/or delivery object type.
  • vehicle information may be any information describing the vehicle.
  • the vehicle information may indicate the vehicle size and/or height, for example, and therefore impact the positioning of the vehicle relative to a curb or other object when the vehicle is stopped.
  • the passenger information may include information regarding age or capabilities of the passengers, such as the passengers being children, elderly, disabled, and/or the like.
  • the delivery object information may indicate information regarding size, shape, and/or weight of luggage, packages, equipment, and/or any other information describing an object that is being picked up or delivered.
  • the apparatus 25 may include means, such as the processor 20, communication interface 24, memory 26, or the like, for identifying a position of an object in at least one of the environment of the vehicle or in the environment of the destination.
  • the object may be identified, for example based on the sensor data associated with the vehicle and/or the destination.
  • the object may be considered to be in the environment of the vehicle and/or the destination when the object is detected to be within a predefined distance of the vehicle and/or destination. For example, an object may be determined to be in proximity to the vehicle and/or destination if it is 30 feet or less from the vehicle and/or destination.
  • the object may be determined to be in the environment of the destination if the object is detected by the vehicle when the vehicle is positioned as close to the destination as may be achieved based on the vehicle capabilities and/or environment (e.g., the end of the road, or transition from road to a pedestrian-only area).
  • Examples of objects detected in the environment of the vehicle and/or destination may include other vehicles, puddles, ponds, manholes, manhole covers, curbs, bumps, steps, and/or the like.
  • the objects may also include objects that would obstruct opening of a vehicle door, such as but not limited to walls, buildings, street signs, posts, mailboxes, trees, shrubs, other vehicles and/or the like.
  • an object identified with respect to operation 208 may be considered an obstacle.
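The proximity test described above (an object counts as being in the environment when it is within a predefined distance, e.g., 30 feet) might be sketched as follows; the planar-coordinate representation and function name are assumptions for illustration.

```python
import math

def in_environment(object_pos, reference_pos, threshold_ft=30.0):
    """Return True if the object lies within the predefined distance of
    the reference position (vehicle or destination).
    Positions are (x, y) planar coordinates in feet."""
    dx = object_pos[0] - reference_pos[0]
    dy = object_pos[1] - reference_pos[1]
    return math.hypot(dx, dy) <= threshold_ft
```

For example, an object 40 feet away would not be identified as being in the environment under the default 30-foot threshold.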
  • the apparatus 25 may include means, such as the processor 20, communication interface 24, memory 26, or the like, for determining a stopping position corresponding to the destination based on at least one of the sensor data, the ambient condition(s), position of the object(s), vehicle information, passenger information, delivery object information, and/or historical data.
  • any of operations 202, 204, 206 or 208 respectively relating to receiving sensor data, determining an ambient condition, determining vehicle, passenger and/or delivery object information, and identifying a position of an object may be considered optional operations.
  • any combination of operations 202, 204, 206, and/or 208 may be performed to enable the apparatus 25 to determine the stopping position.
  • while the apparatus 25 may utilize sensor data, ambient condition(s), vehicle information, passenger information, delivery object information, and/or a position of an object to determine the stopping position, the apparatus 25 may additionally or alternatively determine a stopping position based on historical data.
  • the historical data may relate to any of the aforementioned information. The collection of historical data and access thereto is described in further detail below with respect to Figures 8 and 9 and Tables 1 and 2.
  • Figures 3 and 4 illustrate an environment of a destination, according to an example embodiment.
  • the apparatus 25 and/or processor 20 may detect or determine whether there is an object in the environment, such as a puddle 300, for example, or other obstacle.
  • a puddle is referred to herein by way of example, but it will be appreciated that other types of objects may be detected, and a stopping position determined accordingly.
  • the target stopping position may be considered an initially estimated target stopping position based on the destination.
  • the target stopping position may be the first position to which a vehicle arrives that is within a threshold distance (e.g., 50 feet) of the destination.
  • the target stopping position may be the closest position to the destination in which the apparatus 25, such as with processor 20, determines the vehicle can reach or access.
  • the target stopping position may be adjusted until a stopping position is determined that is suitable for the vehicle to stop.
  • the apparatus 25 and/or processor 20 may identify the target stopping position as the stopping position.
  • the vehicle may navigate itself to the stopping position, stop, and launch a procedure to unload passengers, such as notifying passengers about arrival, reminding passengers to carry luggage, unlocking the door, locking the windows and doors, and/or the like.
  • apparatus 25 may detect the side of the road or roadside 310.
  • the side of the road may be the left roadside of the vehicle if the vehicle is driving on the left, or right roadside of the vehicle if the vehicle is driving on the right.
  • the apparatus 25 and/or processor 20 may determine whether the detected roadside, or edge of the road, is inside or adjacent to the puddle such that the puddle may be an obstacle.
  • the apparatus 25, such as with processor 20, may calculate the distance 320 from the puddle edge to the roadside (d_puddle).
  • the apparatus 25, such as with processor 20 and/or memory 26, may store, generate, and/or access a configuration that indicates the minimum distance (d_stop) between a vehicle edge and the roadside when determining a stopping position along the roadside.
  • the apparatus 25, such as with processor 20, may then adjust the target stopping position. The operation may be repeated until a target stopping position is determined as the stopping position of the vehicle.
  • the stopping position may be determined such that the vehicle avoids various obstacles, or such that passengers and/or delivery objects can avoid obstacles.
  • the iterations of evaluating a target stopping position as a suitable stopping position may be capped such that a stopping position is determined within a reasonable (e.g., predetermined) amount of time or within a predetermined number of iterations.
  • when d_puddle is at least d_stop, the target stopping position may be considered suitable, or may be determined as the stopping position of the vehicle, because such a value of d_puddle indicates that the puddle is not an obstacle relative to the target stopping position.
  • apparatus 25, such as with processor 20, may determine that the desired drop-off spot, or target stopping position 500, is not suitable for unloading due to puddle 300.
  • Apparatus 25, such as with processor 20, may repeat the procedure described above to detect and determine whether the alternate spot 510 is suitable. If so, the apparatus 25, such as with processor 20, may determine the stopping position as the alternate spot 510. If the alternate spot 510 is not suitable for stopping, such as because an obstacle is detected, the apparatus 25, such as with processor 20, may repeat the procedure until the stopping position is determined.
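The repeated adjustment just described might be sketched as follows, assuming the vehicle can query d_puddle at each candidate position and that candidates are offset along the roadside; the function names, step size, and offset scheme are hypothetical.

```python
def determine_stopping_position(target, d_puddle_at, d_stop,
                                step=15.0, max_iterations=10):
    """Adjust the target stopping position along the roadside until
    d_puddle >= d_stop, capping the number of iterations.

    target      -- initial target stopping position (feet along roadside)
    d_puddle_at -- callable returning d_puddle at a candidate position
    d_stop      -- minimum required puddle-edge-to-roadside distance
    Returns the determined stopping position, or None if no suitable
    position is found within max_iterations.
    """
    candidate = target
    for _ in range(max_iterations):
        if d_puddle_at(candidate) >= d_stop:
            return candidate  # suitable: puddle is not an obstacle here
        candidate += step     # adjust the target stopping position
    return None               # iteration cap reached without a suitable spot
```

The `max_iterations` parameter implements the cap mentioned above, so a stopping position is determined (or the search abandoned) within a predetermined number of iterations.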
  • Figures 6 and 7 illustrate an example in which the determination of the stopping position is based on a vehicle type or size.
  • target stopping position 610 may be identified based on a destination for vehicle 600.
  • Apparatus 25, such as with processor 20, may detect there is a puddle 602 in the vicinity of target stopping position 610 or the destination, but that d_puddle > d_stop. Therefore, apparatus 25, such as with processor 20, determines that target stopping position 610 is suitable for drop-off, or that target stopping position 610 is the determined stopping position corresponding to the destination.
  • a vehicle 700, of a different vehicle type and/or size than that of vehicle 600, approaches the same destination described with respect to Figure 6, at a different instance in time.
  • the vehicle 700 is a different type and/or size, such that d_stop is different.
  • apparatus 25, such as with processor 20, determines d_puddle < d_stop, and therefore target stopping position 610 is not suitable, and is not determined as the stopping position.
  • Apparatus 25, such as with processor 20, may then repeat operations to determine if a different target stopping position, such as target stopping position 720, is suitable for stopping.
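The contrast between Figures 6 and 7 reduces to the same comparison evaluated with a per-vehicle d_stop. The numeric values below are illustrative only and are not taken from the patent.

```python
def spot_is_suitable(d_puddle, d_stop):
    # Suitable when the puddle edge is at least d_stop from the roadside,
    # leaving the vehicle room to stop and open doors clear of the puddle.
    return d_puddle >= d_stop

d_puddle = 4.0                  # measured distance 320, in feet (illustrative)
D_STOP = {"vehicle_600": 3.0,   # hypothetical minimum clearance, smaller vehicle
          "vehicle_700": 5.0}   # larger vehicle requires more room
```

With the same puddle, the spot is suitable for vehicle 600 but not for vehicle 700, which is the situation the two figures depict.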
  • apparatus 25, such as with processor 20, may be configured to access and/or store any information relating to the sensor data, the ambient condition, the position of an object, the destination, vehicle, and/or the like on an information management system (e.g., memory 26).
  • the stored information may be considered historical data and may be accessed by apparatus 25 to determine the stopping position, such as described below in the context of historical weather data.
  • the apparatus 25 may include means, such as the processor 20, communication interface 24, memory 26, or the like, for causing information regarding any of the sensor data, the ambient condition, the position of the object, and/or the determination of the stopping position to be stored, such as in an information management system.
  • a vehicle may upload information to the information management system and/or apparatus 25.
  • the data may include any of the detected and/or calculated data described herein, such as, but not limited to:
  • a location of the vehicle such as GPS location, with which to associate the other data to be stored;
  • the apparatus 25, such as with processor 20, may access any of the above data for performance of operations 200, 202, 204, 206, 208 and/or 210.
  • the information management system may be embodied by apparatus 25 or may be external to apparatus 25.
  • Figures 8 and 9 illustrate examples in which apparatus 25 utilizes historical data accessed from the information management system, and a determined ambient condition to determine the stopping positions.
  • Figures 8 and 9 illustrate parking spots 850 and 860 in different moments or instances in time.
  • Table 1 indicates historical data relating to parking spots 850 and 860.
  • the data may be stored in an information management system and is accessible to or implemented on apparatus 25.
  • the type of weather is recorded along with the date on which the weather was assessed.
  • Indications of whether parking spots 850 and/or 860 were found to be suitable, or were determined as stopping positions for a vehicle, are also recorded in Table 1.
  • apparatus 25 determined that the weather was sunny, and that parking spots 850 and 860 were both available for stopping or drop-off.
  • parking spot 850 was unavailable or unsuitable for stopping, but parking spot 860 was determined to be available or suitable.
  • apparatus 25 may determine a stopping position of a vehicle. As illustrated in Figure 8, at a time of 14:15 (as indicated by timestamp 800), vehicle 810 approaches a destination and/or target stopping position.
  • the apparatus 25, such as with processor 20, determines an ambient condition indicating that the weather is sunny.
  • both parking spots 850 and 860 have previously been indicated as suitable stopping positions in sunny weather.
  • apparatus 25 may identify the closer parking spot 850, of the two suitable stopping positions, as the determined stopping position.
  • vehicle 900 approaches a destination and/or target stopping position at 14:25.
  • Apparatus 25, such as with processor 20, determines an ambient condition indicating that the weather is rainy. Based on the historical data in Table 1, apparatus 25 may further determine that parking spot 850 is not a suitable stopping position in rainy weather, but that parking spot 860 is, and the apparatus 25 may therefore determine the stopping position as parking spot 860. Therefore, even if the apparatus 25 does not detect the actual puddle 910, the apparatus 25 may determine that an obstacle is likely present in parking spot 850, based on the historical data.
  • the historical data may include information regarding precipitation.
  • the historical data in Table 2 may indicate precipitation recorded on specified days, and whether parking spots 850 and 860 were suitable stopping positions.
  • any type of information may be stored and/or accessed by apparatus 25 in determining a stopping position of a vehicle.
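The weather-conditioned lookup of Figures 8 and 9 might be sketched as follows. The mapping below mirrors the scenario described in the text (both spots suitable when sunny; only spot 860 when rainy), but the data structure and function are otherwise a hypothetical sketch.

```python
# Historical suitability records in the style of Table 1:
# (weather, spot) -> was the spot previously determined a suitable stopping position?
HISTORY = {
    ("sunny", 850): True, ("sunny", 860): True,
    ("rainy", 850): False, ("rainy", 860): True,
}

def choose_spot(weather, spots_by_distance):
    """Return the closest spot that historical data marks suitable
    for the current ambient condition, or None if no spot qualifies."""
    for spot in spots_by_distance:  # ordered closest first
        if HISTORY.get((weather, spot), False):
            return spot
    return None
```

In sunny weather the closer spot 850 is chosen, as in Figure 8; in rainy weather the lookup skips spot 850 without the vehicle needing to detect the puddle itself, as in Figure 9.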
  • An example embodiment provided herein therefore provides improvements to autonomous vehicles, or systems that control autonomous vehicles. Rather than stopping in any position in the vicinity of a destination, which may include obstacles or hazards for passengers and/or delivered objects, an example embodiment enables a vehicle to stop in a position that is more suitable for passenger pickup, delivery and/or the like.
  • Figure 2 illustrates a flowchart of an apparatus 25, method, and computer program product according to certain example embodiments of the invention.
  • each block of the flowchart, and combinations of blocks in the flowchart may be implemented by various means, such as hardware, firmware, processor, circuitry, and/or other devices associated with execution of software including one or more computer program instructions.
  • one or more of the procedures described above may be embodied by computer program instructions.
  • the computer program instructions which embody the procedures described above may be stored by a memory device 26 of an apparatus 25 employing an embodiment of the present invention and executed by a processor 20 of the apparatus 25.
  • any such computer program instructions may be loaded onto a computer or other programmable apparatus (e.g., hardware) to produce a machine, such that the resulting computer or other programmable apparatus implements the functions specified in the flowchart blocks.
  • These computer program instructions may also be stored in a computer-readable memory that may direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture the execution of which implements the function specified in the flowchart blocks.
  • the computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide operations for implementing the functions specified in the flowchart blocks.
  • blocks of the flowchart support combinations of means for performing the specified functions and combinations of operations for performing the specified functions. It will also be understood that one or more blocks of the flowchart, and combinations of blocks in the flowchart, may be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer instructions.
  • certain ones of the operations above may be modified or further amplified. Furthermore, in some embodiments, additional optional operations may be included. Modifications, additions, or amplifications to the operations above may be performed in any order and in any combination.
  • the method, apparatus 25 and computer program product may be utilized in various scenarios and implementations.
  • the apparatus 25 may be implemented within a vehicle, or, as in some examples, may be embodied by a server configured to communicate with a vehicle and/or navigation system.
  • information regarding a destination may be received and processed by apparatus 25, and information regarding the stopping position may be transmitted from the apparatus 25 to the vehicle and/or navigation system.
  • apparatus 25 may be operative in an industrial setting in which packages, machines, and/or parts are routed.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

A method, apparatus and computer program product are provided for determining a stopping position of a vehicle. The vehicle may be a driverless or autonomous car. The stopping position is determined based on a destination, and any number of other factors, such as sensor data, ambient condition(s), a position of an object in the environment of the vehicle and/or of the destination, historical data, vehicle information, passenger information and/or delivery object information. Data and information regarding stopping positions may be stored for later use in determining a stopping position. Vehicles may therefore stop at positions such that passengers and/or delivery objects are not hindered by puddles, manholes and/or other obstacles.
PCT/CN2016/113099 2016-12-29 2016-12-29 Method, apparatus and computer program product for determining stopping position of vehicle WO2018119907A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/113099 WO2018119907A1 (fr) Method, apparatus and computer program product for determining stopping position of vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/113099 WO2018119907A1 (fr) Method, apparatus and computer program product for determining stopping position of vehicle

Publications (1)

Publication Number Publication Date
WO2018119907A1 true WO2018119907A1 (fr) 2018-07-05

Family

ID=62710167

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/113099 WO2018119907A1 (fr) Method, apparatus and computer program product for determining stopping position of vehicle

Country Status (1)

Country Link
WO (1) WO2018119907A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014211705A * 2013-04-17 2014-11-13 NEC Corporation Nearest parking lot estimation system, nearest parking lot estimation method, and nearest parking lot estimation program
WO2015137012A1 (fr) * 2014-03-12 2015-09-17 Nissan Motor Co., Ltd. Vehicle driving device
CN204965489U (zh) * 2015-08-13 2016-01-13 杭州若联科技有限公司 Intelligent express delivery system
GB2528081A (en) * 2014-07-08 2016-01-13 Jaguar Land Rover Ltd Improvements in automotive navigation systems
JP2016023978A (ja) * 2014-07-17 2016-02-08 Micware Co., Ltd. Navigation device, mobile terminal, navigation method, and program

Similar Documents

Publication Publication Date Title
US11989028B2 (en) Mobile robot system and method for generating map data using straight lines extracted from visual images
US10024965B2 (en) Generating 3-dimensional maps of a scene using passive and active measurements
US11313976B2 (en) Host vehicle position estimation device
US9495602B2 (en) Image and map-based detection of vehicles at intersections
US20160363647A1 (en) Vehicle positioning in intersection using visual cues, stationary objects, and gps
CA2971594A1 (fr) Object detection using location data and scale space representations of image data
WO2019061311A1 (fr) Method for controlling self-driving car, control terminal, and machine-readable storage medium
US20200327811A1 (en) Devices for autonomous vehicle user positioning and support
CN111736587A System and method for providing navigation assistance to a delivery robot
JPWO2018221454A1 (ja) Map creation device, control method, program, and storage medium
Jiménez et al. Improving the lane reference detection for autonomous road vehicle control
CN110929475B (zh) 对象的雷达简档的注释
JP7385388B2 (ja) Self-position estimation device
US20210090004A1 (en) System and method for returning lost items
Park et al. Vehicle localization using an AVM camera for an automated urban driving
WO2018119907A1 (fr) Method, apparatus and computer program product for determining stopping position of vehicle
US20220262177A1 (en) Responding to autonomous vehicle error states
JP2022034051A (ja) Measurement device, measurement method, and program
US20230168688A1 (en) Sequential mapping and localization (smal) for navigation
JP6680502B2 (ja) Moving body
Tang et al. An approach of dynamic object removing for indoor mapping based on UGV SLAM
US20230243666A1 (en) Method for mapping, mapping device, computer program, computer readable medium, and vehicle
KR20240030733A (ko) Transport robot system and control method thereof
JP2008191841A (ja) Operating state analysis system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16925000

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16925000

Country of ref document: EP

Kind code of ref document: A1