WO2018191886A1 - Autonomous vehicle and control method therefor - Google Patents

Autonomous vehicle and control method therefor

Info

Publication number
WO2018191886A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
user
visual signal
information
pick
Prior art date
Application number
PCT/CN2017/081078
Other languages
French (fr)
Inventor
Biyun ZHOU
Markus Seidel
Original Assignee
Bayerische Motoren Werke Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bayerische Motoren Werke Aktiengesellschaft filed Critical Bayerische Motoren Werke Aktiengesellschaft
Priority to CN201780088607.1A priority Critical patent/CN110431604B/en
Priority to PCT/CN2017/081078 priority patent/WO2018191886A1/en
Publication of WO2018191886A1 publication Critical patent/WO2018191886A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26 the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50 for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/503 using luminous text or symbol displays in or on the vehicle, e.g. static text
    • B60Q1/5035 electronic displays
    • B60Q1/507 for indicating other intentions or conditions specific to autonomous vehicles
    • B60Q1/549 for expressing greetings, gratitude or emotions

Definitions

  • the present disclosure relates in general to the field of autonomous vehicles, and in particular, to a method, system, and non-transitory computer readable medium for controlling an autonomous vehicle to provide passenger pick-up services.
  • the current process for a passenger pick-up service comprises: the driver receiving and confirming the order, driving to the pick-up location to meet the passenger, calling the passenger when approaching the pick-up location, and waiting for the passenger to arrive.
  • autonomous vehicles (e.g., unmanned vehicles)
  • ODM (On-Demand Mobility)
  • An aspect of the present disclosure mainly aims to provide an autonomous vehicle, as well as a method, system, and non-transitory computer readable medium for controlling an autonomous vehicle to provide a passenger pick-up service.
  • an autonomous vehicle comprising: a communication unit configured to receive order information, wherein the order information includes at least information about a user to be picked up and a pick-up location; a visual signal providing unit configured to provide one or more visual signals; and a control unit configured to: acquire the order information from the communication unit; determine whether the vehicle arrives at or is about to arrive at the pick-up location; and control the visual signal providing unit to provide a first visual signal that allows the vehicle to be identified by the user, after determining that the vehicle arrives at or is about to arrive at the pick-up location.
  • the first visual signal may comprise at least one of greeting information for the user, personal information of the user, a picture or a user name predefined by the user, information customized by the user, colored light and/or blinking light, and an image representing an avatar of the vehicle
  • the visual signal providing unit may comprise at least one of: at least one display apparatus configured to display one or more characters and/or one or more images on a window and/or an outer surface of the vehicle; at least one light-emitting apparatus configured to emit colored light and/or blinking light visible from outside the vehicle; and at least one adaptation mechanism configured to adapt an external appearance of the vehicle to provide a visual signal, wherein the external appearance includes at least color and/or shape.
  • the at least one display apparatus may comprise one or more of a flat display apparatus, a curved display apparatus, a flexible display apparatus, a projection display apparatus, and a holographic display apparatus.
  • the vehicle may further comprise a detecting unit configured to detect a distance between the user and the vehicle, wherein the control unit may be configured to control, in the case that the distance is less than a threshold, the visual signal providing unit to perform at least one of: providing, as the first visual signal, greeting information for the user on a window and/or an outer surface of the vehicle; and emitting, as the first visual signal, colored light and/or blinking light visible from outside the vehicle.
  • a detecting unit configured to detect a distance between the user and the vehicle
  • the control unit may be configured to control, in the case that the distance is less than a threshold, the visual signal providing unit to perform at least one of: providing, as the first visual signal, greeting information for the user on a window and/or an outer surface of the vehicle; and emitting, as the first visual signal, colored light and/or blinking light visible from outside the vehicle.
  • control unit may be further configured to: control the communication unit to send, after acquiring the order information, a signal indicating an avatar of the vehicle to an electronic device of the user; and control the visual signal providing unit to provide, as the first visual signal, an image representing the avatar of the vehicle and/or greeting information for the user on a window and/or an outer surface of the vehicle in response to a signal received from the electronic device of the user by the communication unit and/or detecting the user in a vicinity of the vehicle by the detecting unit.
  • control unit may be further configured to control the visual signal providing unit to display, after acquiring the order information, a picture or a user name predefined by the user, and/or information customized by the user on a window and/or an outer surface of the vehicle.
  • control unit may be further configured to control the communication unit to send an arrival notification to an electronic device of the user, in the case of determining that the vehicle arrives at or is about to arrive at the pick-up location.
  • the arrival notification may include information about a position of the vehicle, and/or video and/or audio information about surroundings of the vehicle.
  • control unit may be further configured to: control the communication unit to send a signal that requires a response from the user to an electronic device of the user in the case of detecting the user in a vicinity of the vehicle by the detecting unit; and control the visual signal providing unit to provide a second visual signal after receiving a response signal from the electronic device of the user by the communication unit.
  • a computer-implemented method for controlling an autonomous vehicle comprising: acquiring order information, wherein the order information includes at least information about a user to be picked up and a pick-up location; determining whether the vehicle arrives at or is about to arrive at the pick-up location or not; and causing the vehicle to provide a first visual signal that allows the vehicle to be identified by the user, after determining that the vehicle arrives at or is about to arrive at the pick-up location.
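For illustration only, the claimed method can be sketched in Python. The names `run_pickup`, `get_eta_minutes`, and `provide_visual_signal`, the order dictionary keys, and the 5-minute threshold below are hypothetical assumptions, not part of the claims:

```python
def run_pickup(order, get_eta_minutes, provide_visual_signal,
               arrival_threshold_min=5.0):
    """Sketch of the claimed steps: acquire the order information,
    determine whether the vehicle arrives at or is about to arrive at
    the pick-up location, then provide a first visual signal that
    allows the vehicle to be identified by the user."""
    user = order["user"]               # information about the user to be picked up
    pickup = order["pickup_location"]  # the pick-up location
    eta = get_eta_minutes(pickup)      # remaining driving time to the location
    if eta <= arrival_threshold_min:   # arrived, or about to arrive
        provide_visual_signal(f"Welcome, {user}")  # first visual signal
        return True
    return False
```

In a real vehicle the two callbacks would be backed by the navigation system and the visual signal providing unit, respectively.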
  • the first visual signal may comprise at least one of greeting information for the user, personal information of the user, a picture and/or a user name predefined by the user, information customized by the user, colored light and/or blinking light, and an image representing an avatar of the vehicle.
  • the step of causing the vehicle to provide the first visual signal may comprise causing the vehicle to perform, in the case of detecting that a distance between the user and the vehicle is less than a threshold, at least one of: providing, as the first visual signal, greeting information for the user on a window and/or an outer surface of the vehicle; and emitting, as the first visual signal, colored light and/or blinking light visible from outside the vehicle.
  • the method may further comprise causing the vehicle to send a signal indicating an avatar of the vehicle to an electronic device of the user after acquiring the order information, wherein the step of causing the vehicle to provide the first visual signal may comprise causing the vehicle to provide, as the first visual signal, an image representing the avatar of the vehicle and/or greeting information for the user on a window and/or an outer surface of the vehicle in response to a signal from the electronic device of the user and/or detecting the user in a vicinity of the vehicle.
  • the method may further comprise causing the vehicle to display, after acquiring the order information, a picture and/or a user name predefined by the user, and/or information customized by the user on a window and/or an outer surface of the vehicle.
  • the first visual signal may vary with a distance between the user and the vehicle.
  • the method may further comprise causing the vehicle to send an arrival notification to an electronic device of the user, in the case of determining that the vehicle arrives at or is about to arrive at the pick-up location.
  • the arrival notification may include information about a position of the vehicle, and/or video and/or audio information about surroundings of the vehicle.
  • the method may further comprise: causing the vehicle to send a signal that requires a response from the user to an electronic device of the user in the case of detecting the user in a vicinity of the vehicle; and causing the vehicle to provide a second visual signal and/or audible signal after receiving a response signal from the electronic device of the user.
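The confirmation exchange described in the bullet above can be sketched as follows; the callback names and the message shape are illustrative assumptions, not defined by the disclosure:

```python
def confirm_pickup(send_to_user, wait_for_response, provide_second_signal):
    """Sketch of the confirmation handshake: when the user is detected in
    the vicinity of the vehicle, send a signal that requires a response to
    the user's electronic device, then provide a second visual and/or
    audible signal once a response signal is received."""
    send_to_user({"type": "confirm_request"})  # request a response from the user
    response = wait_for_response()             # response signal from the user's device
    if response and response.get("confirmed"):
        provide_second_signal()                # second visual and/or audible signal
        return True
    return False
```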
  • a system for controlling an autonomous vehicle comprising: one or more processors; and one or more memories configured to store a series of computer executable instructions, wherein the series of computer executable instructions, when executed by the one or more processors, cause the one or more processors to perform the steps of the above mentioned method.
  • a non-transitory computer readable medium having instructions stored thereon that, when executed by one or more processors, cause the one or more processors to perform the steps of the above mentioned method is provided.
  • Fig. 1 illustrates a block diagram of an autonomous vehicle in accordance with an exemplary embodiment of the present disclosure.
  • Fig. 2 illustrates a flow chart showing a method of controlling the autonomous vehicle in accordance with an exemplary embodiment of the present disclosure.
  • Fig. 3 illustrates a block diagram of an apparatus for controlling the autonomous vehicle in accordance with an exemplary embodiment of the present disclosure.
  • Fig. 4 illustrates possible visual effects of the vehicle in accordance with an exemplary embodiment of the present disclosure.
  • Fig. 5 illustrates possible visual effects of the vehicle in accordance with another exemplary embodiment of the present disclosure.
  • Figs. 6A-6B illustrate possible visual effects of the vehicle in accordance with another exemplary embodiment of the present disclosure.
  • Figs. 7A-7B illustrate possible visual effects of the vehicle in accordance with another exemplary embodiment of the present disclosure.
  • Fig. 8 illustrates a general hardware environment wherein the present disclosure is applicable in accordance with an exemplary embodiment of the present disclosure.
  • the term “vehicle” used throughout the specification refers to a land vehicle, a watercraft, an underwater vehicle, an aircraft, a spacecraft, or the like.
  • the term “A and/or B” used throughout the specification refers to “A”, “B”, or “A and B”.
  • the vehicle 100 may comprise at least: a communication unit 101 that may communicate with an external device (not shown) ; a visual signal providing unit 103 that may provide at least one visual signal that allows the vehicle 100 to be identified by the user; and a control unit 102 that may control the visual signal providing unit 103 or an overall operation of the vehicle 100.
  • the communication unit 101 may communicate with the external device, e.g., a server for providing the ODM (On-Demand Mobility) business, or an electronic device (e.g. a smart phone) of a user of the ODM business who is intended to be the passenger of this vehicle, via a network (not shown) .
  • the network may include a local area network (LAN) , a wide area network (WAN) (e.g., the Internet) , a virtual network, a telecommunications network, and/or other interconnected paths across which multiple entities may communicate.
  • the network may include a communication network or a cellular communications network for sending and receiving data, e.g., a mobile data network such as CDMA, GPRS, TDMA, GSM, WiMAX, 3G, 4G, LTE, VoLTE, or any other mobile data network or combination of mobile data networks.
  • the communication unit 101 may communicate with a platform (not shown) via the network described above so as to communicate with the electronic device of the user.
  • the platform may include at least one server and at least one application operated thereon.
  • the communication unit 101 and the electronic device of the user may both connect to the platform via the network, thus the communication unit 101 may send data to and receive data from the electronic device of the user.
  • the visual signal providing unit 103 may comprise at least one of: one or more display apparatus, one or more light-emitting apparatus, and one or more adaptation mechanisms, so as to provide at least one visual signal to the user.
  • the display apparatus may be configured to display one or more characters and/or one or more images on one or more windows and/or one or more outer surfaces of the vehicle.
  • the characters may include letters, numbers, symbols, and so on, and the images may include pictures, photos, icons, portraits, and so on. Both the characters and the images may be displayed statically or dynamically by the display apparatus.
  • the one or more characters and/or the one or more images displayed by the display apparatus may present, as the at least one visual signal to the user, at least one of greeting information for the user, personal information of the user, a picture or a user name predefined by the user, information customized by the user and the like.
  • the display apparatus may comprise one or more of a flat display apparatus, a curved display apparatus, a flexible display apparatus, a projection display apparatus, a holographic display apparatus and the like. It will be apparent to those skilled in the art that the present invention is not limited to these listed displays, but can be any type of display as long as it can display the characters and/or images on the windows and/or the outer surfaces of the vehicle.
  • the one or more characters and/or the one or more images may be displayed on one or more windows of the vehicle.
  • the display apparatus may be the window itself, that is, the window of the vehicle may be configured as a display screen.
  • the one or more characters and/or the one or more images may be projected, e.g., holographically, to the window.
  • One or more projectors provided in, on, or outside of the vehicle may project a light distribution onto the window so as to form a display on or in the window.
  • the window of the vehicle may be configured as a projection screen.
  • a display may form on the inner surface and/or the outer surface of the window, and/or inside the window. Regardless of which portion of the window the display is formed in, the display surface may face toward the interior and/or the exterior of the vehicle.
  • the window may be nontransparent, translucent, or transparent.
  • the one or more characters and/or the one or more images may be displayed on one or more outer surfaces of the vehicle.
  • An outer surface of the vehicle may be an outer surface of a door, a frame, a side mirror, a windshield, a wheel, a mudflap, a roof, a trunk, a tailgate, an engine case, an engine hood, etc.
  • the outer surface of the vehicle may be configured as a display screen or a projection screen, the features of which are similar with the above.
  • the light-emitting apparatus may be configured to emit light, preferably colored light and/or blinking light, visible from outside the vehicle to provide at least one visual signal.
  • the light emitted from the light-emitting apparatus may be more conspicuous to the user in conditions of weak light, such as in indoor environments, in tunnels, in underground parking places or the like and/or at dusk, at night or the like.
  • the colored light and/or blinking light may catch the attention of the user more easily.
  • the colored light and/or blinking light may be white, single-colored, or multi-colored light.
  • the light emitted from the light-emitting apparatus may interact with the user and/or the electronic device of the user.
  • the colored light and/or blinking light may present a visual effect to the user to express greeting or the like.
  • the adaptation mechanism may be configured to adapt an external appearance of the vehicle to provide at least one visual signal, wherein the external appearance includes at least color and/or shape.
  • the adaptation mechanism may be implemented as a color-changing material and/or a color-changing surface of the vehicle, and/or a material which changes its shape and/or a surface of the vehicle which changes its shape. Examples of this are surface coating agents which can change their color, switchable windows/films and shape-memory polymers.
  • the vehicle 100 may further comprise a detecting unit 104 that may detect a distance between the user and the vehicle 100, as shown in Fig. 1, where the dashed line indicates that the component 104 is optional. It should be understood that the control unit 102 could also control the operation of the detecting unit 104.
  • the detecting unit 104 may be configured to detect a distance between the user and the vehicle 100 by detecting the position of the user.
  • the detecting unit 104 may acquire position data from a global positioning system (GPS), provided by an electronic device of the user having a GPS positioning function, so as to detect the position of the user.
  • the position data need not necessarily be from a GPS, but may also be from a base station positioning system or a WiFi positioning system.
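For illustration, once two position fixes are available (from GPS, a base-station positioning system, or a WiFi positioning system), the user-vehicle distance can be estimated with the standard haversine great-circle formula. This minimal Python sketch is not taken from the disclosure; the function names are hypothetical:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (latitude, longitude)
    fixes, using the haversine formula."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def user_vehicle_distance(user_fix, vehicle_fix):
    # Each fix is a (latitude, longitude) pair reported by GPS,
    # a base-station positioning system, or a WiFi positioning system.
    return haversine_m(user_fix[0], user_fix[1],
                       vehicle_fix[0], vehicle_fix[1])
```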
  • the detecting unit 104 may need to establish a communication connection via a network with the electronic device of the user or with the platform in order to acquire the position data.
  • in some implementations, the detecting unit 104 and the communication unit 101 may be integrated, while in other implementations, the detecting unit 104 may be distinct from the communication unit 101.
  • the detecting unit 104 and the communication unit 101 may be implemented as one element or as separate elements.
  • the detecting unit 104 may be configured to detect a distance between the user and the vehicle 100 by detecting the distance directly without detecting the position of the user.
  • a distance sensor may function as the detecting unit 104 to detect the distance between the user and the vehicle 100.
  • the method of detecting the distance by the distance sensor is not particularly limited thereto.
  • the distance sensor may detect the distance using infrared rays, ultrasonic wave or the like.
  • the detecting unit 104 may use a wireless communication means to detect the distance between the user and the vehicle 100, such as body area network (BAN) communication, Bluetooth Low Energy (BLE) communication (e.g., iBeacon), near field communication (NFC), WiFi communication, Zigbee communication, magnetic communication, electromagnetic communication (including RF, microwave, etc.), and other such communication means.
  • the control unit 102 receives data from various other components of the vehicle 100, e.g., the communication unit 101, the visual signal providing unit 103 and the detecting unit 104, and transmits control commands to these components.
  • a connection line between various components represents a bi-directional communication line, which may be tangible wires or may be achieved wirelessly, such as via radio, RF, or the like.
  • the specific controlling operations performed by the control unit 102 will be described in detail later.
  • the control unit 102 may be a processor, a microprocessor or the like.
  • the control unit 102 may be provided on the vehicle 100, for example, at the central console of the vehicle 100. Alternatively, the control unit 102 may be provided remotely and may be accessed via various networks or the like.
  • each of the units of the vehicle 100 may be in communication with each other, and each of the units of the vehicle 100 may be integrated into or installed on the vehicle 100 and may also be external to the vehicle 100.
  • the components of the vehicle 100 may be implemented by hardware, software, firmware, or any combination thereof to carry out the principles of the present disclosure. It should be understood by those skilled in the art that the blocks described in Fig. 1 may be combined or separated into sub-blocks to implement the principles of the present disclosure as described above. Therefore, the description herein may support any possible combination or separation or further definition of the blocks described herein.
  • the electronic device of the user may be any type of wired or wireless device having communication capability.
  • Exemplary electronic devices of the user include wearable devices, cell phones or other mobile communication devices, GPS devices, slate computers or other personal computing devices, personal data assistants (PDAs) , MP3 players or other personal music-playing devices, cameras, video cameras, or the like.
  • the communication unit 101 may receive order information, wherein the order information includes at least information about a user to be picked up and a pick-up location.
  • the order may be placed by the user or someone else.
  • the communication unit 101 may receive order information from the user or from the platform via a network.
  • the order information received by the communication unit 101 may be transmitted to the control unit 102 via wire (s) or wirelessly.
  • the control unit 102 may acquire the order information from the communication unit 101 and may then confirm the order and travel to the pick-up location automatically.
  • the control unit 102 may control the visual signal providing unit 103 to provide a first visual signal that allows the vehicle 100 to be identified by the user.
  • the control unit 102 may determine whether the vehicle 100 arrives at or is about to arrive at the pick-up location based on a position of the vehicle 100.
  • the position of the vehicle 100 may be detected by a position detecting unit.
  • the position detecting unit may be the same as or distinct from the detecting unit 104.
  • the position detecting unit and the detecting unit 104 may be implemented as one element or as separate elements.
  • the control unit 102 may determine how much time is left before arriving at the pick-up location. If the determined remaining time is within a predetermined time range, e.g., 5 minutes, 10 minutes or the like, the control unit 102 may determine that the vehicle 100 is about to arrive at the pick-up location. Similarly, the control unit 102 may determine the distance between the vehicle 100 and the pick-up location. If the determined distance is within a predetermined distance range, e.g., 1 kilometer, 5 kilometers or the like, the control unit 102 may determine that the vehicle 100 is about to arrive at the pick-up location. That is to say, the term “be about to arrive” herein generally means that the vehicle is less than a predetermined time’s drive (e.g. less than 5 or 10 minutes) away from the pick-up location, or less than a predetermined distance (e.g. less than 1, 2, 3 or 5 kilometers) away from the pick-up location.
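The “about to arrive” test described above reduces to two threshold comparisons. A minimal sketch, using the example values from the text as defaults (the disclosure does not fix these thresholds, and the function name is hypothetical):

```python
def is_about_to_arrive(eta_minutes, distance_km,
                       max_eta_minutes=10.0, max_distance_km=5.0):
    """The vehicle is 'about to arrive' when either the remaining driving
    time or the remaining distance to the pick-up location falls within
    its predetermined range."""
    return eta_minutes <= max_eta_minutes or distance_km <= max_distance_km
```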
  • the determination of the predetermined time range or the predetermined distance range is not particularly limited. For example, they may be predetermined based on the time to be taken by the user to reach the pick-up location, the position of the user (or the distance between the user and the pick-up location) and/or the user preferences.
  • the control unit 102 may control the visual signal providing unit 103 to provide the first visual signal when or after the vehicle 100 is about to arrive at the pick-up location (e.g., while the vehicle 100 is driving to the pick-up location), when the vehicle 100 arrives at the pick-up location (e.g., at the time of arriving at the pick-up location), and/or after arriving at the pick-up location (e.g., after waiting for the user for a while at the pick-up location).
  • the user may identify which vehicle is the one ordered to pick him/her up.
  • control unit 102 may control the visual signal providing unit 103 to perform at least one of: providing greeting information for the user on one or more windows and/or one or more outer surfaces of the vehicle 100 as the first visual signal; and emitting colored light and/or blinking light visible from outside the vehicle 100 as the first visual signal.
  • the control unit 102 may control the visual signal providing unit 103 to provide greeting information for the user on one or more windows and/or one or more outer surfaces of the vehicle 100.
  • the visual signal providing unit 103 may comprise one or more display apparatus, and the greeting information may be presented as one or more characters and/or one or more images on one or more windows and/or one or more outer surfaces of the vehicle 100.
  • the greeting information may comprise personal information of the user (e.g., name information, title information, gender information, etc.), a picture or a user name predefined by the user, information customized by the user (e.g., the content, style, and color of the display customized by the user), an image representing an avatar of the vehicle 100 (e.g., the image representing the avatar of the vehicle 100 may say some greeting words), etc.
  • Possible visual effects of the vehicle in accordance with an exemplary embodiment of the present disclosure are shown in Fig. 4.
  • the greeting information 401 for the user, “MS. ZHANG, WELCOME TO SHANGHAI”, is provided on the window(s) 402 of the vehicle 100 by the visual signal providing unit 103.
  • the greeting information may also be provided on the outer surface of the vehicle 100.
  • the user may easily identify the vehicle 100 with the help of the greeting information.
  • the greeting information may elevate the experience of the user.
  • a welcome voice or a notification sound or other audio can be played for helping the user to find the vehicle or greeting the user or the like.
  • since the greeting information may comprise personal information of the user, in order to protect the privacy of the user, it is preferable that the greeting information be provided only in the case that the distance between the user and the vehicle 100 is less than a threshold (e.g., the user is close to the vehicle 100).
  • the threshold may be preset to any desired value, e.g., 5 meters, 10 meters or the like, according to the practical application or experience.
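The distance-gated greeting described above can be sketched as follows; the function name, threshold value and greeting text are illustrative assumptions, not part of the disclosure.

```python
# Sketch of the privacy-preserving behavior described above: greeting
# information (which may contain personal data) is only returned once the
# detected distance falls below a preset threshold.

GREETING_THRESHOLD_M = 10.0  # e.g., 5 or 10 meters, per the application

def choose_greeting(distance_m: float, user_name: str):
    """Return the greeting to display, or None while the user is too far away."""
    if distance_m < GREETING_THRESHOLD_M:
        return f"{user_name.upper()}, WELCOME TO SHANGHAI"
    return None
```
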
  • control unit 102 may control the visual signal providing unit 103 to emit colored light and/or blinking light visible from outside the vehicle 100 as the first visual signal.
  • the visual signal providing unit 103 may comprise one or more light-emitting apparatus. Possible visual effects of the vehicle in accordance with another exemplary embodiment of the present disclosure are shown in Fig. 5.
  • the visual signal providing unit 103 may comprise light stripes 501, 502 and 503. The colored and/or blinking light visible from outside the vehicle 100 is emitted by the light stripes 501, 502 and 503.
  • although the light shown is in the shape of a strip around the roof and the rear lamps of the vehicle 100, it should be understood that the light-emitting apparatus could be disposed anywhere and the light could be of any color and in any shape.
  • the user may predefine the luminance, the color, the shape, the blinking frequency and/or the like of the light.
  • with the first visual signal implemented as the light emitted from the light-emitting apparatus, it may be more conspicuous to the user in conditions of weak light.
  • the colored light and/or blinking light may catch the attention of the user more easily.
  • the user may easily identify the vehicle 100 with the help of the colored light and/or blinking light.
  • the colored light and/or blinking light may present a visual effect to greet the user so as to elevate the experience of the user.
  • the light may be emitted in the case that the distance between the user and the vehicle 100 is less than a threshold (e.g., the user is in the vicinity of the vehicle 100, the vehicle 100 is within the visual range of the user, etc.).
  • the car of Fig. 5 may additionally display a picture 504 predetermined by the user on its side window, so as to help the user to more easily identify the car.
  • the picture 504 may be e.g. the picture used in a social network by the user, a picture defined on the platform by the user, or the like.
  • although the visual effects are shown separately in Fig. 4 and Fig. 5 respectively, it should be understood that the above visual effects may be presented in combination with each other.
  • for example, while greeting information for the user is provided by the visual signal providing unit 103 (e.g., at least one display apparatus), colored light and/or blinking light may also be emitted by the visual signal providing unit 103 (e.g., at least one light-emitting apparatus).
  • the control unit 102 may control the communication unit 101 to send a signal indicating an avatar of the vehicle 100 to an electronic device of the user.
  • the avatar of the vehicle 100 may be a picture of a person, an animal or a cartoon that may represent the vehicle 100.
  • the avatar of the vehicle 100 may be a virtual representation of the appearance of its virtual driver.
  • the electronic device of the user may display an image representing the avatar of the vehicle 100 based on the received signal. Possible visual effects of the electronic device of the user displaying an image representing the avatar of the vehicle 100 are illustrated in Fig. 6A.
  • the electronic device of the user may be a smart watch 603.
  • the smart watch receives a signal indicating an avatar of the vehicle 100 and displays the image 601 representing the avatar holographically.
  • the avatar of the vehicle 100 is a virtual representation of the appearance of a person (a driver, specifically).
  • the displayed image 601 representing the avatar may “talk” like a real person, e.g., it utters “I will pick you up in 5 min.”, as indicated by reference numeral 602 in Fig. 6A. This utterance can be provided visually or acoustically or both.
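As a sketch of the avatar signal sent to the user's electronic device, the payload might carry an avatar identifier and the utterance to render; the field names and message format below are hypothetical, not part of the disclosure.

```python
import json

def build_avatar_signal(avatar_id: str, eta_min: int) -> str:
    """Serialize a hypothetical avatar notification for the user's device."""
    return json.dumps({
        "type": "vehicle_avatar",
        "avatar_id": avatar_id,  # which avatar image the device should show
        "utterance": f"I will pick you up in {eta_min} min.",  # shown/spoken text
    })
```
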
  • control unit 102 may control the visual signal providing unit 103 to provide, as the first visual signal, an image representing the avatar of the vehicle 100 on one or more windows and/or one or more outer surfaces of the vehicle 100.
  • the electronic device of the user may send a signal that requests a response from the vehicle 100 when the user approaches or is close to the vehicle 100.
  • the image 604 representing the avatar of the vehicle 100 is provided (e.g., holographically) on one or more side windows of the vehicle 100.
  • the avatar of the vehicle 100 is a virtual representation of the appearance of a person (a driver, specifically).
  • the virtual representation of the person may appear in the windows and greet the user.
  • the displayed image 604 representing the avatar may “talk” like a real person, e.g., it utters “HELLO, LUKO!”, as indicated by reference numeral 605 in Fig. 6B.
  • This utterance can be provided visually or acoustically or both.
  • Such utterance from the displayed virtual representation of the person may further greet the user so as to elevate the experience of the user.
  • the displayed image representing the avatar of the vehicle 100 conveys a sense to the user that a virtual chauffeur recognizes him/her and greets him/her, such that the user may be reassured and the level of trust in the vehicle 100 may be increased. Additionally, since the image provided on a window and/or an outer surface of the vehicle 100 and the image displayed on the electronic device of the user both represent the avatar of the vehicle 100, that is, the two images are associated with each other (and may even be the same), the user may identify the vehicle 100 with the help of the displayed image more easily.
  • although the image 604 representing the avatar of the vehicle 100 shown in Fig. 6B is on the windows of the vehicle 100, it could be provided on one or more outer surfaces of the vehicle 100. Alternatively or additionally, in this case, greeting information for the user may also be provided on one or more windows and/or one or more outer surfaces of the vehicle 100.
  • the detecting unit 104 may detect the distance between the user and the vehicle 100.
  • the control unit 102 may control the visual signal providing unit 103 to provide, as the first visual signal, an image representing the avatar of the vehicle 100 on a window and/or an outer surface of the vehicle.
  • greeting information for the user may also be provided on one or more windows and/or one or more outer surfaces of the vehicle 100.
  • the control unit 102 may control the visual signal providing unit 103 to display a picture or a user name predefined by the user, and/or information customized by the user on a window and/or an outer surface of the vehicle 100.
  • the picture or user name predefined by the user may be a picture or user name used in a social network by the user, a picture or user name defined on the platform by the user, or the like.
  • the information customized by the user may be any type of characters and/or images.
  • the user may post something he/she desires in order to self-express and/or socialize. Additionally, since the displayed contents are predefined or customized by the user, the user is so familiar with the displayed contents, or the displayed contents are so attractive, that the user may identify the vehicle 100 more easily (perhaps at first glance).
  • Possible visual effects of the vehicle 100 displaying a picture or a user name predefined by the user and/or information customized by the user are illustrated in Figs. 7A-7B.
  • the picture 701 predefined by the user or the photo 702 customized by the user is displayed on the side window of the vehicle 100. It should be understood that they may also be displayed on the outer surface of the vehicle 100 additionally or alternatively.
  • the vehicle 100 enables the user to post something he/she desires in order to self-express and/or socialize, as shown in Figs. 7A-7B.
  • the vehicle 100 enables the user to post, as the picture 701 or photo 702, a game video he/she plays recently, a movie poster he/she likes, a photo of an idol who will give a concert that he/she will go to tonight, and so on.
  • the displayed contents may be presented after the control unit 102 acquires the order information. That is to say, there is no need to wait until the vehicle 100 arrives at or is about to arrive at the pick-up location to display the above contents.
  • the vehicle 100 may display the picture or the user name predefined by the user and/or the information customized by the user during its automatic travel; thus the vehicle 100, with a strong visual indicator, may become a tool to show a visible identity in traffic, making self-expression and socializing more convenient for the user.
  • the vehicle 100 can always display the above contents for self-expression and socializing of the user.
  • control unit 102 may control the communication unit 101 to send an arrival notification to an electronic device of the user when determining that the vehicle arrives at or is about to arrive at the pick-up location.
  • the method of determining whether the vehicle 100 arrives at or is about to arrive at the pick-up location by the control unit 102 is the same as described above.
  • the arrival notification sent by the communication unit 101 enables the user to know that the vehicle 100 has arrived or is about to arrive.
  • the arrival notification includes information about a position of the vehicle 100 that may indicate to the user the specific position of the vehicle 100.
  • the position of the vehicle 100 may be detected by the detecting unit 104 as described above.
  • video and/or audio information about the surroundings of the vehicle 100 may be sent to the electronic device of the user by being included in the arrival notification.
  • the video information about surroundings of the vehicle 100 may be a picture or a video captured by a camera.
  • the video information may show the user that the vehicle 100 is parked beside a convenience store, or the number of the parking space, and so on, so that the user can find the vehicle 100 more easily with the help of the video information.
  • the audio information could also provide such help.
  • the audio information about the surroundings of the vehicle 100 may be a recorded sound captured by a receiver (e.g., a microphone) or the audio of a video captured by a camera.
  • the audio information may assist the user in determining the specific position of the vehicle 100, especially a user with impaired vision. For example, if the audio information presents noticeable echo(es), the user may determine that the vehicle 100 is likely to be in an indoor space. With the help of the video and/or audio information about the surroundings of the vehicle 100, the user can find the vehicle 100 more easily.
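A minimal sketch of the arrival notification payload described above, carrying the vehicle position plus the optional video/audio aids; the class and field names are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ArrivalNotification:
    """Hypothetical arrival notification sent to the user's electronic device."""
    latitude: float
    longitude: float
    video_clip: Optional[bytes] = None  # e.g., camera view of the parking spot
    audio_clip: Optional[bytes] = None  # e.g., ambient sound (echoes hint at indoors)

def build_arrival_notification(lat: float, lon: float,
                               video: Optional[bytes] = None,
                               audio: Optional[bytes] = None) -> ArrivalNotification:
    # The video and audio aids are optional; either, both, or neither may be sent.
    return ArrivalNotification(lat, lon, video, audio)
```
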
  • the first visual signal varies with the distance between the user and the vehicle.
  • the control unit 102 may control the first visual signal to change, so that the user can identify the vehicle 100 more easily. For example, the user notices that when he/she moves from position A to position B, the visual signal varies from a first format to a second format. Then, perhaps after going back and forth several times, he/she may determine that the vehicle providing the visual signal is the one ordered to pick him/her up.
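The distance-dependent variation could be realized as a simple mapping from distance bands to signal formats; the band boundaries and format names below are illustrative assumptions, not specified by the disclosure.

```python
def first_signal_format(distance_m: float) -> str:
    """Pick a visual-signal format based on the user's distance (example bands)."""
    if distance_m < 10.0:
        return "greeting_text"   # close by: personal greeting on the window
    if distance_m < 50.0:
        return "blinking_light"  # mid range: conspicuous blinking light
    return "colored_light"       # far away: steady colored light
```
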
  • the control unit 102 may control the communication unit 101 to send a signal that requires a response from the user to an electronic device of the user.
  • the vehicle 100 may proactively send a signal to the electronic device of the user. The user may perceive the signal through the vibration of the electronic device. The user then knows that he/she is in the vicinity of the vehicle 100. In order to find the vehicle 100 more easily, the user may respond by sending a response signal to the vehicle 100.
  • control unit 102 may control the visual signal providing unit 103 to provide a second visual signal.
  • the second visual signal is different from the first visual signal. The user could identify the vehicle more easily based on the interaction between the vehicle 100 and the electronic device of the user.
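The ping/response interaction above can be sketched as a small state machine: the vehicle pings the user's device, then switches to a distinct second signal once a response arrives. The message shapes and signal names are hypothetical.

```python
class SignalHandshake:
    """Sketch: vehicle requests a response from the user's device; on receiving
    one, it changes from the first visual signal to a second, distinct one."""

    def __init__(self):
        self.current_signal = "first_visual_signal"

    def ping_user_device(self) -> dict:
        # Hypothetical payload asking the user's device for a response.
        return {"type": "ping", "requires_response": True}

    def on_user_response(self, response: dict) -> None:
        # Switch signals only when an actual response arrives.
        if response.get("type") == "pong":
            self.current_signal = "second_visual_signal"
```
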
  • Fig. 2 illustrates a flow chart showing a method 200 of controlling the autonomous vehicle in accordance with an exemplary embodiment of the present disclosure.
  • the method 200 may be performed by e.g. the above-described control unit 102 of Fig. 1, or other apparatus.
  • the steps of the method 200 presented below are intended to be illustrative. In some embodiments, the method may be accomplished with one or more additional steps not described, and/or without one or more of the steps discussed. Additionally, the order in which the steps of method are illustrated in Fig. 2 and described as below is not intended to be limiting.
  • the method may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information) .
  • the one or more processing devices may include one or more modules executing some or all of the steps of method in response to instructions stored electronically on an electronic storage medium.
  • the one or more processing modules may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the steps of method.
  • order information may be acquired.
  • the order information includes at least information about a user to be picked up and a pick-up location.
  • in step 220, it is determined whether the vehicle arrives at or is about to arrive at the pick-up location.
  • the vehicle may be caused to provide a first visual signal that allows the vehicle to be identified by the user, after determining that the vehicle arrives at or is about to arrive at the pick-up location.
  • the visual signal providing unit 103 of the vehicle may be controlled by the control unit 102 to provide the first visual signal.
  • the first visual signal may comprise at least one of greeting information for the user, personal information of the user, a picture and/or a user name predefined by the user, information customized by the user, colored light and/or blinking light, and an image representing an avatar of the vehicle.
  • the first visual signal may vary with a distance between the user and the vehicle.
  • the vehicle may be caused to perform, in the case of detecting that a distance between the user and the vehicle is less than a threshold, at least one of: providing, as the first visual signal, greeting information for the user on a window and/or an outer surface of the vehicle; and emitting, as the first visual signal, colored light and/or blinking light visible from outside the vehicle.
  • the vehicle may be caused to provide, as the first visual signal, an image representing the avatar of the vehicle and/or greeting information for the user on a window and/or an outer surface of the vehicle in response to a signal from the electronic device of the user and/or detecting the user in a vicinity of the vehicle.
  • the method 200 may further comprise causing the vehicle to send a signal indicating an avatar of the vehicle to an electronic device of the user after acquiring the order information,
  • the method 200 may further comprise causing the vehicle to display, after acquiring the order information, a picture and/or a user name predefined by the user, and/or information customized by the user on a window and/or an outer surface of the vehicle.
  • the method 200 may further comprise causing the vehicle to send an arrival notification to an electronic device of the user, in the case of determining that the vehicle arrives at or is about to arrive at the pick-up location.
  • the arrival notification includes information about a position of the vehicle, and/or video and/or audio information about surroundings of the vehicle.
  • the method 200 may further comprise: causing the vehicle to send a signal that requires a response from the user to an electronic device of the user in the case of detecting the user in a vicinity of the vehicle; and causing the vehicle to provide a second visual signal and/or audible signal after receiving a response signal from the electronic device of the user.
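The core flow of method 200 (acquire order information, determine arrival, provide the first visual signal) can be sketched as below; the vehicle interface and the simple one-dimensional distance model are illustrative assumptions only.

```python
class StubVehicle:
    """Minimal stand-in for the vehicle, for illustration only."""

    def __init__(self, position_m: float):
        self.position_m = position_m
        self.displayed_signal = None

    def is_near(self, pick_up_m: float, threshold_m: float = 100.0) -> bool:
        # "arrives at or is about to arrive at" the pick-up location
        return abs(self.position_m - pick_up_m) < threshold_m

    def show_first_signal(self, text: str) -> None:
        self.displayed_signal = text

def method_200(order: dict, vehicle: StubVehicle) -> None:
    user = order["user"]                 # acquire order information
    pick_up = order["pick_up_location"]
    if vehicle.is_near(pick_up):         # step 220: determine arrival
        vehicle.show_first_signal(f"Greeting for {user}")  # step 230
```
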
  • Fig. 3 illustrates a block diagram of an apparatus for controlling the autonomous vehicle (e.g., the controller 102 as shown in Fig. 1) in accordance with an exemplary embodiment of the present disclosure.
  • the blocks of the apparatus 300 may be implemented by hardware, software, firmware, or any combination thereof to carry out the principles of the present disclosure. It is understood by those skilled in the art that the blocks described in Fig. 3 may be combined or separated into sub-blocks to implement the principles of the present disclosure as described above. Therefore, the description herein may support any possible combination or separation or further definition of the blocks described herein.
  • the apparatus 300 for controlling an autonomous vehicle may comprise: acquiring unit 301 for acquiring order information, wherein the order information includes at least information about a user to be picked up and a pick-up location; determining unit 302 for determining whether the vehicle arrives at or is about to arrive at the pick-up location or not; and vehicle-controlling unit 303 for causing the vehicle to provide a first visual signal that allows the vehicle to be identified by the user, after determining that the vehicle arrives at or is about to arrive at the pick-up location.
  • the first visual signal may comprise at least one of greeting information for the user, personal information of the user, a picture and/or a user name predefined by the user, information customized by the user, colored light and/or blinking light, and an image representing an avatar of the vehicle.
  • the respective units in the apparatus 300 can be configured to perform the respective operations as discussed above in the method 200 of Fig. 2, and thus their details are omitted here.
  • Fig. 8 illustrates a general hardware environment 800 wherein the present disclosure is applicable in accordance with an exemplary embodiment of the present disclosure.
  • the hardware environment 800 may be any machine configured to perform processing and/or calculations, and may be, but is not limited to, a work station, a server, a desktop computer, a laptop computer, a tablet computer, a personal data assistant, a smart phone, an on-vehicle computer or any combination thereof.
  • the aforementioned control unit 102 or the apparatus 300 for controlling the autonomous vehicle may be wholly or at least partially implemented by the hardware environment 800 or a similar device or system.
  • the hardware environment 800 may comprise elements that are connected with or in communication with a bus 802, possibly via one or more interfaces.
  • the hardware environment 800 may comprise the bus 802, one or more processors 804, one or more input devices 806 and one or more output devices 808.
  • the one or more processors 804 may be any kind of processor, and may comprise but are not limited to one or more general-purpose processors and/or one or more special-purpose processors (such as special processing chips).
  • the input devices 806 may be any kind of device that can input information to the computing device, and may comprise but are not limited to a mouse, a keyboard, a touch screen, a microphone and/or a remote control.
  • the output devices 808 may be any kind of device that can present information, and may comprise but are not limited to a display, a speaker, a video/audio output terminal, a vibrator and/or a printer.
  • the hardware environment 800 may also comprise or be connected with non-transitory storage devices 810, which may be any storage devices that are non-transitory and can implement data stores, and may comprise but are not limited to a disk drive, an optical storage device, a solid-state storage, a floppy disk, a flexible disk, a hard disk, a magnetic tape or any other magnetic medium, a compact disc or any other optical medium, a ROM (Read Only Memory), a RAM (Random Access Memory), a cache memory and/or any other memory chip or cartridge, and/or any other medium from which a computer may read data, instructions and/or code.
  • the non-transitory storage devices 810 may be detachable from an interface.
  • the non-transitory storage devices 810 may have data/instructions/code for implementing the methods and steps which are described above.
  • the hardware environment 800 may also comprise a communication device 812.
  • the communication device 812 may be any kind of device or system that can enable communication with external apparatuses and/or with a network, and may comprise but is not limited to a modem, a network card, an infrared communication device, a wireless communication device and/or a chipset such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, cellular communication facilities and/or the like.
  • When the hardware environment 800 is used as an on-vehicle device, it may also be connected to external devices, for example, a GPS receiver, or sensors for sensing different environmental data, such as an acceleration sensor, a wheel speed sensor, a gyroscope and so on. In this way, the hardware environment 800 may, for example, receive location data and sensor data indicating the travelling situation of the vehicle.
  • the hardware environment 800 may also be connected with other facilities of the vehicle, such as an engine system, a wiper, an anti-lock braking system or the like.
  • non-transitory storage device 810 may have map information and software elements so that the processor 804 may perform route guidance processing.
  • the output device 808 may comprise a display for displaying the map, the location mark of the vehicle, images indicating the travelling situation of the vehicle and also the visual signals.
  • the output device 808 may also comprise a speaker for audio output.
  • the bus 802 may include but is not limited to Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus. Particularly, for an on-vehicle device, the bus 802 may also include a Controller Area Network (CAN) bus or other architectures designed for application on an automobile.
  • the hardware environment 800 may also comprise a working memory 814, which may be any kind of working memory that may store instructions and/or data useful for the working of the processor 804, and may comprise but is not limited to a random access memory and/or a read-only memory device.
  • Software elements may be located in the working memory 814, including but not limited to an operating system 816, one or more application programs 818, drivers and/or other data and codes. Instructions for performing the methods and steps described above may be comprised in the one or more application programs 818, and the units of the aforementioned control unit 102 or the apparatus 300 may be implemented by the processor 804 reading and executing the instructions of the one or more application programs 818. More specifically, the aforementioned apparatus 300 or the control unit 102 may, for example, be implemented by the processor 804 when executing an application 818 having instructions to perform the steps of the method 200.
  • the vehicle-controlling unit 303 of the aforementioned apparatus 300 may, for example, be implemented by the processor 804 when executing an application 818 having instructions to perform the step 230 of the method 200.
  • Other units of the aforementioned apparatus 300 may also, for example, be implemented by the processor 804 when executing an application 818 having instructions to perform one or more of the aforementioned respective steps.
  • the executable codes or source codes of the instructions of the software elements may be stored in a non-transitory computer-readable storage medium, such as the storage device(s) 810 described above, and may be read into the working memory 814, possibly with compilation and/or installation.
  • the executable codes or source codes of the instructions of the software elements may also be downloaded from a remote location.
  • the present disclosure may be implemented by software with necessary hardware, or by hardware, firmware and the like. Based on such understanding, the embodiments of the present disclosure may be embodied in part in a software form.
  • the computer software may be stored in a readable storage medium such as a floppy disk, a hard disk, an optical disk or a flash memory of the computer.
  • the computer software comprises a series of instructions to make the computer (e.g., a personal computer, a service station or a network terminal) execute the method or a part thereof according to respective embodiments of the present disclosure.

Abstract

An autonomous vehicle (100) and a control method therefor are disclosed. The autonomous vehicle (100) comprises a communication unit (101) configured to receive order information, wherein the order information includes at least information about a user to be picked up and a pick-up location; a visual signal providing unit (103); and a control unit (102) configured to acquire the order information from the communication unit (101) and control the visual signal providing unit (103) to provide a first visual signal that allows the vehicle to be identified by the user when and/or after determining that the vehicle arrives and/or is about to arrive at the pick-up location.

Description

AUTONOMOUS VEHICLE AND CONTROL METHOD THEREFOR TECHNICAL FIELD
The present disclosure relates in general to the field of autonomous vehicles, and in particular, to a method, system, and non-transitory computer readable medium for controlling an autonomous vehicle to provide passenger pick-up services.
BACKGROUND
The current process for a passenger pick-up service comprises: the driver receiving and confirming the order, driving to the pick-up location to meet the passenger, calling the passenger when approaching the pick-up location, and waiting for the passenger to come.
If autonomous vehicles, e.g., unmanned vehicles, provide the ODM (On-Demand Mobility) services, the passengers will be picked up and chauffeured without drivers.
SUMMARY
An aspect of the present disclosure mainly aims to provide an autonomous vehicle, as well as a method, system, and non-transitory computer readable medium for controlling an autonomous vehicle to provide a passenger pick-up service.
In accordance with a first exemplary embodiment of the present disclosure, an autonomous vehicle is provided, comprising: a communication unit configured to receive order information, wherein the order information includes at least information about a user to be picked up and a pick-up location; a visual signal providing unit configured to provide one or more visual signals; and a control unit configured to: acquire the order information from the communication unit; determine whether the vehicle arrives at or is about to arrive at the pick-up location or not; and control the visual signal providing unit to provide a first visual signal that allows the vehicle to be identified by the user, after determining that the vehicle arrives at or is about to arrive at the pick-up location.
In an example of the present embodiment, the first visual signal may comprise at least one of greeting information for the user, personal information of the user, a picture or a user name predefined by the user, information customized by the user, colored light and/or blinking light, and an image representing an avatar of the vehicle, and the visual signal  providing unit may comprise at least one of: at least one display apparatus configured to display one or more characters and/or one or more images on a window and/or an outer surface of the vehicle; at least one light-emitting apparatus configured to emit colored light and/or blinking light visible from outside the vehicle; and at least one adaptation mechanism configured to adapt an external appearance of the vehicle to provide a visual signal, wherein the external appearance includes at least color and/or shape.
In another example of the present embodiment, the at least one display apparatus may comprise one or more of a flat display apparatus, a curved display apparatus, a flexible display apparatus, a projection display apparatus, and a holographic display apparatus.
In another example of the present embodiment, the vehicle may further comprise a detecting unit configured to detect a distance between the user and the vehicle, wherein the control unit may be configured to control, in the case that the distance is less than a threshold, the visual signal providing unit to perform at least one of: providing, as the first visual signal, greeting information for the user on a window and/or an outer surface of the vehicle; and emitting, as the first visual signal, colored light and/or blinking light visible from outside the vehicle.
In another example of the present embodiment, the control unit may be further configured to: control the communication unit to send, after acquiring the order information, a signal indicating an avatar of the vehicle to an electronic device of the user; and control the visual signal providing unit to provide, as the first visual signal, an image representing the avatar of the vehicle and/or greeting information for the user on a window and/or an outer surface of the vehicle in response to a signal received from the electronic device of the user by the communication unit and/or detecting the user in a vicinity of the vehicle by the detecting unit.
In another example of the present embodiment, the control unit may be further configured to control the visual signal providing unit to display, after acquiring the order information, a picture or a user name predefined by the user, and/or information customized by the user on a window and/or an outer surface of the vehicle.
In another example of the present embodiment, the control unit may be further configured to control the communication unit to send an arrival notification to an electronic device of the user, in the case of determining that the vehicle arrives at or is about to arrive at the pick-up location.
In another example of the present embodiment, the arrival notification may include information about a position of the vehicle, and/or video and/or audio information  about surroundings of the vehicle.
In another example of the present embodiment, the control unit may be further configured to: control the communication unit to send a signal that requires a response from the user to an electronic device of the user in the case of detecting the user in a vicinity of the vehicle by the detecting unit; and control the visual signal providing unit to provide a second visual signal after receiving a response signal from the electronic device of the user by the communication unit.
In accordance with a second exemplary embodiment of the present disclosure, a computer-implemented method for controlling an autonomous vehicle is provided, comprising: acquiring order information, wherein the order information includes at least information about a user to be picked up and a pick-up location; determining whether the vehicle arrives at or is about to arrive at the pick-up location or not; and causing the vehicle to provide a first visual signal that allows the vehicle to be identified by the user, after determining that the vehicle arrives at or is about to arrive at the pick-up location.
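The steps of this method embodiment (acquire order, determine arrival, provide the first visual signal) can be outlined in a short sketch. All names below (`SketchVehicle`, `control_method`, the order keys) are hypothetical illustrations, not part of the disclosure:

```python
class SketchVehicle:
    """Minimal stand-in for the autonomous vehicle of Fig. 1 (illustrative only)."""

    def __init__(self):
        self.position = None
        self.signals = []

    def drive_to(self, location):
        self.position = location  # assume the autonomous drive succeeds

    def is_at_or_near(self, location):
        # "arrives at or is about to arrive at" the pick-up location
        return self.position == location

    def provide_first_visual_signal(self, user):
        # e.g. greeting on a window, colored/blinking light, or an avatar image
        self.signals.append(f"WELCOME, {user}")


def control_method(order, vehicle):
    """Sketch of the second embodiment's computer-implemented method."""
    user = order["user"]                  # information about the user to be picked up
    pick_up = order["pick_up_location"]   # the pick-up location
    vehicle.drive_to(pick_up)
    if vehicle.is_at_or_near(pick_up):
        vehicle.provide_first_visual_signal(user)
    return vehicle.signals
```

The sketch only captures the ordering of the claimed steps; the actual determination of "about to arrive" and the choice of signal are elaborated in the detailed description below.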
In an example of the present embodiment, the first visual signal may comprise at least one of greeting information for the user, personal information of the user, a picture and/or a user name predefined by the user, information customized by the user, colored light and/or blinking light, and an image representing an avatar of the vehicle.
In another example of the present embodiment, the step of causing the vehicle to provide the first visual signal may comprise causing the vehicle to perform, in the case of detecting that a distance between the user and the vehicle is less than a threshold, at least one of: providing, as the first visual signal, greeting information for the user on a window and/or an outer surface of the vehicle; and emitting, as the first visual signal, colored light and/or blinking light visible from outside the vehicle.
In another example of the present embodiment, the method may further comprise causing the vehicle to send a signal indicating an avatar of the vehicle to an electronic device of the user after acquiring the order information, wherein the step of causing the vehicle to provide the first visual signal may comprise causing the vehicle to provide, as the first visual signal, an image representing the avatar of the vehicle and/or greeting information for the user on a window and/or an outer surface of the vehicle in response to a signal from the electronic device of the user and/or detecting the user in a vicinity of the vehicle.
In another example of the present embodiment, the method may further comprise causing the vehicle to display, after acquiring the order information, a picture  and/or a user name predefined by the user, and/or information customized by the user on a window and/or an outer surface of the vehicle.
In another example of the present embodiment, the first visual signal may vary with a distance between the user and the vehicle.
In another example of the present embodiment, the method may further comprise causing the vehicle to send an arrival notification to an electronic device of the user, in the case of determining that the vehicle arrives at or is about to arrive at the pick-up location.
In another example of the present embodiment, the arrival notification may include information about a position of the vehicle, and/or video and/or audio information about surroundings of the vehicle.
In another example of the present embodiment, the method may further comprise: causing the vehicle to send a signal that requires a response from the user to an electronic device of the user in the case of detecting the user in a vicinity of the vehicle; and causing the vehicle to provide a second visual signal and/or audible signal after receiving a response signal from the electronic device of the user.
In accordance with a third exemplary embodiment of the present disclosure, a system for controlling an autonomous vehicle is provided, comprising: one or more processors; and one or more memories configured to store a series of computer executable instructions, wherein the series of computer executable instructions, when executed by the one or more processors, cause the one or more processors to perform the steps of the above mentioned method.
In accordance with a fourth exemplary embodiment of the present disclosure, a non-transitory computer readable medium is provided, having instructions stored thereon that, when executed by one or more processors, cause the one or more processors to perform the steps of the above-mentioned method.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other aspects and advantages of the present disclosure will become apparent from the following detailed description of exemplary embodiments taken in conjunction with the accompanying drawings which illustrate, by way of example, the principles of the present disclosure. Note that the drawings are not necessarily drawn to scale.
Fig. 1 illustrates a block diagram of an autonomous vehicle in accordance with an exemplary embodiment of the present disclosure.
Fig. 2 illustrates a flow chart showing a method of controlling the autonomous vehicle in accordance with an exemplary embodiment of the present disclosure.
Fig. 3 illustrates a block diagram of an apparatus for controlling the autonomous vehicle in accordance with an exemplary embodiment of the present disclosure.
Fig. 4 illustrates possible visual effects of the vehicle in accordance with an exemplary embodiment of the present disclosure.
Fig. 5 illustrates possible visual effects of the vehicle in accordance with another exemplary embodiment of the present disclosure.
Figs. 6A-6B illustrate possible visual effects of the vehicle in accordance with another exemplary embodiment of the present disclosure.
Figs. 7A-7B illustrate possible visual effects of the vehicle in accordance with another exemplary embodiment of the present disclosure.
Fig. 8 illustrates a general hardware environment wherein the present disclosure is applicable in accordance with an exemplary embodiment of the present disclosure.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
In the following detailed description, numerous specific details are set forth to provide a thorough understanding of the described exemplary embodiments. It will be apparent, however, to one skilled in the art that the described embodiments can be practiced without some or all of these specific details. In other exemplary embodiments, well known structures or process steps have not been described in detail in order to avoid unnecessarily obscuring the concept of the present disclosure.
The term “vehicle” used throughout the specification refers to a land vehicle, a watercraft, an underwater vehicle, an aircraft, a spacecraft, or the like. The term “A and/or B” used throughout the specification refers to “A”, “B”, or “A and B”.
Referring first to Fig. 1, there is shown a block diagram of an autonomous vehicle 100 in accordance with an exemplary embodiment of the present disclosure. The vehicle 100 may comprise at least: a communication unit 101 that may communicate with an external device (not shown); a visual signal providing unit 103 that may provide at least one visual signal that allows the vehicle 100 to be identified by the user; and a control unit 102 that may control the visual signal providing unit 103 or an overall operation of the vehicle 100.
The communication unit 101 may communicate with the external device, e.g., a server for providing the ODM (On-Demand Mobility) business, or an electronic device (e.g. a smart phone) of a user of the ODM business who is intended to be the passenger of this vehicle, via a network (not shown). The network may include a local area network (LAN), a wide area network (WAN) (e.g., the Internet), a virtual network, a telecommunications network, and/or other interconnected paths across which multiple entities may communicate. In some embodiments, the network includes Bluetooth® communication networks or a cellular communications network for sending and receiving data via e.g. short messaging service (SMS), multimedia messaging service (MMS), hypertext transfer protocol (HTTP), direct data connection, WAP, e-mail, etc. In other embodiments, the network may be a mobile data network such as CDMA, GPRS, TDMA, GSM, WIMAX, 3G, 4G, LTE, VoLTE, or any other mobile data network or combination of mobile data networks.
Furthermore, the communication unit 101 may communicate with a platform (not shown) via the network described above so as to communicate with the electronic device of the user. The platform may include at least one server and at least one application operated thereon. The communication unit 101 and the electronic device of the user may both connect to the platform via the network, thus the communication unit 101 may send data to and receive data from the electronic device of the user.
In some embodiments, the visual signal providing unit 103 may comprise at least one of: one or more display apparatus, one or more light-emitting apparatus, and one or more adaptation mechanisms, so as to provide at least one visual signal to the user.
The display apparatus may be configured to display one or more characters and/or one or more images on one or more windows and/or one or more outer surfaces of the vehicle. The characters may include letters, numbers, symbols, and so on, and the images may include pictures, photos, icons, portraits, and so on. Both the characters and the images may be displayed statically or dynamically by the display apparatus. The one or more characters and/or the one or more images displayed by the display apparatus may present, as the at least one visual signal to the user, at least one of greeting information for the user, personal information of the user, a picture or a user name predefined by the user, information customized by the user, and the like.
In some embodiments, the display apparatus may comprise one or more of a flat display apparatus, a curved display apparatus, a flexible display apparatus, a projection display apparatus, a holographic display apparatus, and the like. It will be apparent to those skilled in the art that the display apparatus is not limited to the listed types; any type of display may be used as long as it can display the characters and/or images on the windows and/or the outer surfaces of the vehicle.
The one or more characters and/or the one or more images may be displayed on one or more windows of the vehicle. In some embodiments, the display apparatus may be the window itself, that is, the window of the vehicle may be configured as a display screen. In other embodiments, the one or more characters and/or the one or more images may be projected, e.g., holographically, onto the window. One or more projectors provided in, on or outside the vehicle may project a light distribution onto the window so as to form a display on or in the window. In these cases, the window of the vehicle may be configured as a projection screen. Whether the window is configured as a display screen or a projection screen, a display may be formed on the inner surface and/or the outer surface of the window, and/or inside the window. Regardless of which portion of the window the display is formed in, the display surface may face toward the interior and/or the exterior of the vehicle. The window may be nontransparent, translucent, or transparent.
In addition, the one or more characters and/or the one or more images may be displayed on one or more outer surfaces of the vehicle. An outer surface of the vehicle may be an outer surface of a door, a frame, a side mirror, a windshield, a wheel, a mudflap, a roof, a trunk, a tailgate, an engine case, an engine hood, etc. In these cases, the outer surface of the vehicle may be configured as a display screen or a projection screen, the features of which are similar to those described above.
The light-emitting apparatus may be configured to emit light, preferably colored light and/or blinking light, visible from outside the vehicle to provide at least one visual signal. The light emitted from the light-emitting apparatus may be more conspicuous to the user in weak-light conditions, such as in indoor environments, in tunnels, in underground parking places or the like, and/or at dusk, at night or the like. In particular, the colored light and/or blinking light may catch the attention of the user more easily. The colored light and/or blinking light may be white, single-colored or multi-colored light. Additionally, the light emitted from the light-emitting apparatus may interact with the user and/or the electronic device of the user. Furthermore, the colored light and/or blinking light may present a visual effect to the user to express a greeting or the like.
The adaptation mechanism may be configured to adapt an external appearance of the vehicle to provide at least one visual signal, wherein the external appearance includes at least color and/or shape. The adaptation mechanism may be implemented as a color-changing material and/or a color-changing surface of the vehicle, and/or a material which changes its shape and/or a surface of the vehicle which changes its shape. Examples of  this are surface coating agents which can change their color, switchable windows/films and shape-memory polymers.
In some exemplary embodiments of the present disclosure, the vehicle 100 may further comprise a detecting unit 104 that may detect a distance between the user and the vehicle 100, as shown in Fig. 1, where the dashed line indicates that the component 104 is optional. It should be understood that the control unit 102 may also control the operation of the detecting unit 104.
In some embodiments, the detecting unit 104 may be configured to detect a distance between the user and the vehicle 100 by detecting the position of the user. For example, the detecting unit 104 may acquire position data from a global positioning system (GPS), which are provided by an electronic device of the user having a GPS positioning function, so as to detect the position of the user. It should be understood that the position data need not necessarily come from a GPS, but may also come from a base station positioning system or a WiFi positioning system. In this case, the detecting unit 104 may need to establish a communication connection via a network with the electronic device of the user or with the platform to acquire the position data. In some implementations, the detecting unit 104 and the communication unit 101 may be integral, while in other implementations they may be distinct; that is, they may be implemented as one element or as separate elements.
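As an illustration of position-based distance detection, the gap between the user's GPS fix and the vehicle's GPS fix can be computed with the haversine formula. This is only one plausible implementation, not one prescribed by the disclosure, and the function name is hypothetical:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes.

    Uses the haversine formula with a mean Earth radius of 6,371 km;
    accurate to well under 1% for pick-up-scale distances.
    """
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)  # delta latitude
    dl = math.radians(lon2 - lon1)  # delta longitude
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

The resulting distance could then be compared against the thresholds discussed later (e.g., whether the user is within a few meters of the vehicle).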
Furthermore, the detecting unit 104 may be configured to detect a distance between the user and the vehicle 100 by detecting the distance directly, without detecting the position of the user. For example, a distance sensor may function as the detecting unit 104 to detect the distance between the user and the vehicle 100. The method of detecting the distance by the distance sensor is not particularly limited. For example, the distance sensor may detect the distance using infrared rays, ultrasonic waves or the like. Alternatively or additionally, the detecting unit 104 may use a wireless communication means to detect the distance between the user and the vehicle 100, such as a body area network (BAN), Bluetooth® communication, Bluetooth Low Energy (BLE) communication (e.g., iBeacon), near field communication (NFC), WiFi communication, Zigbee communication, magnetic communications, electromagnetic communications (including RF, microwave, etc.), and other such communication means.
The control unit 102 receives data from various other components of the vehicle 100, e.g., the communication unit 101, the visual signal providing unit 103 and the detecting unit 104, and transmits control commands to these components.
In Fig. 1, a connection line between components represents a bi-directional communication line, which may be a tangible wire or may be achieved wirelessly, such as via radio, RF, or the like. The specific controlling operations performed by the control unit 102 will be described in detail later. The control unit 102 may be a processor, a microprocessor or the like. The control unit 102 may be provided on the vehicle 100, for example, at the central console of the vehicle 100. Alternatively, the control unit 102 may be provided remotely and may be accessed via various networks or the like.
It should be understood that each of the units of the vehicle 100 may be in communication with each other, and each of the units of the vehicle 100 may be integrated into or installed on the vehicle 100 and may also be external to the vehicle 100.
It should be also understood that the components of the vehicle 100 may be implemented by hardware, software, firmware, or any combination thereof to carry out the principles of the present disclosure. It should be understood by those skilled in the art that the blocks described in Fig. 1 may be combined or separated into sub-blocks to implement the principles of the present disclosure as described above. Therefore, the description herein may support any possible combination or separation or further definition of the blocks described herein.
The features, types, numbers, and locations of the communication unit 101, the control unit 102, the visual signal providing unit 103 and the detecting unit 104 have been described in detail. But as can be easily understood by those skilled in the art, the features, types, numbers, and locations of the above components are not limited to the illustrated embodiments, and other features, types, numbers, and locations may be also used according to the actual requirements.
The electronic device of the user may be any type of wired or wireless device having communication capability. Exemplary electronic devices of the user include wearable devices, cell phones or other mobile communication devices, GPS devices, slate computers or other personal computing devices, personal data assistants (PDAs) , MP3 players or other personal music-playing devices, cameras, video cameras, or the like.
Next, the operations of the vehicle 100 will be described in detail.
In an exemplary embodiment of the present disclosure, the communication unit 101 may receive order information, wherein the order information includes at least information about a user to be picked up and a pick-up location. The order may be placed by the user or by someone else. The communication unit 101 may receive the order information from the user or from the platform via a network. The order information received by the communication unit 101 may be transmitted to the control unit 102 via wire(s) or wirelessly. The control unit 102 may acquire the order information from the communication unit 101 and may then confirm the order and travel to the pick-up location automatically. After determining that the vehicle 100 arrives at or is about to arrive at the pick-up location, the control unit 102 may control the visual signal providing unit 103 to provide a first visual signal that allows the vehicle 100 to be identified by the user.
The control unit 102 may determine whether the vehicle 100 arrives at or is about to arrive at the pick-up location based on a position of the vehicle 100. The position of the vehicle 100 may be detected by a position detecting unit. The position detecting unit may be the same as or distinct from the detecting unit 104; the two may be implemented as one element or as separate elements.
The control unit 102 may determine how much time is left before arriving at the pick-up location. If the remaining time is within a predetermined time range, e.g., 5 minutes, 10 minutes or the like, the control unit 102 may determine that the vehicle 100 is about to arrive at the pick-up location. Similarly, the control unit 102 may determine the distance between the vehicle 100 and the pick-up location. If the determined distance is within a predetermined distance range, e.g., 1 kilometer, 5 kilometers or the like, the control unit 102 may determine that the vehicle 100 is about to arrive at the pick-up location. That is to say, the term “be about to arrive” herein generally means that the vehicle is less than a predetermined driving time (e.g. less than 5 or 10 minutes) away from the pick-up location, or less than a predetermined distance (e.g. less than 1, 2, 3 or 5 kilometers) away from the pick-up location.
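The "about to arrive" determination above reduces to two threshold checks combined with a logical OR. A minimal sketch, assuming the remaining time and distance have already been estimated (function name and default thresholds are illustrative examples taken from the text, not fixed values):

```python
def is_about_to_arrive(remaining_time_s, remaining_distance_m,
                       time_threshold_s=300.0, distance_threshold_m=1000.0):
    """True if the vehicle is less than a predetermined driving time
    (default: 5 minutes) OR a predetermined distance (default: 1 km)
    away from the pick-up location."""
    return (remaining_time_s < time_threshold_s
            or remaining_distance_m < distance_threshold_m)
```

Either trigger alone suffices, matching the description that the control unit may use the remaining time, the remaining distance, or both.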
The predetermined time range or the predetermined distance range is not particularly limited. For example, they may be predetermined based on the time to be taken by the user to reach the pick-up location, the position of the user (or the distance between the user and the pick-up location) and/or the user's preferences.
The control unit 102 may control the visual signal providing unit 103 to provide the first visual signal when or after the vehicle 100 is about to arrive at the pick-up location (e.g., while the vehicle 100 is driving to the pick-up location), when the vehicle 100 arrives at the pick-up location (e.g., at the time of arriving at the pick-up location), and/or after arriving at the pick-up location (e.g., after waiting for the user for a while at the pick-up location).
Since the first visual signal may allow the vehicle 100 to be identified by the user, the user may identify which vehicle is the one ordered to pick him/her up.
In an exemplary embodiment of the present disclosure, in the case that the distance between the user and the vehicle 100 detected by the detecting unit 104 is less than a threshold, the control unit 102 may control the visual signal providing unit 103 to perform at least one of: providing greeting information for the user on one or more windows and/or one or more outer surfaces of the vehicle 100 as the first visual signal; and emitting colored light and/or blinking light visible from outside the vehicle 100 as the first visual signal.
The control unit 102 may control the visual signal providing unit 103 to provide greeting information for the user on one or more windows and/or one or more outer surfaces of the vehicle 100. In an embodiment, the visual signal providing unit 103 may comprise one or more display apparatus, and the greeting information may be presented as one or more characters and/or one or more images on one or more windows and/or one or more outer surfaces of the vehicle 100. The greeting information may comprise personal information of the user (e.g., name information, title information, gender information, etc.), a picture or a user name predefined by the user, information customized by the user (e.g., the content, style and color of the display customized by the user), an image representing an avatar of the vehicle 100 (e.g., the image representing the avatar of the vehicle 100 may say some greeting words), etc.
Possible visual effects of the vehicle in accordance with an exemplary embodiment of the present disclosure are shown in Fig. 4. The greeting information 401 for the user, “MS. ZHANG, WELCOME TO SHANGHAI”, is provided on the window(s) 402 of the vehicle 100 by the visual signal providing unit 103. Although not shown in the figure, the greeting information may also be provided on the outer surface of the vehicle 100.
In this case, when the user approaches the vehicle 100, the user may easily identify the vehicle 100 with the help of the greeting information. In particular, the greeting information may elevate the experience of the user.
In other embodiments, in addition to providing the visual signal, a welcome voice, a notification sound or other audio can be played to help the user find the vehicle, to greet the user, or the like.
Since the greeting information may comprise personal information of the user, in order to protect the privacy of the user, it is preferable that the greeting information be provided in the case that the distance between the user and the vehicle 100 is less than a threshold (e.g., the user is close to the vehicle 100). The threshold may be preset to any desired value, e.g., 5 meters, 10 meters or the like, according to the practical application or experience.
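The threshold-gated behavior above can be sketched as a small decision function: nothing personal is shown until the detected user distance drops below the threshold, at which point the greeting and/or light signals are activated. All names, the action strings, and the 10-meter default are hypothetical illustrations:

```python
def first_visual_signal(user_distance_m, user_name, threshold_m=10.0):
    """Select first-visual-signal actions once the user is within the threshold.

    The personal greeting is withheld at larger distances to protect the
    user's privacy, per the embodiment described above.
    """
    if user_distance_m >= threshold_m:
        return []  # user still too far away: show nothing personal yet
    return [
        f"display:{user_name}, WELCOME",   # greeting on a window/outer surface
        "light:colored_blinking",          # colored and/or blinking light
    ]
```

In a fuller implementation the returned actions would be dispatched to the display apparatus and light-emitting apparatus of the visual signal providing unit 103.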
In some implementations, the control unit 102 may control the visual signal providing unit 103 to emit colored light and/or blinking light visible from outside the vehicle 100 as the first visual signal. In an embodiment, the visual signal providing unit 103 may comprise one or more light-emitting apparatus. Possible visual effects of the vehicle in accordance with another exemplary embodiment of the present disclosure are shown in Fig. 5. In this case, the visual signal providing unit 103 may comprise light stripes 501, 502 and 503. The colored and/or blinking light visible from outside the vehicle 100 is emitted by the light stripes 501, 502 and 503. Although the light shown is in the shape of strips around the roof and the rear lamps of the vehicle 100, it should be understood that the light-emitting apparatus could be disposed anywhere and the light could be of any color and any shape. The user may predefine the luminance, the color, the shape, the blinking frequency and/or the like of the light.
When the first visual signal is implemented as the light emitted from the light-emitting apparatus, it may be more conspicuous to the user in weak-light conditions. In particular, the colored light and/or blinking light may catch the attention of the user more easily. In this case, when the user approaches the vehicle 100, the user may easily identify the vehicle 100 with the help of the colored light and/or blinking light. Furthermore, when the user approaches the vehicle 100, the colored light and/or blinking light may present a visual effect to greet the user so as to elevate the experience of the user. Since it is desired that the colored light and/or blinking light can be seen by the user, it is preferable that the light be emitted in the case that the distance between the user and the vehicle 100 is less than a threshold (e.g., the user is in the vicinity of the vehicle 100, the vehicle 100 is within the visual range of the user, etc.).
In some embodiments, the car of Fig. 5 may additionally display a picture 504 predetermined by the user on its side window, so as to help the user identify the car more easily. The picture 504 may be, e.g., the picture used in a social network by the user, a picture defined on the platform by the user, or the like.
Although the above visual effects of providing greeting information and emitting light are shown in Fig. 4 and Fig. 5 respectively, it should be understood that the above visual effects may be presented in combination with each other. For example, while greeting information for the user is provided by the visual signal providing unit 103 (e.g., at least one display apparatus) , colored light and/or blinking light may also be emitted by the  visual signal providing unit 103 (e.g., at least one light-emitting apparatus) .
In another exemplary embodiment of the present disclosure, after acquiring the order information from the communication unit 101, the control unit 102 may control the communication unit 101 to send a signal indicating an avatar of the vehicle 100 to an electronic device of the user. The avatar of the vehicle 100 may be a picture of a person, an animal or a cartoon character that may represent the vehicle 100. Preferably, the avatar of the vehicle 100 may be a virtual representation of the appearance of its virtual driver.
The electronic device of the user may display an image representing the avatar of the vehicle 100 based on the received signal. Possible visual effects of the electronic device of the user displaying an image representing the avatar of the vehicle 100 are illustrated in Fig. 6A. In this case, the electronic device of the user may be a smart watch 603. The smart watch receives a signal indicating an avatar of the vehicle 100 and displays the image 601 representing the avatar holographically. The avatar of the vehicle 100 is a virtual representation of the appearance of a person (a driver, specifically). The displayed image 601 representing the avatar may “talk” like a real person, e.g., it utters “I will pick you up in 5 min.”, as indicated by reference numeral 602 in Fig. 6A. This utterance can be provided visually, acoustically, or both.
In response to a signal received from the electronic device of the user by the communication unit 101, the control unit 102 may control the visual signal providing unit 103 to provide, as the first visual signal, an image representing the avatar of the vehicle 100 on one or more windows and/or one or more outer surfaces of the vehicle 100. The electronic device of the user may send a signal requesting a response from the vehicle 100 when the user approaches or is close to the vehicle 100.
Possible visual effects of the vehicle 100 providing the image representing the avatar of the vehicle 100 are illustrated in Fig. 6B. In this case, the image 604 representing the avatar of the vehicle 100 is provided (e.g., holographically) on one or more side windows of the vehicle 100. The avatar of the vehicle 100 is a virtual representation of the appearance of a person (a driver, specifically). The virtual representation of the person may appear in the windows and greet the user. The displayed image 604 representing the avatar may “talk” like a real person, e.g., it utters “HELLO, LUKO!”, as indicated by reference numeral 605 in Fig. 6B. This utterance can be provided visually, acoustically, or both. Such an utterance from the displayed virtual representation of the person may further greet the user so as to elevate the experience of the user.
In this embodiment, the displayed image representing the avatar of the vehicle 100 conveys a sense to the user that a virtual chauffeur recognizes and greets him/her, such that the user may be reassured and the level of trust in the vehicle 100 may be increased. Additionally, since the image provided on a window and/or an outer surface of the vehicle 100 and the image displayed on the electronic device of the user both represent the avatar of the vehicle 100, that is, the two images are associated with each other (and may even be the same), the user may identify the vehicle 100 with the help of the displayed image more easily.
Although the image 604 representing the avatar of the vehicle 100 shown in Fig. 6B is on the windows of the vehicle 100, it could also be provided on one or more outer surfaces of the vehicle 100. Alternatively or additionally, in this case, greeting information for the user may also be provided on one or more windows and/or one or more outer surfaces of the vehicle 100.
Alternatively or additionally, the detecting unit 104 may detect the distance between the user and the vehicle 100. In response to detecting the user in a vicinity of the vehicle 100, the control unit 102 may control the visual signal providing unit 103 to provide, as the first visual signal, an image representing the avatar of the vehicle 100 on a window and/or an outer surface of the vehicle. Alternatively or additionally, in this case, greeting information for the user may also be provided on one or more windows and/or one or more outer surfaces of the vehicle 100.
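The avatar embodiment (Figs. 6A-6B) thus has two phases: after order acquisition the avatar is sent to the user's device, and later a device response and/or proximity detection triggers the avatar and greeting on the vehicle itself. A hypothetical event-to-action mapping (all identifiers and the strings below are illustrative, including the "LUKO" greeting taken from Fig. 6B):

```python
def avatar_display_actions(event, avatar_id="virtual_chauffeur", user_name="LUKO"):
    """Map events of the avatar embodiment to display actions.

    'order_acquired'                 -> send the avatar to the user's device;
    'device_response'/'user_nearby'  -> show the avatar and a greeting on a
                                        window and/or outer surface (an OR of
                                        the two triggers, per the text).
    """
    if event == "order_acquired":
        return [("send_to_device", avatar_id)]
    if event in ("device_response", "user_nearby"):
        return [("show_on_window", avatar_id),
                ("greet", f"HELLO, {user_name}!")]
    return []  # unrecognized events produce no action
```

Because the two triggers are alternatives, either the device's response signal or the detecting unit's proximity detection alone is enough to bring up the avatar.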
In another exemplary embodiment of the present disclosure, after acquiring the order information, the control unit 102 may control the visual signal providing unit 103 to display a picture or a user name predefined by the user, and/or information customized by the user, on a window and/or an outer surface of the vehicle 100. The picture or user name predefined by the user may be a picture or user name used by the user on a social network, a picture or user name defined by the user on the platform, and the like. The information customized by the user may be any type of characters and/or images. With the picture or user name predefined by the user and/or the information customized by the user displayed on the window and/or the outer surface of the vehicle 100, the user may post desired content in order to express himself/herself and/or socialize. Additionally, since the displayed contents are predefined or customized by the user, the displayed contents are so familiar or attractive to the user that the user may identify the vehicle 100 more easily (perhaps at first glance).
Possible visual effects of the vehicle 100 displaying a picture or a user name predefined by the user and/or information customized by the user are illustrated in Figs. 7A-7B. In these cases, the picture 701 predefined by the user or the photo 702 customized by the user is displayed on the side window of the vehicle 100. It should be understood that they may additionally or alternatively be displayed on the outer surface of the vehicle 100. As shown in Figs. 7A-7B, the vehicle 100 enables the user to post desired content for self-expression and/or socializing. For example, the vehicle 100 enables the user to post, as the picture 701 or photo 702, a game video he/she has played recently, a movie poster he/she likes, a photo of an idol whose concert he/she will attend tonight, and so on.
In this embodiment, the displayed contents may be presented once the control unit 102 acquires the order information. That is to say, there is no need to wait until the vehicle 100 arrives at or is about to arrive at the pick-up location to display the above contents. For example, the vehicle 100 may display the picture or user name predefined by the user and/or the information customized by the user during its autonomous travel; the vehicle 100, carrying such a strong visual indicator, may thus become a tool for showing a visible identity in traffic, making self-expression and socializing more convenient for the user. Alternatively or additionally, after picking up the user and while driving the user, the vehicle 100 can continue to display the above contents for the user's self-expression and socializing.
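The selection of user-predefined and user-customized contents from the order information might look like the following sketch. The field names in the `order` dictionary are assumptions for illustration; the disclosure does not prescribe a schema.

```python
def contents_to_display(order):
    """Collect user-predefined/customized display items from the order info.

    Items are returned in a fixed priority order; missing or empty fields
    are simply skipped, so nothing is displayed for them.
    """
    items = []
    for key in ("user_picture", "user_name", "custom_info"):
        value = order.get(key)
        if value:
            items.append(value)
    return items

order = {
    "user": "LUKO",
    "pickup_location": (31.23, 121.47),
    "user_name": "@luko",            # e.g. the name used on the platform
    "custom_info": "movie poster",   # anything the user wants shown
}
print(contents_to_display(order))  # ['@luko', 'movie poster']
```

Because the items are available as soon as the order is acquired, they can be shown throughout the journey, not only on arrival.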
In another exemplary embodiment of the present disclosure, the control unit 102 may control the communication unit 101 to send an arrival notification to an electronic device of the user when determining that the vehicle arrives at or is about to arrive at the pick-up location. The method by which the control unit 102 determines whether the vehicle 100 arrives at or is about to arrive at the pick-up location is the same as described above. The arrival notification sent by the communication unit 101 lets the user know that the vehicle 100 has arrived or is about to arrive.
Particularly, the arrival notification includes information about a position of the vehicle 100, which may indicate to the user the specific position of the vehicle 100. The position of the vehicle 100 may be detected by the detecting unit 104 as described above.
Alternatively or additionally, video and/or audio information about the surroundings of the vehicle 100 may be sent to the electronic device of the user by being included in the arrival notification. The video information about the surroundings of the vehicle 100 may be a picture or a video captured by a camera. For example, the video information may show the user that the vehicle 100 is parked beside a convenience store, or show the number of the parking space, and so on, so that the user can find the vehicle 100 more easily with the help of the video information. Alternatively or additionally, the audio information could also provide such help. The audio information about the surroundings of the vehicle 100 may be a sound recorded by a receiver. The audio information may assist the user in determining the specific position of the vehicle 100, especially a user with impaired vision. For example, if the audio information contains noticeable echo(s), the user may determine that the vehicle 100 is likely to be in an indoor space. With the help of the video and/or audio information about the surroundings of the vehicle 100, the user can find the vehicle 100 more easily.
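The arrival notification of this embodiment could be assembled as in the following sketch. The payload keys and the media placeholders are illustrative assumptions; the disclosure only requires that position and, optionally, surroundings video/audio be included.

```python
def build_arrival_notification(position, picture=None, sound=None):
    """Bundle position and optional surroundings media for the user's device."""
    payload = {"type": "arrival", "position": position}
    if picture is not None:
        payload["surroundings_video"] = picture   # e.g. a camera frame/clip
    if sound is not None:
        payload["surroundings_audio"] = sound     # e.g. a microphone recording
    return payload

note = build_arrival_notification(
    position=(31.2304, 121.4737),
    picture="frame_0421.jpg",   # could show the convenience store nearby
)
print(sorted(note))  # ['position', 'surroundings_video', 'type']
```

The communication unit 101 would then transmit this payload to the user's electronic device when arrival (or imminent arrival) is determined.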
In another exemplary embodiment of the present disclosure, the first visual signal varies with the distance between the user and the vehicle. In an embodiment, when the distance detected by the detecting unit 104 is less than a certain value, the control unit 102 may control the first visual signal to change, so that the user can identify the vehicle 100 more easily. For example, the user may notice that when he/she moves from position A to position B, a visual signal changes from a first format to a second format. Then, perhaps after walking back and forth several times, he/she may determine that the vehicle providing the visual signal is the one ordered to pick him/her up.
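A minimal sketch of such a distance-varying first visual signal follows. The two formats and the 20 m switch-over value are assumptions chosen for the example; the disclosure leaves the "certain value" and the signal formats open.

```python
# Assumed distance at which the first visual signal changes format.
SWITCH_DISTANCE_M = 20.0

def first_visual_signal(distance_m):
    """Return the signal format for the current user-vehicle distance."""
    if distance_m < SWITCH_DISTANCE_M:
        return "second_format"   # e.g. blinking coloured light
    return "first_format"        # e.g. steady coloured light

# Moving from position A (30 m away) to position B (15 m away) changes the
# signal, letting the user confirm this is the vehicle ordered for him/her.
print(first_visual_signal(30.0), first_visual_signal(15.0))
```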
In another exemplary embodiment of the present disclosure, in the case of the detecting unit 104 detecting the user in a vicinity of the vehicle 100, the control unit 102 may control the communication unit 101 to send a signal that requires a response from the user to an electronic device of the user. When the detecting unit 104 detects that the user is in the vicinity of the vehicle 100, the vehicle 100 may send a signal to the electronic device of the user on its own initiative. The user may perceive the signal through a vibration of his/her electronic device, and thereby knows that he/she is in the vicinity of the vehicle 100. In order to find the vehicle 100 more easily, the user may respond by sending a response signal to the vehicle 100. After the communication unit 101 receives the response signal from the electronic device of the user, the control unit 102 may control the visual signal providing unit 103 to provide a second visual signal. In an embodiment, the second visual signal is different from the first visual signal. The user can identify the vehicle more easily based on this interaction between the vehicle 100 and the electronic device of the user.
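The vehicle-side half of this handshake can be sketched as follows. The message names, the `Recorder` test double, and the transport are assumptions for illustration, not taken from the disclosure.

```python
class VehicleHandshake:
    """Hypothetical controller tying units 101 and 103 together."""
    def __init__(self, communication_unit, visual_signal_unit):
        self.comm = communication_unit
        self.signals = visual_signal_unit

    def on_user_in_vicinity(self):
        # Proactively ping the user's device (felt e.g. as a vibration).
        self.comm.send({"type": "proximity_ping", "needs_response": True})

    def on_message(self, message):
        # A response from the user's device triggers the second visual
        # signal, distinct from the first, to single the vehicle out.
        if message.get("type") == "proximity_response":
            self.signals.show("second_visual_signal")

class Recorder:
    """Records sent messages and shown signals for demonstration."""
    def __init__(self):
        self.sent, self.shown = [], []
    def send(self, msg): self.sent.append(msg)
    def show(self, sig): self.shown.append(sig)

io = Recorder()
car = VehicleHandshake(io, io)
car.on_user_in_vicinity()                      # vehicle pings the user
car.on_message({"type": "proximity_response"}) # user replies; signal shown
print(io.sent[0]["type"], io.shown)
```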
Referring back to Fig. 2, it illustrates a flow chart showing a method 200 of controlling the autonomous vehicle in accordance with an exemplary embodiment of the present disclosure. The method 200 may be performed by, e.g., the above-described control unit 102 of Fig. 1, or by another apparatus. The steps of the method 200 presented below are intended to be illustrative. In some embodiments, the method may be accomplished with one or more additional steps not described, and/or without one or more of the steps discussed. Additionally, the order in which the steps of the method are illustrated in Fig. 2 and described below is not intended to be limiting. In some embodiments, the method may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information). The one or more processing devices may include one or more modules executing some or all of the steps of the method in response to instructions stored electronically on an electronic storage medium. The one or more processing modules may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the steps of the method.
As shown in Fig. 2, at step 210, order information may be acquired. The order information includes at least information about a user to be picked up and a pick-up location.
At step 220, whether the vehicle arrives at or is about to arrive at the pick-up location or not is determined.
At step 230, the vehicle may be caused to provide a first visual signal that allows the vehicle to be identified by the user, after determining that the vehicle arrives at or is about to arrive at the pick-up location. In one embodiment, the visual signal providing unit 103 of the vehicle may be controlled by the control unit 102 to provide the first visual signal.
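Steps 210-230 can be sketched in a few lines. The 50 m arrival radius, the order schema, and the injected distance function are illustrative assumptions; the disclosure leaves the arrival test unspecified.

```python
# Assumed radius within which the vehicle counts as "about to arrive".
ARRIVAL_RADIUS_M = 50.0

def method_200(order, vehicle_position, distance_to, provide_signal):
    # Step 210: acquire order information (user and pick-up location).
    pickup = order["pickup_location"]
    # Step 220: determine whether the vehicle arrives or is about to arrive.
    arriving = distance_to(vehicle_position, pickup) <= ARRIVAL_RADIUS_M
    # Step 230: only then provide the first visual signal.
    if arriving:
        provide_signal("first_visual_signal")
    return arriving

shown = []
order = {"user": "LUKO", "pickup_location": (0.0, 0.0)}
dist = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
method_200(order, (300.0, 400.0), dist, shown.append)  # 500 m away: nothing
method_200(order, (30.0, 40.0), dist, shown.append)    # 50 m away: signal
print(shown)  # ['first_visual_signal']
```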
In some embodiments, the first visual signal may comprise at least one of greeting information for the user, personal information of the user, a picture and/or a user name predefined by the user, information customized by the user, colored light and/or blinking light, and an image representing an avatar of the vehicle.
In another embodiment, the first visual signal may vary with a distance between the user and the vehicle.
In other embodiments, the vehicle may be caused to perform, in the case of detecting that a distance between the user and the vehicle is less than a threshold, at least one of: providing, as the first visual signal, greeting information for the user on a window and/or an outer surface of the vehicle; and emitting, as the first visual signal, colored light and/or blinking light visible from outside the vehicle.
In yet other embodiments, the vehicle may be caused to provide, as the first visual signal, an image representing the avatar of the vehicle and/or greeting information for the user on a window and/or an outer surface of the vehicle in response to a signal from the electronic device of the user and/or detecting the user in a vicinity of the vehicle.
In some implementations, the method 200 may further comprise causing the vehicle to send a signal indicating an avatar of the vehicle to an electronic device of the user after acquiring the order information.
In another implementation, the method 200 may further comprise causing the vehicle to display, after acquiring the order information, a picture and/or a user name predefined by the user, and/or information customized by the user on a window and/or an outer surface of the vehicle.
In other implementations, the method 200 may further comprise causing the vehicle to send an arrival notification to an electronic device of the user, in the case of determining that the vehicle arrives at or is about to arrive at the pick-up location. In some examples, the arrival notification includes information about a position of the vehicle, and/or video and/or audio information about surroundings of the vehicle.
In yet other implementations, the method 200 may further comprise: causing the vehicle to send a signal that requires a response from the user to an electronic device of the user in the case of detecting the user in a vicinity of the vehicle; and causing the vehicle to provide a second visual signal and/or an audible signal after receiving a response signal from the electronic device of the user.
Fig. 3 illustrates a block diagram of an apparatus for controlling the autonomous vehicle (e.g., the controller 102 as shown in Fig. 1) in accordance with an exemplary embodiment of the present disclosure. The blocks of the apparatus 300 may be implemented by hardware, software, firmware, or any combination thereof to carry out the principles of the present disclosure. It is understood by those skilled in the art that the blocks described in Fig. 3 may be combined or separated into sub-blocks to implement the principles of the present disclosure as described above. Therefore, the description herein may support any possible combination or separation or further definition of the blocks described herein.
Referring to Fig. 3, the apparatus 300 for controlling an autonomous vehicle may comprise: acquiring unit 301 for acquiring order information, wherein the order information includes at least information about a user to be picked up and a pick-up location; determining unit 302 for determining whether the vehicle arrives at or is about to arrive at the pick-up location or not; and vehicle-controlling unit 303 for causing the vehicle to provide a first visual signal that allows the vehicle to be identified by the user, after determining that the vehicle arrives at or is about to arrive at the pick-up location.
In an example of the present embodiment, the first visual signal may comprise at least one of greeting information for the user, personal information of the user, a picture and/or a user name predefined by the user, information customized by the user, colored light  and/or blinking light, and an image representing an avatar of the vehicle.
Please note that the respective units in the apparatus 300 can be configured to perform the respective operations as discussed above in the method 200 of Fig. 2, and thus their details are omitted here.
Fig. 8 illustrates a general hardware environment 800 wherein the present disclosure is applicable in accordance with an exemplary embodiment of the present disclosure.
With reference to Fig. 8, a hardware environment 800, which is an example of a hardware device that may be applied to the aspects of the present disclosure, will now be described. The hardware environment 800 may be any machine configured to perform processing and/or calculations, and may be, but is not limited to, a workstation, a server, a desktop computer, a laptop computer, a tablet computer, a personal digital assistant, a smart phone, an on-vehicle computer, or any combination thereof. The aforementioned control unit 102 or the apparatus 300 for controlling the autonomous vehicle may be wholly or at least partially implemented by the hardware environment 800 or a similar device or system.
The hardware environment 800 may comprise elements that are connected with or in communication with a bus 802, possibly via one or more interfaces. For example, the hardware environment 800 may comprise the bus 802, one or more processors 804, one or more input devices 806 and one or more output devices 808. The one or more processors 804 may be any kind of processor, and may comprise but are not limited to one or more general-purpose processors and/or one or more special-purpose processors (such as special processing chips). The input devices 806 may be any kind of device that can input information to the computing device, and may comprise but are not limited to a mouse, a keyboard, a touch screen, a microphone and/or a remote control. The output devices 808 may be any kind of device that can present information, and may comprise but are not limited to a display, a speaker, a video/audio output terminal, a vibrator and/or a printer. The hardware environment 800 may also comprise or be connected with non-transitory storage devices 810, which may be any storage devices that are non-transitory and can implement data stores, and may comprise but are not limited to a disk drive, an optical storage device, a solid-state storage, a floppy disk, a flexible disk, a hard disk, a magnetic tape or any other magnetic medium, a compact disc or any other optical medium, a ROM (Read Only Memory), a RAM (Random Access Memory), a cache memory and/or any other memory chip or cartridge, and/or any other medium from which a computer may read data, instructions and/or code. The non-transitory storage devices 810 may be detachable from an interface. The non-transitory storage devices 810 may have data/instructions/code for implementing the methods and steps described above. The hardware environment 800 may also comprise a communication device 812.
The communication device 812 may be any kind of device or system that can enable communication with external apparatuses and/or with a network, and may comprise but is not limited to a modem, a network card, an infrared communication device, a wireless communication device and/or a chipset such as a Bluetooth™ device, an 802.11 device, a WiFi device, a WiMax device, cellular communication facilities and/or the like.
When the hardware environment 800 is used as an on-vehicle device, it may also be connected to external devices, for example, a GPS receiver and sensors for sensing different environmental data, such as an acceleration sensor, a wheel speed sensor, a gyroscope and so on. In this way, the hardware environment 800 may, for example, receive location data and sensor data indicating the travelling situation of the vehicle. When the hardware environment 800 is used as an on-vehicle device, it may also be connected to other facilities (such as an engine system, a wiper, an anti-lock braking system or the like) for controlling the travelling and operation of the vehicle.
In addition, the non-transitory storage device 810 may have map information and software elements so that the processor 804 may perform route guidance processing. In addition, the output devices 808 may comprise a display for displaying the map, the location mark of the vehicle, images indicating the travelling situation of the vehicle, and also the visual signals. The output devices 808 may also comprise a speaker for audio output.
The bus 802 may include but is not limited to Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus. Particularly, for an on-vehicle device, the bus 802 may also include a Controller Area Network (CAN) bus or other architectures designed for application on an automobile.
The hardware environment 800 may also comprise a working memory 814, which may be any kind of working memory that may store instructions and/or data useful for the working of the processor 804, and may comprise but is not limited to a random access memory and/or a read-only memory device.
Software elements may be located in the working memory 814, including but not limited to an operating system 816, one or more application programs 818, drivers and/or other data and codes. Instructions for performing the methods and steps described above may be comprised in the one or more application programs 818, and the units of the aforementioned control unit 102 or the apparatus 300 may be implemented by the processor 804 reading and executing the instructions of the one or more application programs 818. More specifically, the aforementioned apparatus 300 or the control unit 102 may, for example, be implemented by the processor 804 when executing an application 818 having instructions to perform the steps of the method 200. In addition, the vehicle-controlling unit 303 of the aforementioned apparatus 300 may, for example, be implemented by the processor 804 when executing an application 818 having instructions to perform step 230 of the method 200. Other units of the aforementioned apparatus 300 may also, for example, be implemented by the processor 804 when executing an application 818 having instructions to perform one or more of the aforementioned respective steps. The executable codes or source codes of the instructions of the software elements may be stored in a non-transitory computer-readable storage medium, such as the storage device(s) 810 described above, and may be read into the working memory 814, possibly with compilation and/or installation. The executable codes or source codes of the instructions of the software elements may also be downloaded from a remote location.
Those skilled in the art may clearly understand from the above embodiments that the present disclosure may be implemented by software with the necessary hardware, or by hardware, firmware and the like. Based on such understanding, the embodiments of the present disclosure may be embodied partly in software form. The computer software may be stored in a readable storage medium such as a floppy disk, a hard disk, an optical disk or a flash memory of the computer. The computer software comprises a series of instructions that make a computer (e.g., a personal computer, a service station or a network terminal) execute the method or a part thereof according to the respective embodiments of the present disclosure.
The present disclosure being thus described, it will be obvious that the same may be varied in many ways. Such variations are not to be regarded as a departure from the spirit and scope of the present disclosure, and all such modifications as would be obvious to those skilled in the art are intended to be included within the scope of the following claims.

Claims (20)

  1. An autonomous vehicle, characterized by comprising:
    a communication unit configured to receive order information, wherein the order information includes at least information about a user to be picked up and a pick-up location;
    a visual signal providing unit configured to provide one or more visual signals; and
    a control unit configured to:
    acquire the order information from the communication unit;
    determine whether the vehicle arrives at or is about to arrive at the pick-up location or not; and
    control the visual signal providing unit to provide a first visual signal that allows the vehicle to be identified by the user, after determining that the vehicle arrives at or is about to arrive at the pick-up location.
  2. The vehicle of claim 1, wherein
    the first visual signal comprises at least one of greeting information for the user, personal information of the user, a picture or a user name predefined by the user, information customized by the user, colored light and/or blinking light, and an image representing an avatar of the vehicle, and
    the visual signal providing unit comprises at least one of:
    at least one display apparatus configured to display one or more characters and/or one or more images on a window and/or an outer surface of the vehicle;
    at least one light-emitting apparatus configured to emit colored light and/or blinking light visible from outside the vehicle; and
    at least one adaptation mechanism configured to adapt an external appearance of the vehicle to provide a visual signal, wherein the external appearance includes at least color and/or shape.
  3. The vehicle of claim 2, wherein the at least one display apparatus comprises one or more of a flat display apparatus, a curved display apparatus, a flexible display apparatus, projection display apparatus, and a holographic display apparatus.
  4. The vehicle of any one of claims 1-3, further comprising:
    a detecting unit configured to detect a distance between the user and the vehicle,
    wherein the control unit is configured to control, in the case that the distance is less than a threshold, the visual signal providing unit to perform at least one of:
    providing, as the first visual signal, greeting information for the user on a window and/or an outer surface of the vehicle; and
    emitting, as the first visual signal, colored light and/or blinking light visible from outside the vehicle.
  5. The vehicle of any one of claims 1-4, wherein the control unit is further configured to:
    control the communication unit to send, after acquiring the order information, a signal indicating an avatar of the vehicle to an electronic device of the user; and
    control the visual signal providing unit to provide, as the first visual signal, an image representing the avatar of the vehicle and/or greeting information for the user on a window and/or an outer surface of the vehicle in response to a signal received from the electronic device of the user by the communication unit and/or detecting the user in a vicinity of the vehicle by the detecting unit.
  6. The vehicle of any one of claims 1-5, wherein the control unit is further configured to control the visual signal providing unit to display, after acquiring the order information, a picture or a user name predefined by the user, and/or information customized by the user on a window and/or an outer surface of the vehicle.
  7. The vehicle of any one of claims 1-6, wherein the control unit is further configured to control the communication unit to send an arrival notification to an electronic device of the user, in the case of determining that the vehicle arrives at or is about to arrive at the pick-up location.
  8. The vehicle of claim 7, wherein the arrival notification includes information about a position of the vehicle, and/or video and/or audio information about surroundings of the vehicle.
  9. The vehicle of any one of claims 1-8, wherein the control unit is further configured to:
    control the communication unit to send a signal that requires a response from the user to an electronic device of the user in the case of detecting the user in a vicinity of the vehicle by the detecting unit; and
    control the visual signal providing unit to provide a second visual signal after receiving a response signal from the electronic device of the user by the communication unit.
  10. A computer-implemented method for controlling an autonomous vehicle, characterized by comprising:
    acquiring order information, wherein the order information includes at least information about a user to be picked up and a pick-up location;
    determining whether the vehicle arrives at or is about to arrive at the pick-up location or not; and
    causing the vehicle to provide a first visual signal that allows the vehicle to be identified by the user, after determining that the vehicle arrives at or is about to arrive at the pick-up location.
  11. The method of claim 10, wherein the first visual signal comprises at least one of greeting information for the user, personal information of the user, a picture and/or a user name predefined by the user, information customized by the user, colored light and/or blinking light, and an image representing an avatar of the vehicle.
  12. The method of claims 10 or 11, wherein the step of causing the vehicle to provide the first visual signal comprises causing the vehicle to perform, in the case of detecting that a distance between the user and the vehicle is less than a threshold, at least one of:
    providing, as the first visual signal, greeting information for the user on a window and/or an outer surface of the vehicle; and
    emitting, as the first visual signal, colored light and/or blinking light visible from outside the vehicle.
  13. The method of any one of claims 10-12, further comprising
    causing the vehicle to send a signal indicating an avatar of the vehicle to an electronic device of the user after acquiring the order information,
    wherein the step of causing the vehicle to provide the first visual signal comprises causing the vehicle to provide, as the first visual signal, an image representing the avatar of the vehicle and/or greeting information for the user on a window and/or an outer surface of the vehicle in response to a signal from the electronic device of the user and/or detecting the user in a vicinity of the vehicle.
  14. The method of any one of claims 10-13, further comprising causing the vehicle to display, after acquiring the order information, a picture and/or a user name predefined by the user, and/or information customized by the user on a window and/or an outer surface of the vehicle.
  15. The method of any one of claims 10-14, wherein the first visual signal varies with a distance between the user and the vehicle.
  16. The method of any one of claims 10-15, further comprising causing the vehicle to send an arrival notification to an electronic device of the user, in the case of determining that the vehicle arrives at or is about to arrive at the pick-up location.
  17. The method of claim 16, wherein the arrival notification includes information about a position of the vehicle, and/or video and/or audio information about surroundings of the vehicle.
  18. The method of any one of claims 10-17, further comprising:
    causing the vehicle to send a signal that requires a response from the user to an electronic device of the user in the case of detecting the user in a vicinity of the vehicle; and
    causing the vehicle to provide a second visual signal and/or audible signal after receiving a response signal from the electronic device of the user.
  19. A system for controlling an autonomous vehicle, characterized by comprising:
    one or more processors; and
    one or more memories configured to store a series of computer executable instructions,
    wherein the series of computer executable instructions, when executed by the one or more processors, cause the one or more processors to perform the method of any one of claims 10-18.
  20. A non-transitory computer readable medium having instructions stored thereon that, when executed by one or more processors, cause the one or more processors to perform the method of any one of claims 10-18.
PCT/CN2017/081078 2017-04-19 2017-04-19 Autonomous vehicle and control method therefor WO2018191886A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201780088607.1A CN110431604B (en) 2017-04-19 2017-04-19 Autonomous vehicle and control method thereof
PCT/CN2017/081078 WO2018191886A1 (en) 2017-04-19 2017-04-19 Autonomous vehicle and control method therefor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/081078 WO2018191886A1 (en) 2017-04-19 2017-04-19 Autonomous vehicle and control method therefor

Publications (1)

Publication Number Publication Date
WO2018191886A1 true WO2018191886A1 (en) 2018-10-25

Family

ID=63855511

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/081078 WO2018191886A1 (en) 2017-04-19 2017-04-19 Autonomous vehicle and control method therefor

Country Status (2)

Country Link
CN (1) CN110431604B (en)
WO (1) WO2018191886A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103927789A (en) * 2014-04-29 2014-07-16 刘兴光 Unmanned taxi system
CN105046942A (en) * 2015-06-05 2015-11-11 卢泰霖 Internet-based unmanned electric automobile service system
US20160179094A1 (en) * 2014-12-17 2016-06-23 Bayerische Motoren Werke Aktiengesellschaft Communication Between a Vehicle and a Road User in the Surroundings of a Vehicle
CN105957377A (en) * 2016-05-03 2016-09-21 北京新能源汽车股份有限公司 Intelligent traffic control system and method based on unmanned electric automobiles
US20170080850A1 (en) * 2015-09-18 2017-03-23 Clearpath Robotics, Inc. Lighting control system and method for autonomous vehicles
CN106549981A (en) * 2017-01-13 2017-03-29 邹城众达知识产权咨询服务有限公司 A kind of directionless disk intelligent network about car system and its method for running based on big data cloud computing service

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3759896B2 (en) * 2001-10-31 2006-03-29 邦道 高田 Dispatch system
US7698033B2 (en) * 2006-04-12 2010-04-13 General Motors Llc Method for realizing a preferred in-vehicle chime
CN101826256A (en) * 2010-04-29 2010-09-08 联华电信股份有限公司 Vehicle dispatching method and vehicle dispatching system
CN102426780A (en) * 2011-11-16 2012-04-25 深圳欧奇网络技术有限公司 Paging system used for summoning taxis and paging method thereof
CN103000024A (en) * 2012-07-26 2013-03-27 苏州大通多宝软件技术有限公司 Taxi reservation calling method and taxi reservation calling system
CN203733303U (en) * 2014-01-07 2014-07-23 杭州路招网络科技有限公司 Taxi dynamic information reminding system
CN103745593A (en) * 2014-01-07 2014-04-23 杭州九树网络科技有限公司 Taxi dynamic information reminding system and method
KR101714514B1 (en) * 2014-11-24 2017-03-09 현대자동차주식회사 Car emergency system and method of emergency measures using the same
CN105818735A (en) * 2016-04-01 2016-08-03 蔡洪斌 Vehicle-mounted electronic display screen prompting method for indicating passenger to take reserved vehicle

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200183415A1 (en) * 2018-12-10 2020-06-11 GM Global Technology Operations LLC System and method for control of an autonomous vehicle
WO2021249752A1 (en) * 2020-06-10 2021-12-16 Daimler Ag Methods and systems for displaying visual content on a motor vehicle and method for providing a motor vehicle
US20230191910A1 (en) * 2020-06-10 2023-06-22 Mercedes-Benz Group AG Methods and systems for displaying visual content on a motor vehicle and method for providing a motor vehicle
JP7316469B1 (en) 2020-06-10 2023-07-27 メルセデス・ベンツ グループ アクチェンゲゼルシャフト Methods and systems for displaying visual content on automobiles and methods for providing automobiles
JP2023536211A (en) * 2020-06-10 2023-08-24 メルセデス・ベンツ グループ アクチェンゲゼルシャフト Methods and systems for displaying visual content on automobiles and methods for providing automobiles
US11780331B2 (en) * 2020-06-10 2023-10-10 Mercedes-Benz Group AG Methods and systems for displaying visual content on a motor vehicle and method for providing a motor vehicle

Also Published As

Publication number Publication date
CN110431604A (en) 2019-11-08
CN110431604B (en) 2022-06-21

Similar Documents

Publication Publication Date Title
EP3319063B1 (en) Method and apparatus for launching start-stop function
US10214145B2 (en) Vehicle control device and vehicle control method thereof
US10096249B2 (en) Method, apparatus and storage medium for providing collision alert
CN108680173B (en) Electronic device, control method of electronic device, and computer-readable recording medium
KR101502013B1 (en) Mobile terminal and method for providing location based service thereof
US20150163620A1 (en) Method and Apparatus For Social Telematics
JP5795278B2 (en) Navigation device, autonomous navigation support method, and autonomous navigation support program
CN109017554B (en) Driving reminding method and device and computer readable storage medium
CN111681455B (en) Control method of electronic device, and recording medium
CN110260877B (en) Driving related guidance providing method and apparatus, and computer readable recording medium
US20120064865A1 (en) Mobile terminal and control method thereof
US11790283B2 (en) Parking space lock and system and method for providing parking service
WO2017110526A1 (en) Mobile terminal and vehicle
US10609510B2 (en) Mobile electronic apparatus, mobile electronic apparatus control method, a non-transitory computer readable recording medium, for providing warnings to a user of the apparatus based on the location of the electronic apparatus
CN109649268B (en) Intelligent voice assistant system, device and method for vehicle
JP2016053880A (en) On-vehicle system, information processing method, and computer program
WO2018191886A1 (en) Autonomous vehicle and control method therefor
US11181386B2 (en) Navigation device, destination guiding system, and non-transitory recording medium
JP2024052899A (en) Communication device and communication method
WO2021192873A1 (en) Positioning system
KR20140128800A (en) An method for determinating a direction and an appratus using it
JP7166419B1 (en) Boarding intention estimating device, vehicle control system, boarding intention estimating program, and boarding intention estimating method
CN112572661B (en) Information terminal and storage medium
WO2024046353A2 (en) Presentation control method, device for in-vehicle glass of vehicle, and vehicle
KR101838820B1 (en) Smart door scuff

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17906099; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 17906099; Country of ref document: EP; Kind code of ref document: A1)