CN110431604B - Autonomous vehicle and control method thereof - Google Patents

Autonomous vehicle and control method thereof

Info

Publication number: CN110431604B
Application number: CN201780088607.1A
Authority: CN (China)
Prior art keywords: vehicle, user, visual signal, information, electronic device
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN110431604A
Inventors: 周碧云, M·塞德尔
Current assignee: Bayerische Motoren Werke AG (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Bayerische Motoren Werke AG
Application filed by Bayerische Motoren Werke AG
Publication of CN110431604A; application granted; publication of CN110431604B

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60Q — ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00 — Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26 — Devices primarily intended to indicate the vehicle, or parts thereof, or to give signals to other traffic
    • B60Q1/50 — Devices for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/503 — Using luminous text or symbol displays in or on the vehicle, e.g. static text
    • B60Q1/5035 — Using electronic displays
    • B60Q1/507 — Specific to autonomous vehicles
    • B60Q1/549 — For expressing greetings, gratitude or emotions

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • User Interface Of Digital Computer (AREA)
  • Lighting Device Outwards From Vehicle And Optical Signal (AREA)
  • Traffic Control Systems (AREA)

Abstract

An autonomous vehicle (100) and a control method thereof are disclosed. The autonomous vehicle (100) comprises: a communication unit (101) configured to receive order information, wherein the order information comprises at least information about a user to be picked up and a boarding place; a visual signal providing unit (103); and a control unit (102) configured to: acquire the order information from the communication unit (101), and, upon and/or after determining that the vehicle has arrived or is about to arrive at the boarding place, control the visual signal providing unit (103) to provide a first visual signal that allows the vehicle to be recognized by the user.

Description

Autonomous vehicle and control method thereof
Technical Field
The present disclosure relates generally to the field of autonomous vehicles, and in particular to a method, system, and non-transitory computer-readable medium for controlling an autonomous vehicle to provide passenger ride service.
Background
Current processes for passenger pickup services typically include: the driver receives and confirms an order, drives to the boarding location to pick up the passenger, calls the passenger when approaching the boarding location, and waits for the passenger to arrive.
If an autonomous vehicle (e.g., an unmanned vehicle) provides ODM (on-demand mobility) services, passengers must be picked up and transported without a driver.
Disclosure of Invention
One aspect of the present disclosure is generally directed to an autonomous vehicle and a method, system, and non-transitory computer-readable medium for controlling an autonomous vehicle to provide passenger ride service.
According to a first exemplary embodiment of the present disclosure, there is provided an autonomous vehicle comprising: a communication unit configured to receive order information, wherein the order information includes at least information about the user to be picked up and the boarding place; a visual signal providing unit configured to provide one or more visual signals; and a control unit configured to: acquire the order information from the communication unit, determine whether the vehicle has arrived or is about to arrive at the boarding place, and, after determining that the vehicle has arrived or is about to arrive at the boarding place, control the visual signal providing unit to provide a first visual signal that allows the vehicle to be recognized by the user.
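The control flow described in this first embodiment can be sketched as a small Python fragment. This is an illustrative sketch only, not the patent's implementation: the `OrderInfo` fields, the arrival predicate, and the signal callback are all hypothetical names standing in for the communication unit, control unit, and visual signal providing unit.

```python
from dataclasses import dataclass

# Hypothetical order structure; field names are illustrative, not from the patent.
@dataclass
class OrderInfo:
    user_id: str
    user_name: str
    pickup_location: tuple  # (latitude, longitude)

def on_order_received(order, vehicle_position, arrived_fn, show_signal_fn):
    """Minimal control-unit loop: once the vehicle arrives (or is about to
    arrive) at the boarding place, trigger the first visual signal so the
    user can recognize the vehicle."""
    if arrived_fn(vehicle_position, order.pickup_location):
        show_signal_fn(f"Hello, {order.user_name}!")  # e.g., greeting on a window display
        return True
    return False
```

In a real system `arrived_fn` would compare positions against a time or distance threshold, and `show_signal_fn` would drive the display or lighting hardware.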
In an example of this embodiment, the first visual signal may comprise at least one of: greeting information for the user, personal information of the user, a picture or username predefined by the user, information customized by the user, colored and/or flashing lights, and an image of an avatar representing the vehicle; and the visual signal providing unit may comprise at least one of: at least one display device configured to display one or more characters and/or one or more images on a window and/or an exterior surface of the vehicle; at least one light emitting device configured to emit colored and/or flashing light visible from outside the vehicle; and at least one adaptation mechanism configured to adapt an appearance of the vehicle to provide a visual signal, wherein the appearance comprises at least a color and/or a shape.
In another example of this embodiment, the at least one display device may include one or more of: a flat panel display device, a curved surface display device, a flexible display device, a projection display device, and a holographic display device.
In another example of this embodiment, the vehicle may further comprise a detection unit configured to detect a distance between the user and the vehicle, wherein the control unit may be configured to control the visual signal providing unit to perform at least one of the following if the distance is less than a threshold value: providing greeting information for the user as the first visual signal on a window and/or an exterior surface of the vehicle; and emitting colored and/or flashing light visible from outside the vehicle as the first visual signal.
In another example of this embodiment, the control unit may be further configured to: controlling the communication unit to transmit a signal indicating an avatar of the vehicle to the electronic device of the user after the order information is acquired; and in response to a signal received by the communication unit from the electronic device of the user and/or detection by the detection unit that the user is in the vicinity of the vehicle, control the visual signal providing unit to provide an image representing an avatar of the vehicle and/or greeting information for the user as said first visual signal on a window and/or an exterior surface of the vehicle.
In another example of this embodiment, the control unit may be further configured to control the visual signal providing unit to display a picture and/or a user name predefined by the user and/or information customized by the user on a window and/or an exterior surface of the vehicle after acquiring the order information.
In another example of the present embodiment, the control unit may be further configured to control the communication unit to transmit an arrival notification to the electronic device of the user in a case where it is determined that the vehicle arrives or is about to arrive at the boarding place.
In another example of this embodiment, the arrival notification may include information about the location of the vehicle, and/or video and/or audio information about the surroundings of the vehicle.
In another example of this embodiment, the control unit may be further configured to: in a case where the detection unit detects that the user is near the vehicle, controlling the communication unit to transmit a signal that requires a response from the user to the electronic device of the user; and after the communication unit receives the response signal from the electronic device of the user, control the visual signal providing unit to provide a second visual signal.
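The request-response exchange in the example above (send a signal requiring a response, then show a second visual signal once the response arrives) can be sketched as follows. The function and callback names are illustrative assumptions, not part of the patent.

```python
def handshake(send_to_user, wait_for_response, show_second_signal, timeout_s=30.0):
    """Sketch of the confirmation handshake: when the user is detected near
    the vehicle, send a message requiring a response to the user's electronic
    device; once a response arrives, provide the second visual signal.
    Message shape and timeout are example assumptions."""
    send_to_user({"type": "confirm_request"})
    reply = wait_for_response(timeout_s)  # None if no response within the timeout
    if reply is not None:
        show_second_signal()
        return True
    return False
```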
According to a second exemplary embodiment of the present disclosure, there is provided a computer-implemented method for controlling an autonomous vehicle, the method comprising: acquiring order information, wherein the order information includes at least information about a user to be picked up and a boarding place; determining whether the vehicle arrives or is about to arrive at the boarding place; and, after determining that the vehicle arrives or is about to arrive at the boarding place, causing the vehicle to provide a first visual signal that allows the vehicle to be recognized by the user.
In an example of this embodiment, the first visual signal may comprise at least one of: greeting information for the user, personal information of the user, a picture or username predefined by the user, information customized by the user, colored and/or flashing lights, and an image of an avatar representing the vehicle.
In another example of this embodiment, the step of causing the vehicle to provide the first visual signal may comprise: in the event that the detected distance between the user and the vehicle is less than a threshold, causing the vehicle to perform at least one of: providing greeting information for the user as the first visual signal on a window and/or an exterior surface of the vehicle; and emitting colored and/or flashing light visible from outside the vehicle as the first visual signal.
In another example of this embodiment, the method may further comprise, after obtaining the order information, causing the vehicle to transmit a signal indicative of an avatar of the vehicle to the electronic device of the user, wherein the step of causing the vehicle to provide the first visual signal may comprise causing the vehicle to provide an image representative of the avatar of the vehicle and/or greeting information for the user as the first visual signal on a window and/or an exterior surface of the vehicle in response to the signal from the electronic device of the user and/or detecting that the user is in the vicinity of the vehicle.
In another example of this embodiment, the method may further include displaying a picture or user name predefined by the user and/or information customized by the user on a window and/or exterior surface of the vehicle after obtaining the order information.
In another example of this embodiment, the first visual signal may vary with distance between the user and the vehicle.
In another example of this embodiment, the method may further include causing the vehicle to transmit an arrival notification to the electronic device of the user if it is determined that the vehicle arrives or is about to arrive at the boarding place.
In another example of this embodiment, the arrival notification may include information about the location of the vehicle, and/or video and/or audio information about the surroundings of the vehicle.
In another example of this embodiment, the method may further include: in the event that the user is detected to be in the vicinity of the vehicle, causing the vehicle to transmit a signal to the electronic device of the user requesting a response from the user; and causing the vehicle to provide a second visual signal and/or an audible signal after receiving the response signal from the user's electronic device.
According to a third exemplary embodiment of the present disclosure, there is provided a system for controlling an autonomous vehicle, the system comprising: one or more processors; and one or more memories configured to store a series of computer-executable instructions, wherein the series of computer-executable instructions, when executed by the one or more processors, cause the one or more processors to perform the steps of the above-described method.
According to a fourth exemplary embodiment of the present disclosure, a non-transitory computer-readable medium is provided having instructions stored thereon, which, when executed by one or more processors, cause the one or more processors to perform the steps of the above-described method.
Drawings
The above and other aspects and advantages of the present disclosure will become apparent from the following detailed description of exemplary embodiments, taken in conjunction with the accompanying drawings, which illustrate, by way of example, the principles of the disclosure. Note that the drawings are not necessarily drawn to scale.
Fig. 1 illustrates a block diagram of an autonomous vehicle according to an exemplary embodiment of the present disclosure.
Fig. 2 illustrates a flowchart showing a method of controlling an autonomous vehicle according to an exemplary embodiment of the present disclosure.
Fig. 3 illustrates a block diagram of an apparatus for controlling an autonomous vehicle according to an exemplary embodiment of the present disclosure.
Fig. 4 illustrates a possible visual effect of a vehicle according to an exemplary embodiment of the present disclosure.
Fig. 5 illustrates a possible visual effect of a vehicle according to another exemplary embodiment of the present disclosure.
Fig. 6A-6B illustrate possible visual effects of a vehicle according to another exemplary embodiment of the present disclosure.
Fig. 7A-7B illustrate possible visual effects of a vehicle according to another exemplary embodiment of the present disclosure.
FIG. 8 illustrates a general hardware environment in which the present disclosure is applicable, according to an exemplary embodiment of the present disclosure.
Detailed Description
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the described exemplary embodiments. It will be apparent, however, to one skilled in the art, that the described embodiments may be practiced without some or all of these specific details. In other exemplary embodiments, well-known structures or process steps have not been described in detail in order to avoid unnecessarily obscuring the concepts of the present disclosure.
The term "vehicle" is used throughout this specification to refer to land vehicles, water vehicles, underwater vehicles, aircraft, spacecraft, and the like. The term "and/or" is used throughout this specification to mean "a", "B" or "a and B".
Referring initially to fig. 1, a block diagram of an autonomous vehicle 100 is shown, according to an exemplary embodiment of the present disclosure. The vehicle 100 may include at least: a communication unit 101 that can communicate with an external device (not shown); a visual signal providing unit 103 that can provide at least one visual signal that allows the vehicle 100 to be recognized by a user; and a control unit 102 that can control the entire operation of the visual signal providing unit 103 or the vehicle 100.
The communication unit 101 may communicate with an external device, such as a server for providing an ODM (on-demand mobility) service, or an electronic device (e.g., a smartphone) of an ODM service user who intends to become a passenger of the vehicle, via a network (not shown). The network may include a Local Area Network (LAN), a Wide Area Network (WAN) (e.g., the Internet), a virtual network, a telecommunications network, and/or other interconnected paths through which multiple entities may communicate. In some embodiments, the network includes a network for sending and receiving data via, for example, Short Message Service (SMS), Multimedia Messaging Service (MMS), Hypertext Transfer Protocol (HTTP), direct data connection, WAP, email, and the like, or a cellular communication network. In other embodiments, the network may be a mobile data network, such as a CDMA, GPRS, TDMA, GSM, WiMAX, 3G, 4G, LTE, or VoLTE network, or any other mobile data network or combination of mobile data networks.
Further, the communication unit 101 may communicate with a platform (not shown) via the above-described network to communicate with the electronic device of the user. The platform may include at least one server and at least one application operating thereon. Both the communication unit 101 and the user's electronic device may be connected to the platform via a network, and thus the communication unit 101 may transmit and receive data to and from the user's electronic device.
In some embodiments, the visual signal providing unit 103 may include at least one of: one or more display devices, one or more light emitting devices, and one or more adapting mechanisms to provide at least one visual signal to a user.
The display device may be configured to display one or more characters and/or one or more images on one or more windows and/or one or more exterior surfaces of the vehicle. The characters may include letters, numbers, symbols, etc., and the images may include pictures, photographs, portraits, etc. Both characters and images may be displayed statically or dynamically by the display device. The one or more characters and/or the one or more images displayed by the display device may present at least one of the following as at least one visual signal presented to the user: greeting information for the user, personal information of the user, a picture or user name predefined by the user, information customized by the user, etc.
In some embodiments, the display device may include one or more of: flat panel display devices, curved-surface display devices, flexible display devices, projection display devices, holographic display devices, and the like. It will be apparent to those skilled in the art that the display device is not limited to those listed, but may be any type of display as long as it can display characters and/or images on the windows and/or exterior surfaces of a vehicle.
The one or more characters and/or the one or more images may be displayed on one or more windows of the vehicle. In some embodiments, the display device may be the window itself; that is, the window of the vehicle may be configured as a display screen. In other embodiments, the one or more characters and/or the one or more images may be projected (e.g., holographically projected) onto a window. One or more projectors provided in, on, or outside the vehicle may project light onto the window to form a display on or in the window. In these cases, the window of the vehicle may be configured as a projection screen. Whether the window is configured as a display screen or as a projection screen, the display may be formed on the inner and/or outer surface of the window, and/or in the interior of the window. Regardless of the portion of the window in which the display is formed, the display surface may face the interior and/or exterior of the vehicle. The window may be opaque, translucent, or transparent.
Additionally, the one or more characters and/or the one or more images may be displayed on one or more exterior surfaces of the vehicle. The exterior surface of the vehicle may be a door, frame, side-view mirror, windshield, wheel, fender, roof, trunk, tailgate, engine compartment, hood, or the like. In these cases, the exterior surface of the vehicle may be configured as a display screen or a projection screen, with features similar to those described above for the windows.
The light emitting device may be configured to emit light, preferably colored and/or flashing light, that is visible from outside the vehicle to provide at least one visual signal. The light emitted from the light emitting device may be more noticeable to the user in low-light conditions, such as in an indoor environment, in a tunnel, in an underground parking lot, etc., and/or at dusk, at night, etc. In particular, colored and/or flashing lights may more easily attract the attention of a user. The colored and/or flashing light can be white, monochromatic, or polychromatic. In addition, light emitted from the light emitting device may interact with the user and/or the user's electronic device. Further, colored and/or flashing lights may present a visual effect to the user that conveys a greeting or the like.
The adaptation mechanism may be configured to adapt an appearance of the vehicle to provide the at least one visual signal, wherein the appearance comprises at least a color and/or a shape. The adaptation means may be realized as a colour change material and/or a colour change surface of the vehicle, and/or a material changing its shape, and/or a surface of the vehicle changing its shape. Examples of this are surface coatings, switchable windows/films and shape memory polymers which can change their colour.
In some exemplary embodiments of the present disclosure, the vehicle 100 may further comprise a detection unit 104 that may detect a distance between the user and the vehicle 100 as shown in fig. 1, wherein the dashed line indicates that the component 104 is optional. It should be understood that the control unit 102 may also control the operation of the detection unit 104.
In some embodiments, the detection unit 104 may be configured to detect the distance between the user and the vehicle 100 by detecting the position of the user. For example, the detection unit 104 may acquire position data from a Global Positioning System (GPS) to detect the position of the user, the position data being provided by the user's electronic device having a GPS positioning function. It should be understood that the position data need not necessarily come from GPS; it may also come from a base-station positioning system or a WiFi positioning system. In this case, the detection unit 104 may need to establish a communication connection with the user's electronic device, or with the platform, via the network in order to acquire the position data. The detection unit 104 and the communication unit 101 may be integrated and implemented as one element, or they may be implemented as separate elements.
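Given two GPS fixes (the user's device and the vehicle), the distance the detection unit needs can be computed with the standard haversine formula. This is a common technique, shown here as an illustrative sketch; the patent does not prescribe a particular formula.

```python
import math

def gps_distance_m(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in metres between two GPS fixes,
    e.g. the position reported by the user's phone and the vehicle's own
    position. Uses the mean Earth radius."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

One degree of longitude at the equator comes out to roughly 111 km, which is a quick sanity check for the implementation.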
Further, the detection unit 104 may be configured to detect the distance between the user and the vehicle 100 by directly detecting the distance, rather than by detecting the position of the user. For example, a distance sensor may be used as the detection unit 104 to detect the distance between the user and the vehicle 100. The method of detecting the distance with the distance sensor is not particularly limited; for example, the distance sensor may detect the distance using infrared rays, ultrasonic waves, or the like. Alternatively or additionally, the detection unit 104 may detect the distance between the user and the vehicle 100 using wireless communication means, such as a Body Area Network (BAN), Bluetooth Low Energy (BLE) communication (e.g., iBeacon), Near Field Communication (NFC), WiFi communication, Zigbee communication, magnetic communication, electromagnetic communication (including RF, microwave, etc.), and other such communication means.
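For the BLE/iBeacon case, distance is commonly estimated from the received signal strength (RSSI) using the log-distance path-loss model. The patent does not specify this model; the sketch below is one conventional approach, and the default calibration values (about -59 dBm at 1 m, path-loss exponent near 2 in free space) are typical assumptions, not values from the source.

```python
def ble_distance_m(rssi_dbm, tx_power_dbm=-59.0, path_loss_exp=2.0):
    """Rough distance estimate (metres) from a BLE beacon RSSI reading
    using the log-distance path-loss model:
        distance = 10 ** ((tx_power - rssi) / (10 * n))
    tx_power_dbm is the calibrated RSSI at 1 m; path_loss_exp (n) is ~2
    in free space and higher indoors. Illustrative defaults only."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))
```

RSSI-based ranging is noisy in practice, so real deployments typically smooth readings over time before comparing against a proximity threshold.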
The control unit 102 receives data from various other components of the vehicle 100, such as the communication unit 101, the visual signal providing unit 103, and the detection unit 104, and transmits control commands to those components.
In fig. 1, the connecting lines between the various components represent bi-directional communication lines, which may be tangible wires or may be implemented wirelessly, such as via radio, RF, etc. The specific control operation performed by the control unit 102 will be described in detail later. The control unit 102 may be a processor, microprocessor, or the like. The control unit 102 may be provided on the vehicle 100, for example at a central console of the vehicle 100. Alternatively, the control unit 102 may be provided remotely, and may be accessed via various networks or the like.
It should be understood that each of the units of the vehicle 100 may be in communication with each other, and that each of the units of the vehicle 100 may be integrated into the vehicle 100 or mounted to the vehicle 100 and may also be external to the vehicle 100.
It should also be understood that the components of the vehicle 100 may be implemented in hardware, software, firmware, or any combination thereof to implement the principles of the present disclosure. It will be appreciated by those skilled in the art that the blocks described in fig. 1 may be combined or divided into sub-blocks to implement the principles of the disclosure as described above. Thus, the description herein may support any possible combination or division or further definition of the blocks described herein.
The features, types, numbers and locations of the communication unit 101, the control unit 102, the visual signal providing unit 103 and the detection unit 104 have been described in detail. As can be readily appreciated by those skilled in the art, however, the features, types, numbers and positions of the above components are not limited to the illustrated embodiments, and other features, types, numbers and positions may be used according to actual requirements.
The user's electronic device may be any type of wired or wireless device having communication capabilities. Exemplary electronic devices include wearable devices, cellular phones or other mobile communication devices, GPS devices, tablets or other personal computing devices, Personal Digital Assistants (PDAs), MP3 players or other personal music playing devices, cameras, video cameras, and the like.
Next, the operation of the vehicle 100 will be described in detail.
In an exemplary embodiment of the present disclosure, the communication unit 101 may receive order information, wherein the order information includes at least information about the user to be picked up and the boarding place. The order may be placed by the user or by someone else. The communication unit 101 may receive the order information from the user or from the platform via the network. The order information received by the communication unit 101 may be transmitted to the control unit 102 via wire(s) or wirelessly. The control unit 102 may acquire the order information from the communication unit 101, may automatically confirm the order, and may cause the vehicle to travel to the boarding place. After determining that the vehicle arrives or is about to arrive at the boarding place, the control unit 102 may control the visual signal providing unit 103 to provide the first visual signal that allows the vehicle 100 to be recognized by the user.
The control unit 102 may use the position of the vehicle 100 to determine whether the vehicle 100 arrives or is about to arrive at the boarding place. The position of the vehicle 100 may be detected by a position detection unit, which may be the same as or different from the detection unit 104; the two may be implemented as one element or as separate elements.
The control unit 102 may determine how much travel time remains before arriving at the boarding place. If the determined remaining time is within a predetermined time range (e.g., 5 minutes, 10 minutes, etc.), the control unit 102 may determine that the vehicle 100 is about to arrive at the boarding place. Similarly, the control unit 102 may determine the distance between the vehicle 100 and the boarding place. If the determined distance is within a predetermined distance range (e.g., 1 kilometer, 5 kilometers, etc.), the control unit 102 may determine that the vehicle 100 is about to arrive at the boarding place. That is, the term "about to arrive" herein generally means that the vehicle is less than a predetermined travel time from the boarding place (e.g., less than 5 or 10 minutes) or less than a predetermined distance from it (e.g., less than 1, 2, 3, or 5 kilometers).
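The "about to arrive" test described above reduces to two threshold comparisons, one on remaining travel time and one on remaining distance. A minimal sketch, with example threshold values taken from the ranges mentioned in the text:

```python
def is_about_to_arrive(remaining_time_s, remaining_distance_m,
                       time_limit_s=300, distance_limit_m=1000):
    """'About to arrive' as described in the text: remaining travel time
    within a predetermined range (e.g., 5 minutes = 300 s) OR remaining
    distance within a predetermined range (e.g., 1 km = 1000 m).
    The defaults are example values, not fixed by the patent."""
    return remaining_time_s <= time_limit_s or remaining_distance_m <= distance_limit_m
```

The thresholds could themselves be derived from the user's own distance to the boarding place or from user preferences, as the next paragraph notes.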
The predetermined time range or the predetermined distance range is not particularly limited. For example, they may be predetermined based on the time it takes the user to reach the pick-up location, the location of the user (or the distance between the user and the pick-up location), and/or user preferences.
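The arrival determination described above can be sketched as follows. This is a non-limiting illustration only: the function and field names, and the concrete threshold values, are assumptions, since the disclosure gives only example ranges (5-10 minutes, 1-5 kilometers).

```python
from dataclasses import dataclass

# Assumed example thresholds; the disclosure cites 5-10 minutes and
# 1-5 kilometers only as possible predetermined ranges.
TIME_THRESHOLD_S = 5 * 60          # "about to arrive": < 5 minutes away
DISTANCE_THRESHOLD_M = 1_000.0     # or: < 1 kilometer away

@dataclass
class TripState:
    remaining_time_s: float        # estimated travel time to the pick-up location
    remaining_distance_m: float    # distance to the pick-up location

def arrives_or_about_to_arrive(state: TripState) -> bool:
    """True when either the remaining time or the remaining distance
    falls within its predetermined range, as described in the text."""
    return (state.remaining_time_s <= TIME_THRESHOLD_S
            or state.remaining_distance_m <= DISTANCE_THRESHOLD_M)

# 4 minutes out but 3 km away: the time criterion triggers the signal.
print(arrives_or_about_to_arrive(TripState(240, 3_000)))   # True
# 15 minutes and 4 km away: neither criterion is met yet.
print(arrives_or_about_to_arrive(TripState(900, 4_000)))   # False
```

Either criterion alone suffices, matching the "or" in the text; an implementation could also combine both or weight them by user preference.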
The control unit 102 may control the visual signal providing unit 103 to provide the first visual signal before the vehicle 100 arrives at the pick-up location (e.g., while the vehicle 100 is driving to the pick-up location), when the vehicle 100 arrives at the pick-up location (e.g., upon arrival), and/or after the vehicle 100 has arrived at the pick-up location (e.g., after waiting for the user at the pick-up location for a while).
Because the first visual signal allows the vehicle 100 to be identified by the user, the user can recognize which vehicle is the one ordered to pick him/her up.
In an exemplary embodiment of the present disclosure, in case the distance between the user and the vehicle 100 detected by the detection unit 104 is less than a threshold, the control unit 102 may control the visual signal providing unit 103 to perform at least one of: providing greeting information for the user as the first visual signal on one or more windows and/or one or more exterior surfaces of the vehicle 100; and emitting colored and/or flashing light visible from outside the vehicle 100 as the first visual signal.
The control unit 102 may control the visual signal providing unit 103 to provide greeting information for the user on one or more windows and/or one or more exterior surfaces of the vehicle 100. In one embodiment, the visual signal providing unit 103 may include one or more display devices, and the greeting information may be presented as one or more characters and/or one or more images on one or more windows and/or one or more exterior surfaces of the vehicle 100. Greeting information may include personal information of the user (e.g., name information, title information, gender information, etc.), a picture or username predefined by the user, information customized by the user (e.g., content, style, color of display customized by the user), an image representing the avatar of vehicle 100 (e.g., the image representing the avatar of vehicle 100 may say some greeting), and so forth.
A possible visual effect of a vehicle according to an exemplary embodiment of the present disclosure is shown in fig. 4. A greeting message 401 for the user, e.g., "Ms. A, welcome to Shanghai", is provided on the window(s) 402 of the vehicle 100 by the visual signal providing unit 103. It will be apparent that, although not shown in this figure, the greeting information may also be provided on an exterior surface of the vehicle 100.
In this case, when the user approaches the vehicle 100, the user can easily recognize the vehicle 100 with the help of the greeting information. In particular, greeting information may enhance the user experience.
In other embodiments, in addition to providing a visual signal, a welcome voice or a notification sound or other audio may be played to assist the user in finding the vehicle or greeting the user, or the like.
Because the greeting information may include personal information of the user, to protect the privacy of the user, it may be preferable to provide the greeting information if the distance between the user and the vehicle 100 is less than a threshold (e.g., the user is close to the vehicle 100). The threshold value may be preset to any desired value, such as 5 meters, 10 meters, etc., depending on the actual application or experience.
In some implementations, the control unit 102 may control the visual signal providing unit 103 to emit colored light and/or flashing light visible from outside the vehicle 100 as the first visual signal. In one embodiment, the visual signal providing unit 103 may comprise one or more light emitting devices. A possible visual effect of a vehicle according to another exemplary embodiment of the present disclosure is shown in fig. 5. In this case, the visual signal providing unit 103 may include light bars 501, 502, and 503. Colored and/or flashing light visible from outside the vehicle 100 is emitted by the light bars 501, 502, and 503. Although the light is shown in the shape of a strip around the top and rear lights of the vehicle 100, it is understood that the light emitting device may be positioned anywhere, and the light may be of any color and any shape. The user may predefine the illuminance, color, shape, and/or blinking frequency of the light, etc.
When the first visual signal is realized as light emitted from the light emitting device, it may be more noticeable to the user in low-light conditions. In particular, colored and/or flashing light may more easily attract the attention of the user. In this case, when the user approaches the vehicle 100, the user can easily identify the vehicle 100 with the aid of the colored light and/or flashing light. Further, the colored light and/or flashing light may present a visual effect that greets the user as the user approaches the vehicle 100, so as to enhance the user's experience. Because the colored light and/or flashing light should be visible to the user, it is preferable that the light be emitted in the event that the distance between the user and the vehicle 100 is less than a threshold value (e.g., the user is in the vicinity of the vehicle 100, the vehicle 100 is within the visual range of the user, etc.).
In some embodiments, the vehicle of fig. 5 may additionally display, on its side window, a picture 504 predefined by the user to help the user more easily identify the vehicle. The picture 504 may be, for example, a picture used by the user in a social network, a picture defined by the user on the platform, and so forth.
Although the above visual effects of providing greeting information and emitting light are shown in fig. 4 and 5, respectively, it is to be understood that the above visual effects may be presented in combination with each other. For example, although greeting information for the user is provided by the visual signal providing unit 103 (e.g., at least one display device), colored light and/or flashing light may also be emitted by the visual signal providing unit 103 (e.g., at least one light emitting device).
In another exemplary embodiment of the present disclosure, after acquiring the order information from the communication unit 101, the control unit 102 may control the communication unit 101 to transmit a signal indicating an avatar of the vehicle 100 to the electronic device of the user. The avatar of the vehicle 100 may be a picture of a person, animal, or cartoon representing the vehicle 100. Preferably, the avatar of the vehicle 100 may be a virtual representation of the appearance of its virtual driver.
The user's electronic device may display an image representing the avatar of the vehicle 100 based on the received signal. A possible visual effect of the user's electronic device displaying an image representing the avatar of the vehicle 100 is illustrated in FIG. 6A. In this case, the user's electronic device may be a smart watch 603. The smart watch receives a signal indicative of the avatar of the vehicle 100 and holographically displays an image 601 representing the avatar. The avatar of the vehicle 100 is a virtual representation of the appearance of a person, specifically a driver. The displayed image 601 representing the avatar may "speak" like a real person, e.g., as indicated by reference numeral 602 in FIG. 6A, saying "I will pick you up in 5 minutes". The utterance may be provided visually, acoustically, or both visually and acoustically.
When the user approaches or is in the vicinity of the vehicle 100, the user's electronic device may transmit a signal to which the vehicle 100 is expected to respond. In response to such a signal received by the communication unit 101 from the electronic device of the user, the control unit 102 may control the visual signal providing unit 103 to provide an image representing the avatar of the vehicle 100 as the first visual signal on one or more windows and/or one or more exterior surfaces of the vehicle 100.
A possible visual effect of the vehicle 100 providing an image representative of the avatar of the vehicle 100 is illustrated in fig. 6B. In this case, an image 604 representing the avatar of the vehicle 100 is provided (e.g., holographically) on one or more windows of the vehicle 100. The avatar of the vehicle 100 is a virtual representation of the appearance of a person, specifically a driver. The virtual representation of the person may appear in a window and greet the user. The displayed image 604 representing the avatar may "speak" like a real person, for example, as labeled by reference numeral 605 in FIG. 6B, saying "Hello, LUKO!". The utterance may be provided visually, acoustically, or both visually and acoustically. Such utterances from the displayed virtual representation of the person may further greet the user in order to enhance the user's experience.
In this embodiment, the displayed image representing the avatar of the vehicle 100 conveys to the user that the virtual driver recognizes and greets him/her, so that the user may feel reassured and the degree of trust in the vehicle 100 may be increased. In addition, because both the image provided on the window and/or exterior surface of the vehicle 100 and the image displayed on the user's electronic device represent the avatar of the vehicle 100, that is, the two images are associated with each other (and may even be the same), the user may more easily identify the vehicle 100 with the help of the displayed images.
Although the image 604 representing the avatar of the vehicle 100 shown in FIG. 6B is on a window of the vehicle 100, it may be provided on one or more exterior surfaces of the vehicle 100. Alternatively or additionally, in such a case, greeting information for the user may also be provided on one or more windows and/or one or more exterior surfaces of the vehicle 100.
Alternatively or additionally, the detection unit 104 may detect a distance between the user and the vehicle 100. In response to detecting that the user is near the vehicle 100, the control unit 102 may control the visual signal providing unit 103 to provide an image representing an avatar of the vehicle 100 as the first visual signal on a window and/or an outer surface of the vehicle. Alternatively or additionally, in such a case, greeting information for the user may also be provided on one or more windows and/or one or more exterior surfaces of the vehicle 100.
In another exemplary embodiment of the present disclosure, after acquiring the order information, the control unit 102 may control the visual signal providing unit 103 to display a picture or a user name predefined by the user, and/or information customized by the user, on a window and/or an exterior surface of the vehicle 100. The picture or user name predefined by the user may be a user name used by the user in a social network, a picture or user name defined by the user on the platform, or the like. The information customized by the user may be any type of characters and/or images. In the event that a picture or user name predefined by the user and/or information customized by the user is displayed on a window and/or exterior surface of the vehicle 100, the user may post something he/she desires in order to self-express and/or participate in social interaction. In addition, because the displayed content is predefined or customized by the user, the user is so familiar with the displayed content, or the displayed content is so appealing, that the user can more easily identify the vehicle 100 (perhaps at first glance).
Possible visual effects of the vehicle 100 displaying a picture or user name predefined by the user and/or information customized by the user are illustrated in fig. 7A-7B. In these cases, a picture 701 predefined by the user or a photograph 702 customized by the user is displayed on the side window of the vehicle 100. It should be understood that they may additionally or alternatively be displayed on an exterior surface of the vehicle 100. As shown in fig. 7A-7B, the vehicle 100 enables the user to post something he/she desires in order to self-express and/or participate in social interaction. For example, the vehicle 100 enables the user to post, as the picture 701 or the photograph 702, a game video he/she has recently played, a poster of a movie he/she likes, a photograph of the idol whose concert he/she will attend this evening, and so on.
In this embodiment, the displayed content may be presented after the control unit 102 acquires the order information. That is, the above content need not wait to be displayed until the vehicle 100 arrives or is about to arrive at the pick-up location. For example, the vehicle 100 may display the picture or user name predefined by the user and/or the information customized by the user during its autonomous travel, so that the vehicle 100, carrying such a strong visual indicator, presents a visible identity in traffic, making it more convenient for the user to self-express and participate in social interaction. Alternatively or additionally, after picking up the user, and while driving for the user, the vehicle 100 may continue to display the above content so that the user can self-express and participate in social interaction.
In another exemplary embodiment of the present disclosure, when it is determined that the vehicle arrives or is about to arrive at the pick-up location, the control unit 102 may control the communication unit 101 to transmit an arrival notification to the electronic device of the user. The method by which the control unit 102 determines whether the vehicle 100 arrives at or is about to arrive at the pick-up location is the same as the method described above. The arrival notification transmitted by the communication unit 101 enables the user to know that the vehicle 100 has arrived or is about to arrive.
In particular, the arrival notification includes information about the location of the vehicle 100, which may indicate a particular location of the vehicle 100. The position of the vehicle 100 may be detected by the detection unit 104 as described above.
Alternatively or additionally, video and/or audio information about the surroundings of the vehicle 100 may be sent to the user's electronic device by being included in the arrival notification. The video information about the surroundings of the vehicle 100 may be pictures or video captured by a camera. For example, video information showing that the vehicle 100 is stopped next to a convenience store or near several parking spaces, etc., may be displayed to the user, so that the user will find the vehicle 100 more easily with the help of the video information. Alternatively or additionally, audio information may provide similar assistance. The audio information about the surroundings of the vehicle 100 may be sound captured by a microphone or the audio track of video captured by a camera. The audio information may assist the user in determining the particular location of the vehicle 100, and may be especially helpful for users who are visually impaired. For example, if the audio information contains significant echo(es), the user may determine that the vehicle 100 is likely in an indoor space. With the help of video and/or audio information about the surroundings of the vehicle 100, the user can find the vehicle 100 more easily.
In another exemplary embodiment of the present disclosure, the first visual signal may vary with the distance between the user and the vehicle. In one embodiment, when the distance detected by the detection unit 104 is less than a certain value, the control unit 102 may control the first visual signal to change so that the user may more easily recognize the vehicle 100. For example, the user notices that, when he/she moves from location A to location B, the visual signal changes from a first format to a second format. Then, possibly after moving back and forth a few times, he/she can determine that the vehicle providing the visual signal is the vehicle he/she has ordered for pick-up.
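The distance-dependent signal change can be sketched as follows. The format names and the 20-meter switch-over value are assumptions for illustration; the text only says the signal changes once the detected distance drops below a certain value.

```python
def first_visual_signal_format(distance_m: float,
                               switch_distance_m: float = 20.0) -> str:
    """Choose the format of the first visual signal from the
    user-vehicle distance; names and threshold are illustrative."""
    return "second_format" if distance_m < switch_distance_m else "first_format"

# Walking from location A (30 m away) to location B (10 m away),
# the user sees the signal change format, which identifies the vehicle.
print(first_visual_signal_format(30.0))   # first_format
print(first_visual_signal_format(10.0))   # second_format
```

A real system might use several distance bands rather than one switch-over point, but the single threshold is enough to reproduce the A-to-B change the user observes.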
In another exemplary embodiment of the present disclosure, in case the detection unit 104 detects that the user is in the vicinity of the vehicle 100, the control unit 102 may control the communication unit 101 to transmit, to the electronic device of the user, a signal requiring a response from the user. When the detection unit 104 detects that the user is near the vehicle 100, the vehicle 100 may actively transmit a signal to the user's electronic device. The user may notice the signal through a vibration of the user's electronic device and thereby know that he/she is in the vicinity of the vehicle 100. To find the vehicle 100 more easily, the user may respond to the signal by sending a response signal to the vehicle 100. After receiving the response signal from the electronic device of the user, the control unit 102 may control the visual signal providing unit 103 to provide a second visual signal. In one embodiment, the second visual signal is different from the first visual signal. The user may thus more easily identify the vehicle based on the interaction between the vehicle 100 and the user's electronic device.
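The request/response exchange above can be sketched as a small state machine. Class, method, and signal names are illustrative assumptions, not part of the disclosure.

```python
class SignalController:
    """Sketch of the vehicle side of the exchange: ask the nearby user's
    device for a response, then switch to a second visual signal."""

    def __init__(self, send_to_user_device):
        self._send = send_to_user_device        # e.g. via communication unit 101
        self.displayed_signal = "first_visual_signal"

    def on_user_detected_nearby(self) -> None:
        # Vehicle actively asks the user's device for a response;
        # the device may vibrate to draw the user's attention.
        self._send("response_required")

    def on_response_from_user(self) -> None:
        # A second, different visual signal confirms the interaction.
        self.displayed_signal = "second_visual_signal"

sent = []
ctrl = SignalController(sent.append)
ctrl.on_user_detected_nearby()
ctrl.on_response_from_user()
print(sent, ctrl.displayed_signal)   # ['response_required'] second_visual_signal
```

Injecting the transport function (`send_to_user_device`) keeps the sketch independent of any particular wireless link, mirroring how the text leaves the communication medium open.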
Referring back to fig. 2, fig. 2 illustrates a flow chart showing a method 200 of controlling an autonomous vehicle according to an exemplary embodiment of the present disclosure. The method 200 may be performed by, for example, the control unit 102 described above in fig. 1 or other devices. The steps of method 200 presented below are intended to be illustrative. In some embodiments, the method may be implemented with one or more additional steps not described, and/or without one or more of the steps discussed. Additionally, the order in which the steps of the method are illustrated in FIG. 2 and described below is not intended to be limiting. In some embodiments, the methods may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information). The one or more processing devices may include one or more modules that perform some or all of the steps of the methods in response to instructions stored electronically on an electronic storage medium. The one or more processing modules may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for carrying out one or more of the steps of the methods.
As shown in fig. 2, at step 210, order information may be obtained. The order information contains at least information about the user to be picked up and the pick-up location.
At step 220, it is determined whether the vehicle arrives or is about to arrive at the pick-up location.
At step 230, after determining that the vehicle is at or about to reach the pick-up location, the vehicle may be caused to provide a first visual signal that allows the vehicle to be recognized by the user. In one embodiment, the visual signal providing unit 103 of the vehicle may be controlled by the control unit 102 to provide the first visual signal.
In some embodiments, the first visual signal may comprise at least one of: greeting information for the user, personal information of the user, a picture or username predefined by the user, information customized by the user, colored and/or flashing lights, and an image of an avatar representing the vehicle.
In another embodiment, the first visual signal may vary with distance between the user and the vehicle.
In other embodiments, in the event that the distance between the user and the vehicle is detected to be less than a threshold, the vehicle may be caused to perform at least one of: providing greeting information for the user as the first visual signal on a window and/or an exterior surface of the vehicle; and emitting colored and/or flashing light visible from outside the vehicle as the first visual signal.
In other embodiments, in response to a signal from the user's electronic device and/or detecting that the user is near the vehicle, the vehicle is caused to provide an image representing an avatar of the vehicle and/or greeting information for the user as the first visual signal on a window and/or an exterior surface of the vehicle.
In some implementations, the method 200 can further include, after obtaining the order information, causing the vehicle to transmit a signal indicative of an avatar of the vehicle to the electronic device of the user.
In another implementation, the method 200 may further include displaying a picture or username predefined by the user and/or information customized by the user on a window and/or exterior surface of the vehicle after obtaining the order information.
In other implementations, the method 200 can further include, in an instance in which it is determined that the vehicle arrives or is about to arrive at the pick-up location, causing the vehicle to send an arrival notification to the electronic device of the user. In some examples, the arrival notification includes information about the location of the vehicle, and/or video and/or audio information about the surroundings of the vehicle.
In other implementations, the method 200 may further include: in the event that the user is detected to be in the vicinity of the vehicle, causing the vehicle to transmit a signal to the electronic device of the user requesting a response from the user; and causing the vehicle to provide a second visual signal and/or an audible signal after receiving the response signal from the electronic device of the user.
Fig. 3 illustrates a block diagram of a device (e.g., the control unit 102 as shown in fig. 1) for controlling an autonomous vehicle according to an exemplary embodiment of the present disclosure. The blocks of the device 300 may be implemented in hardware, software, firmware, or any combination thereof to implement the principles of the present disclosure. Those skilled in the art will appreciate that the blocks described in fig. 3 may be combined or divided into sub-blocks to implement the principles of the present disclosure as described above. Thus, the description herein may support any possible combination or division or further definition of the blocks described herein.
Referring to fig. 3, an apparatus 300 for controlling an autonomous vehicle may include: an acquisition unit 301 for acquiring order information, wherein the order information includes at least information about a user to be picked up and a pick-up location; a determination unit 302 for determining whether the vehicle arrives or is about to arrive at the pick-up location; and a vehicle control unit 303 for causing the vehicle to provide a first visual signal allowing the vehicle to be recognized by the user after determining that the vehicle arrives or is about to arrive at the pick-up location.
In an example of this embodiment, the first visual signal may comprise at least one of: greeting information for the user, personal information of the user, a picture or user name predefined by the user, information customized by the user, colored and/or flashing lights, and an image of an avatar representing a vehicle.
Note that various units in the device 300 may be configured to perform various operations as discussed above in the method 200 of fig. 2, and therefore their details are omitted here.
FIG. 8 illustrates a general hardware environment in which the present disclosure is applicable, according to an exemplary embodiment of the present disclosure.
Referring to fig. 8, a hardware environment 800 will now be described, the hardware environment 800 being an example of a hardware apparatus that may be applied to aspects of the present disclosure. Hardware environment 800 may be any machine configured to perform processing and/or computing, and may be, but is not limited to, a workstation, a server, a desktop computer, a laptop computer, a tablet computer, a personal digital assistant, a smart phone, an on-vehicle computer, or any combination thereof. The aforementioned control unit 102 or apparatus 300 for controlling an autonomous vehicle may be implemented, wholly or at least in part, by the hardware environment 800 or a similar device or system.
Hardware environment 800 may include elements connected to or in communication with (possibly via one or more interfaces) a bus 802. For example, hardware environment 800 may include a bus 802, one or more processors 804, one or more input devices 806, and one or more output devices 808. The one or more processors 804 may be any kind of processor and may include, but are not limited to, one or more general purpose processors and/or one or more special purpose processors (such as special purpose processing chips). Input device 806 may be any kind of device that can input information to a computing device and may include, but is not limited to, a mouse, a keyboard, a touch screen, a microphone, and/or a remote control. Output device 808 may be any kind of device that can present information and may include, but is not limited to, a display, speakers, a video/audio output terminal, a vibrator, and/or a printer. Hardware environment 800 may also include or be connected with non-transitory storage 810, which may be any storage device that is non-transitory and that may enable data storage, and may include, but is not limited to, a hard disk drive, optical storage, solid state storage, a floppy disk, a flexible disk, magnetic tape or any other magnetic medium, a CD or any other optical medium, ROM (read only memory), RAM (random access memory), cache memory and/or any other memory chip or cartridge, and/or any other medium from which a computer can read data, instructions and/or code. The non-transitory storage 810 may be removable from an interface. The non-transitory storage device 810 may have data/instructions/code for implementing the methods and steps described above. Hardware environment 800 may also include a communication device 812.
The communication device 812 may be any kind of device or system capable of communicating with external devices and/or with a network, and may include, but is not limited to, a modem, a network card, an infrared communication device, a wireless communication device, and/or a chipset, such as Bluetooth™ devices, 802.11 devices, WiFi devices, WiMax devices, cellular communication facilities, and the like.
When hardware environment 800 is used as an on-vehicle device, it may also be connected to external devices, such as a GPS receiver, sensors for sensing different environmental data (such as acceleration sensors, wheel speed sensors, gyroscopes, etc.). As such, hardware environment 800 may, for example, receive location data and sensor data indicative of travel conditions of a vehicle. When hardware environment 800 is used as an on-board device, it may also be connected to other facilities for controlling the travel and operation of a vehicle (such as an engine system, wipers, anti-lock braking system, etc.).
Additionally, the non-transitory storage 810 may have map information and software elements such that the processor 804 may perform route guidance processing. Additionally, the output device 808 may include a display for displaying a map, a location marker of the vehicle, an image indicating the travel of the vehicle, and a visual signal. The output device 808 may also include a speaker for audio output.
The bus 802 may include, but is not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus. Specifically, for devices on a vehicle, the bus 802 may also include a Controller Area Network (CAN) bus or other architecture designed for application on an automobile.
Hardware environment 800 may also include a working memory 814, which may be any kind of working memory that can store instructions and/or data useful for the operation of processor 804, and which may include, but is not limited to, random access memory and/or read only memory devices.
Software elements may be disposed in the working memory 814, including, but not limited to, an operating system 816, one or more application programs 818, drivers, and/or other data and code. Instructions for performing the above-described methods and steps may be included in the one or more applications 818, and the units of the aforementioned control unit 102 or device 300 may be implemented by the processor 804 reading and executing the instructions of the one or more applications 818. More specifically, the aforementioned apparatus 300 or control unit 102 may be implemented, for example, by the processor 804 executing an application 818 having instructions for performing the steps of the method 200. Additionally, the aforementioned vehicle control unit 303 of the device 300 may be implemented, for example, by the processor 804 executing the application 818 having instructions for performing step 230 of the method 200. Other units of the aforementioned apparatus 300 may also be implemented, for example, by the processor 804 executing the application 818 having instructions for performing one or more of the aforementioned steps. Executable code or source code for the instructions of the software elements may be stored in a non-transitory computer-readable storage medium, such as the storage device(s) 810 described above, and may be read into the working memory 814, possibly after compilation and/or installation. Executable code or source code for the instructions of the software elements may also be downloaded from a remote location.
It will be apparent to those skilled in the art from the foregoing embodiments that the present disclosure may be implemented in software in combination with necessary hardware, or in hardware, firmware, etc. Based on such an understanding, embodiments of the present disclosure may be implemented, at least in part, in software. The computer software may be stored in a readable storage medium, such as a floppy disk, hard disk, optical disk, or flash memory of the computer. The computer software includes a series of instructions that cause a computer (e.g., a personal computer, a service station, or a network terminal) to perform a method or a portion thereof according to a respective embodiment of the present disclosure.
The disclosure being thus described, it will be obvious that the same may be varied in many ways. Such variations are not to be regarded as a departure from the spirit and scope of the disclosure, and all such modifications as would be obvious to one skilled in the art are intended to be included within the scope of the following claims.

Claims (18)

1. An autonomous vehicle, characterized in that the autonomous vehicle comprises:
a communication unit configured to receive order information, wherein the order information includes at least information about a user to be picked up and a pick-up location;
a visual signal providing unit configured to provide one or more visual signals; and
a control unit configured to:
acquiring order information from the communication unit;
determining whether the vehicle arrives or is about to arrive at a pick-up location; and
after determining that the vehicle arrives at or is about to arrive at the pick-up location, controlling the visual signal providing unit to provide a first visual signal that allows the vehicle to be recognized by a user,
the control unit is further configured to:
in a case where the detection unit detects that the user is in the vicinity of the vehicle, controlling the communication unit to transmit, to the electronic device of the user, a signal requiring a response from the user; and
controlling the visual signal providing unit to provide a second visual signal after the communication unit receives a response signal from the electronic device of the user.
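Purely as an illustrative sketch (not the patented implementation), the two-stage signaling logic recited in claim 1 can be expressed as a small state machine. All names here (`PickupController`, `Stage`, the signal strings) are hypothetical, chosen only to make the claimed sequence of events concrete:

```python
from enum import Enum, auto


class Stage(Enum):
    EN_ROUTE = auto()            # driving toward the pickup location
    FIRST_SIGNAL = auto()        # vehicle made recognizable to the user
    AWAITING_RESPONSE = auto()   # response request sent to the user's device
    SECOND_SIGNAL = auto()       # response received, second signal shown


class PickupController:
    """Hypothetical sketch of the control unit of claim 1."""

    def __init__(self, order):
        self.order = order       # at least: user identity + pickup location
        self.stage = Stage.EN_ROUTE
        self.signals = []        # visual signals issued so far

    def on_position_update(self, arrived_or_imminent: bool):
        # On (imminent) arrival, provide the first visual signal.
        if self.stage is Stage.EN_ROUTE and arrived_or_imminent:
            self.signals.append("first")   # e.g. greeting text on a window
            self.stage = Stage.FIRST_SIGNAL

    def on_user_detected_nearby(self, send_to_device):
        # User detected near the vehicle: request a confirmation response.
        if self.stage is Stage.FIRST_SIGNAL:
            send_to_device("please-confirm")
            self.stage = Stage.AWAITING_RESPONSE

    def on_response_received(self):
        # Response signal received: provide the second visual signal.
        if self.stage is Stage.AWAITING_RESPONSE:
            self.signals.append("second")
            self.stage = Stage.SECOND_SIGNAL
```

The point of the two stages is that the second signal is gated on an explicit confirmation from the user's device, so a bystander near the vehicle cannot trigger it.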
2. The vehicle of claim 1, wherein
the first visual signal comprises at least one of: greeting information for the user, personal information of the user, a picture or username predefined by the user, information customized by the user, colored and/or flashing lights, and an image representing an avatar of the vehicle, and
the visual signal providing unit includes at least one of:
at least one display device configured to display one or more characters and/or one or more images on a window and/or an exterior surface of the vehicle;
at least one light emitting device configured to emit colored and/or flashing light visible from outside the vehicle; and
at least one adaptation mechanism configured to adapt an appearance of the vehicle to provide a visual signal, wherein the appearance comprises at least a color and/or a shape.
3. The vehicle of claim 2, wherein the at least one display device comprises one or more of: a flat panel display device, a curved surface display device, a flexible display device, a projection display device, and a holographic display device.
4. The vehicle according to any one of claims 1-3, wherein the detection unit is configured to detect a distance between the user and the vehicle,
wherein the control unit is configured to control the visual signal providing unit to perform at least one of the following if the distance is less than a threshold value:
providing greeting information for the user as the first visual signal on a window and/or an exterior surface of the vehicle; and
emitting colored and/or flashing light visible from outside the vehicle as the first visual signal.
5. The vehicle according to any one of claims 1-3, wherein the control unit is further configured to:
controlling the communication unit to transmit a signal indicating an avatar of the vehicle to the electronic device of the user after acquiring the order information; and
in response to a signal received by the communication unit from the electronic device of the user and/or the detection unit detecting that the user is in the vicinity of the vehicle, controlling the visual signal providing unit to provide an image representing an avatar of the vehicle and/or greeting information for the user as the first visual signal on a window and/or an exterior surface of the vehicle.
6. The vehicle according to any of claims 1-3, wherein the control unit is further configured to control the visual signal providing unit to display a picture or a user name predefined by the user and/or information customized by the user on a window and/or an exterior surface of the vehicle after the order information is obtained.
7. The vehicle according to any one of claims 1-3, wherein the control unit is further configured to control the communication unit to transmit an arrival notification to the electronic device of the user in the event that it is determined that the vehicle has arrived or is about to arrive at the pickup location.
8. Vehicle according to claim 7, wherein the arrival notification comprises information about the location of the vehicle and/or video and/or audio information about the surroundings of the vehicle.
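As an illustrative sketch only, the arrival notification of claim 8 could be serialized as a simple message carrying the vehicle's location and, optionally, media describing its surroundings. The field names and function name here are assumptions for illustration, not part of the claims:

```python
import json


def build_arrival_notification(lat, lon, video_url=None, audio_url=None):
    """Hypothetical arrival notification per claim 8: vehicle location
    plus optional video/audio information about the surroundings."""
    payload = {
        "type": "arrival",
        "location": {"lat": lat, "lon": lon},
    }
    if video_url:
        payload["surroundings_video"] = video_url  # e.g. live street view
    if audio_url:
        payload["surroundings_audio"] = audio_url
    return json.dumps(payload)
```

Including surroundings media lets the user locate the vehicle even when GPS coordinates alone are ambiguous, e.g. at a crowded curb.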
9. A computer-implemented method for controlling an autonomous vehicle, the method comprising:
acquiring order information, wherein the order information includes at least information about a user to be picked up and a pickup location;
determining whether the vehicle has arrived or is about to arrive at the pickup location;
after determining that the vehicle arrives or is about to arrive at a ride location, causing the vehicle to provide a first visual signal that allows the vehicle to be recognized by the user;
in the event that the user is detected to be in proximity to the vehicle, causing the vehicle to send, to an electronic device of the user, a signal requiring a response from the user; and
causing the vehicle to provide a second visual signal after receiving a response signal from the user's electronic device.
10. The method of claim 9, wherein the first visual signal comprises at least one of: greeting information for the user, personal information of the user, a picture or username predefined by the user, information customized by the user, colored and/or flashing lights, and an image representing an avatar of the vehicle.
11. The method of claim 9 or 10, wherein the step of causing the vehicle to provide a first visual signal comprises: in the event that it is detected that the distance between the user and the vehicle is less than a threshold, causing the vehicle to perform at least one of:
providing greeting information for the user as the first visual signal on a window and/or an exterior surface of the vehicle; and
emitting colored and/or flashing light visible from outside the vehicle as the first visual signal.
12. The method according to claim 9 or 10, wherein the method further comprises:
after obtaining the order information, causing the vehicle to transmit a signal indicative of an avatar of the vehicle to the electronic device of the user,
wherein the step of causing the vehicle to provide a first visual signal comprises causing the vehicle to provide an image representing an avatar of the vehicle and/or greeting information for the user as the first visual signal on a window and/or an exterior surface of the vehicle in response to a signal from an electronic device of the user and/or detecting that the user is in the vicinity of the vehicle.
13. The method according to claim 9 or 10, wherein the method further comprises causing the vehicle to display a picture and/or a user name predefined by the user and/or information customized by the user on a window and/or an outer surface of the vehicle after acquiring the order information.
14. The method of claim 9 or 10, wherein the first visual signal varies with distance between the user and the vehicle.
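Claims 4, 11, and 14 together suggest a first visual signal that varies with the user's distance from the vehicle. A minimal sketch of such a selection rule; the threshold value and signal names are arbitrary illustrative assumptions:

```python
def first_visual_signal(distance_m: float, threshold_m: float = 30.0):
    """Return the visual signals to present, varying with distance (claim 14).

    Beyond the threshold, only a generic signal recognizable from afar is
    shown; within it, the vehicle may greet the user and/or emit colored
    flashing light (claims 4 and 11).
    """
    if distance_m >= threshold_m:
        return ["avatar"]  # generic image, legible from a distance
    return ["greeting", "colored_flashing_light"]
```

A distance-dependent signal avoids displaying user-specific greeting information before the user is close enough to read it.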
15. The method of claim 9 or 10, further comprising, in the event that it is determined that the vehicle has arrived or is about to arrive at the pickup location, causing the vehicle to send an arrival notification to an electronic device of the user.
16. The method according to claim 15, wherein the arrival notification comprises information about the location of the vehicle and/or video and/or audio information about the surroundings of the vehicle.
17. A system for controlling an autonomous vehicle, the system comprising:
one or more processors; and
one or more memories configured to store a series of computer-executable instructions,
wherein the series of computer-executable instructions, when executed by the one or more processors, cause the one or more processors to perform the method of any one of claims 9-16.
18. A non-transitory computer-readable medium having instructions stored thereon, which, when executed by one or more processors, cause the one or more processors to perform the method of any one of claims 9-16.
CN201780088607.1A 2017-04-19 2017-04-19 Autonomous vehicle and control method thereof Active CN110431604B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/081078 WO2018191886A1 (en) 2017-04-19 2017-04-19 Autonomous vehicle and control method therefor

Publications (2)

Publication Number Publication Date
CN110431604A CN110431604A (en) 2019-11-08
CN110431604B true CN110431604B (en) 2022-06-21

Family

ID=63855511

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780088607.1A Active CN110431604B (en) 2017-04-19 2017-04-19 Autonomous vehicle and control method thereof

Country Status (2)

Country Link
CN (1) CN110431604B (en)
WO (1) WO2018191886A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200183415A1 (en) * 2018-12-10 2020-06-11 GM Global Technology Operations LLC System and method for control of an autonomous vehicle
GB2595889A (en) * 2020-06-10 2021-12-15 Daimler Ag Methods and systems for displaying visual content on a motor vehicle and method for providing a motor vehicle

Citations (8)

Publication number Priority date Publication date Assignee Title
CN103745593A (en) * 2014-01-07 2014-04-23 杭州九树网络科技有限公司 Taxi dynamic information reminding system and method
CN103927789A (en) * 2014-04-29 2014-07-16 刘兴光 Unmanned taxi system
CN203733303U (en) * 2014-01-07 2014-07-23 杭州路招网络科技有限公司 Taxi dynamic information reminding system
CN105046942A (en) * 2015-06-05 2015-11-11 卢泰霖 Internet-based unmanned electric automobile service system
CN105711486A (en) * 2014-12-17 2016-06-29 宝马股份公司 Communication Between a Vehicle and a Road User in the Surroundings of a Vehicle
CN105818735A (en) * 2016-04-01 2016-08-03 蔡洪斌 Vehicle-mounted electronic display screen prompting method for indicating passenger to take reserved vehicle
CN105957377A (en) * 2016-05-03 2016-09-21 北京新能源汽车股份有限公司 Intelligent traffic control system and method based on unmanned electric automobile
CN106549981A (en) * 2017-01-13 2017-03-29 邹城众达知识产权咨询服务有限公司 Steering-wheel-free intelligent online car-hailing system and operation method based on big-data cloud computing services

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
JP3759896B2 (en) * 2001-10-31 2006-03-29 邦道 高田 Dispatch system
US7698033B2 (en) * 2006-04-12 2010-04-13 General Motors Llc Method for realizing a preferred in-vehicle chime
CN101826256A (en) * 2010-04-29 2010-09-08 联华电信股份有限公司 Vehicle dispatching method and vehicle dispatching system
CN102426780A (en) * 2011-11-16 2012-04-25 深圳欧奇网络技术有限公司 Paging system used for summoning taxis and paging method thereof
CN103000024A (en) * 2012-07-26 2013-03-27 苏州大通多宝软件技术有限公司 Taxi reservation calling method and taxi reservation calling system
KR101714514B1 (en) * 2014-11-24 2017-03-09 현대자동차주식회사 Car emergency system and method of emergency measures using the same
US9663025B2 (en) * 2015-09-18 2017-05-30 Clearpath Robotics, Inc. Lighting control system and method for autonomous vehicles

Also Published As

Publication number Publication date
CN110431604A (en) 2019-11-08
WO2018191886A1 (en) 2018-10-25

Similar Documents

Publication Publication Date Title
US20200278957A1 (en) Method and Apparatus For Social Telematics
US10214145B2 (en) Vehicle control device and vehicle control method thereof
EP3319063B1 (en) Method and apparatus for launching start-stop function
US8914014B2 (en) Phone that prevents concurrent texting and driving
KR101502013B1 (en) Mobile terminal and method for providing location based service thereof
US20130210406A1 (en) Phone that prevents texting while driving
CN109649268B (en) Intelligent voice assistant system, device and method for vehicle
JP6826940B2 (en) Electronics, roadside units, operating methods and control programs and transportation systems
US11181386B2 (en) Navigation device, destination guiding system, and non-transitory recording medium
US10708700B1 (en) Vehicle external speaker system
US11082819B2 (en) Mobility service supporting device, mobility system, mobility service supporting method, and computer program for supporting mobility service
US20180251067A1 (en) Systems and methods for streaming video from a rear view backup camera
CN110431604B (en) Autonomous vehicle and control method thereof
JP2017116991A (en) Portable terminal and vehicle
KR20160114486A (en) Mobile terminal and method for controlling the same
US11302304B2 (en) Method for operating a sound output device of a motor vehicle using a voice-analysis and control device
US10645535B2 (en) Electronic apparatus, control device and computer-readable non-transitory recording medium for selectively transmitting information based on indoor/outdoor specification
CN108732539A Method and device for marking a honking vehicle
CN109064722A (en) arrival reminding method and device
WO2019041339A1 (en) Apparatus and method for one-shot wifi connection in vehicle
JP2021033944A (en) Communication device, communication method and program
KR20140128800A (en) An method for determinating a direction and an appratus using it
JP7166419B1 (en) Boarding intention estimating device, vehicle control system, boarding intention estimating program, and boarding intention estimating method
KR101838820B1 (en) Smart door scuff
WO2024046353A2 (en) Presentation control method, device for in-vehicle glass of vehicle, and vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant