Method supporting passengers of autonomous driving vehicles
FIELD OF THE INVENTION
The invention relates to a method supporting passengers of autonomous driving vehicles. The method avoids disorientation of the passengers and improves control of the vehicle.
BACKGROUND OF THE INVENTION
Until relatively recently autonomous vehicles were nothing more than science fiction. However, advancements over the past decade have resulted in the realization of autonomous vehicles, at least on a very limited scale. For example, Google has developed a self-driving car used for collection of information for its mapping program.
Current implementations of autonomous vehicles are based on retrofitting
conventional vehicles with extra equipment to provide the automation to the vehicle. Thus, these vehicles have a design based on a conventional driver/passenger paradigm.
SUMMARY OF THE INVENTION
It has been recognized that autonomous vehicles provide the opportunity for an entirely new driving paradigm. For example, in a conventional vehicle there is a single driver and possibly one or more passengers. Apart from training vehicles equipped with two sets of driving controls, it is not currently possible with conventional vehicles for anyone other than the driver to control the vehicle. Because autonomous vehicles do not require a person to actually be able to see and react to the current driving environment, autonomous vehicles can allow any person in the vehicle to control any aspect of the vehicle.
Further, due to safety considerations, controls in conventional vehicles are designed to be as unobtrusive as possible so as to not distract the driver from his/her main task of controlling the vehicle. In an autonomous vehicle, however, an entirely immersive control design can be provided because the person(s) within the vehicle is/are not required to pay
attention to the safety aspects of the driving, including following traffic laws and spacing from surrounding vehicles and obstacles.
Moreover, autonomous vehicles allow for a more varied seat layout so that it is no longer necessary for all persons in a vehicle to be facing forward. However, riding in a vehicle while facing backwards can be discomforting to some people. Accordingly, immersive displays can be provided to address and reduce this discomfort.
Due to the above, it is therefore the object of the present invention to provide enhanced control and orientation to passengers in an autonomous driving mode of a vehicle.
This object is solved by the subject matter of the independent claims.
Further advantageous modifications of the present invention are defined in the dependent claims.
According to an aspect, a method for producing abstract reproductions of the surroundings of a vehicle involves producing the abstract reproductions on display screens arranged in the front, sides, and rear of the vehicle (Intelligent Particles). The abstract reproductions can move around and among the display screens based on movement of the vehicle relative to its surroundings or movement of the surroundings relative to the vehicle. The abstract reproductions can also move and react to movements of vehicle occupants. Further, the system can communicate upcoming vehicle maneuvers by adjusting a color of the abstract representations to reflect the upcoming maneuver (Ambient Blinking Based on Vehicle Maneuvers). The color can be adjusted to mimic that of conventional turn signals. Further, the color can be pulsed between the adjusted color and the regular color to mimic the flashing of conventional turn signals.
The method comprises the following steps: detecting, using sensors installed on or in the vehicle, objects in or surrounding the vehicle; and reproducing abstract representations of the detected objects in the vehicle surroundings on at least two of a front, side, and rear
display screens of the vehicle, wherein the abstract representations move around and among the at least two of a front, side, and rear display screens based on relative movement of the vehicle and the detected objects.
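Purely by way of a non-limiting example, the routing of abstract representations to individual display screens based on the relative position of detected objects can be sketched as follows; the screen names, angle bands, and function names are hypothetical assumptions of this illustration, not part of the claimed method:

```python
# Hypothetical sketch: route an abstract representation of a detected
# object to one of the vehicle's display screens based on the object's
# bearing relative to the vehicle. Screen names and angle bands are
# illustrative assumptions only.
def screen_for_bearing(bearing_deg: float) -> str:
    """Map a relative bearing (0 = dead ahead, clockwise) to a screen."""
    b = bearing_deg % 360.0
    if b < 45 or b >= 315:
        return "front"
    if b < 135:
        return "right"
    if b < 225:
        return "rear"
    return "left"

def update_representations(detections):
    """detections: list of (object_id, bearing_deg) tuples from the sensors."""
    return {obj_id: screen_for_bearing(bearing) for obj_id, bearing in detections}
```

As the relative bearing of an object changes with the movement of the vehicle, the representation would migrate from one screen to the adjacent one, producing the movement "around and among" the screens described above.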
According to another aspect, a method to control a driving style of the vehicle involves sensing passenger movement and adjusting a display based on the sensed passenger movement (Conducted Driving). The display provides feedback to the passenger to indicate when the passenger is gazing at the display, when the passenger has selected an indicator on the display, and when the passenger has selected a particular driving style. Further, the display can be adjusted to indicate the approach of a point of interest, as well as to provide additional drive time and position information regarding the location of the point of interest relative to the autonomous vehicle.
The method for controlling an autonomous vehicle comprises the following steps:
providing, by the autonomous vehicle, a display of a vehicle surrounded by an indicator; detecting, by sensors of the autonomous vehicle, a passenger movement related to the display; and adjusting the display in response to the detected passenger movement.
In a further embodiment the passenger movement is the passenger gazing at the display.
Furthermore, the indicator is a circle and the adjustment of the display is a change in brightness of the circle.
Furthermore, the adjustment of the display is an enlargement and change in brightness of the displayed vehicle.
An alternative provides a display including an indicator of a first driving style arranged in front of the displayed vehicle and an indicator of a second driving style arranged behind the displayed vehicle.
A further variant of the method comprises detecting, by the sensors, a further passenger movement related to the display, wherein the further passenger movement involves an indication that a driving style of the vehicle should be adjusted; and further adjusting the display in response to the further passenger movement so that the displayed vehicle is displayed closer to the indicator of the first or second driving style depending upon the detected further passenger movement.
A further variant of the method comprises detecting, by additional sensors of the autonomous vehicle, an approaching point of interest; and adjusting the display in response to the detection of the approaching point of interest.
In the further variant, the adjustment of the display in response to the detection of the approaching point of interest comprises: displaying a focused circle in front of the displayed vehicle, wherein a distance between the focused circle and the displayed vehicle decreases as the autonomous vehicle approaches the point of interest; replacing the focused circle with an indicator of a particular type of point of interest when the autonomous vehicle is within a predetermined distance of the point of interest; and adjusting a lateral position of the displayed indicator of the particular type of point of interest to indicate a direction in which the autonomous vehicle will turn to move towards the point of interest.
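By way of a hypothetical, non-limiting sketch, the display adjustment for an approaching point of interest described above can be modeled as follows; the threshold distance, icon names, and function names are assumptions of this example only:

```python
# Assumed "predetermined distance" at which the generic focused circle
# is replaced by a type-specific icon (illustrative value only).
ICON_SWAP_DISTANCE_M = 1500

def poi_display_state(distance_m, poi_type, turn_direction):
    """Return what the display should show for an approaching POI.

    distance_m: remaining distance to the POI exit/turn
    poi_type: e.g. "scenic_route", "restaurant"
    turn_direction: "left" or "right"
    """
    if distance_m > ICON_SWAP_DISTANCE_M:
        # Far away: generic focused circle, drawn progressively closer
        # to the displayed vehicle as the real distance shrinks.
        return {"icon": "focused_circle",
                "offset_from_vehicle": distance_m / ICON_SWAP_DISTANCE_M,
                "lateral": "center"}
    # Near: replace the circle with a type-specific icon shifted to
    # the side on which the vehicle will turn.
    return {"icon": poi_type,
            "offset_from_vehicle": distance_m / ICON_SWAP_DISTANCE_M,
            "lateral": turn_direction}
```

Re-evaluating this state as the vehicle moves yields the continuous animation described in connection with Figures 2a-2g.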
A modified method comprises the steps: detecting, by the sensors, a passenger selection of the indicator of the particular type of point of interest; and displaying, in response to the passenger selection of the indicator of the particular type of point of interest, an increase in an amount of total driving time should the autonomous vehicle adjust its route to include passing the selected point of interest.
In a further variant, the display is mounted in a dashboard or a vehicle door. Alternatively, the display is on a passenger's smartphone, tablet, wearable, or laptop.
According to another aspect, a method for reassigning control of an autonomous vehicle involves providing an indicator on a display of which passenger is currently controlling the autonomous vehicle (Commander Module). A passenger currently controlling the vehicle can reassign control to another passenger that accepts control or another passenger can request control from the currently controlling passenger. The passenger displays can change during the reassignment process so that an interior seating arrangement of the vehicle is displayed with highlighting of the passenger position currently controlling the autonomous vehicle.
The method for controlling an autonomous vehicle comprises: assigning control of the autonomous vehicle to a first passenger; providing an indication on a display associated with the first passenger that the first passenger is in control of the autonomous vehicle; receiving a reassignment instruction to assign control of the vehicle to a second passenger; changing control of the vehicle to the second passenger; and providing an indication on a display associated with the second passenger that the second passenger is in control of the autonomous vehicle.
In a further variant, receiving the reassignment instruction comprises receiving an indication that the first passenger wishes to assign control to the second passenger; and receiving an indication that the second passenger accepts control.
In a variant, receiving the reassignment instruction comprises receiving a request from the second passenger for control of the autonomous vehicle; and receiving, from the first passenger, an authorization to reassign control to the second passenger.
In a further variant, receipt of a reassignment instruction comprises receiving an indication that control will be reassigned; adjusting the display to display vehicle seats within a vehicle reproduced on the display, wherein a vehicle seat associated with the first passenger is highlighted; receiving an indication of the reassignment to the second passenger; and adjusting the display so that a seat associated with the second passenger is highlighted.
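The two reassignment variants described above can be sketched, purely as a non-limiting illustration (the class, method, and seat names are hypothetical), as a small handshake model:

```python
# Hypothetical sketch of the control-reassignment handshake. Seat
# identifiers, method names, and the class itself are illustrative
# assumptions, not part of the claimed method.
class CommanderModule:
    def __init__(self, seats):
        self.seats = set(seats)
        self.conductor = None

    def assign(self, seat):
        """Initial assignment of control to a passenger seat."""
        assert seat in self.seats
        self.conductor = seat

    def offer_control(self, from_seat, to_seat, accepted):
        """First variant: the conductor offers control; the target accepts."""
        if from_seat == self.conductor and accepted:
            self.conductor = to_seat
        return self.conductor

    def request_control(self, requester, authorized):
        """Second variant: a passenger requests control; the conductor authorizes."""
        if authorized:
            self.conductor = requester
        return self.conductor
```

In either variant control only changes hands once both parties have acted, mirroring the accept/authorize steps recited above.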
In a further variant, the display associated with the first or second passenger is mounted in a dashboard or a vehicle door. Alternatively, the display associated with the first or second passenger is on a smartphone, tablet, wearable, or laptop.
According to a further aspect, a method for providing feedback to passengers in an autonomous vehicle involves displaying a pulse cycle of varying levels of detail of the environment surrounding the vehicle (Vehicle Display Including Extended Sense Pulse). The pulse cycle can include a first display of the immediate foreground of the vehicle, a second display of the area further in front of the vehicle, a third display of the area even further in front of the vehicle, and a fourth display. The first through third displays can provide representations of vehicles, pedestrians, and building topography, whereas the fourth display only provides representations of vehicles and pedestrians.
The method for producing a display in an autonomous vehicle comprises: detecting, using sensors installed in or on the autonomous vehicle, surroundings of the vehicle; and reproducing, on a display of the autonomous vehicle, a series of displays in a cycle, wherein a first one of the series of displays reproduces vehicles and buildings detected by the sensors and a second one of the series of displays reproduces only vehicles detected by the sensors.
In a further variant, the first and second ones of the series of displays also reproduce driver controls and driving state information.
In a further variant, the driver controls include drive style preference control, sound control, or interior temperature control.
In a further variant the first and second ones of the series of displays also reproduce points-of-interest (POI).
In an alternative method, the reproduced POIs are limited to those within predefined categories defined by a passenger of the autonomous vehicle.
In a further variant, the first one of the series of displays reproduces vehicles in a wireframe representation and the second one of the displays reproduces vehicles in a symbolic representation.
In an advantageous method, the symbolic representations are dots.
In an alternative variant, the first and second ones of the series of displays each include compass bearing indications.
In a further variant, the series of displays further includes a third and fourth display, wherein the first, third, and fourth displays reproduce vehicles and buildings based on distance from the autonomous vehicle, the first display reproduces vehicles and buildings in an immediate vicinity of the autonomous vehicle, the fourth display reproduces vehicles and buildings that are furthest from the autonomous vehicle, and the third display reproduces vehicles and buildings that are between those reproduced in the first and fourth displays.
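As a non-limiting illustration of the four-stage pulse cycle described above, the distance-band filtering can be sketched as follows; the band limits and object categories are assumed values for this example only:

```python
# Illustrative sketch of the four-stage "sense pulse": each stage of
# the cycle shows objects from a successively farther distance band,
# and the farthest stage drops buildings, keeping only vehicles and
# pedestrians. The band limits (in meters) are hypothetical values.
BANDS_M = [(0, 50), (50, 150), (150, 400), (400, 1000)]

def pulse_stage(detections, stage):
    """detections: list of (kind, distance_m); stage: 0..3."""
    lo, hi = BANDS_M[stage]
    in_band = [(k, d) for k, d in detections if lo <= d < hi]
    if stage == 3:
        # Final stage: only vehicles and pedestrians, no buildings.
        in_band = [(k, d) for k, d in in_band if k in ("vehicle", "pedestrian")]
    return in_band
```

Cycling the stage index from 0 to 3 and back produces the outward-sweeping pulse described in connection with Figures 10a-10d.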
A further variant of the method comprises detecting whether a vehicle occupant is currently viewing the display of the autonomous vehicle; and terminating the reproduction of the series of displays when it is detected that the vehicle occupant is not currently viewing the display.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be described hereinafter in more detail with reference to the accompanying drawing, in which:
Figure 1a illustrates a standard dash display when a person's eye gaze has not been detected in accordance with exemplary embodiments of the present invention;
Figure 1b illustrates a dash display when a person's eye gaze has been detected, which causes the ring to be slightly brightened, in accordance with exemplary embodiments of the present invention;
Figure 1c illustrates a dash display when a person's eye gaze and hand have been detected in accordance with exemplary embodiments of the present invention;
Figure 1d illustrates a dash display when a person's eye gaze has been detected and it has been detected that a person's hand has "grabbed" the vehicle icon (or alternatively the vehicle icon is activated by touch control) in accordance with exemplary embodiments of the present invention;
Figure 2a illustrates a dash display with a point of interest (POI) detected within an indication radius in accordance with exemplary embodiments of the present invention;
Figure 2b illustrates a dash display with a POI approaching in accordance with exemplary embodiments of the present invention;
Figure 2c illustrates a dash display with a POI that is a scenic route and that continues to approach in accordance with exemplary embodiments of the present invention;
Figure 2d illustrates a dash display with a scenic route indicated as being on the right and that is still approaching in accordance with exemplary embodiments of the present invention;
Figure 2e illustrates a dash display with a scenic route on the right and that is still approaching but very close in accordance with exemplary embodiments of the present invention;
Figure 2f illustrates a dash display indicating that a scenic route to the right is imminent in accordance with exemplary embodiments of the present invention;
Figure 2g illustrates a dash display with a scenic route on the right, and which indicates that the scenic route will add 27 minutes to the trip, in accordance with exemplary
embodiments of the present invention;
Figure 3a illustrates a dash display with a basic pass icon to pass a slower vehicle on the left in accordance with exemplary embodiments of the present invention;
Figure 3b illustrates a dash display with an enhanced pass icon to pass a slower vehicle on the left in accordance with exemplary embodiments of the present invention;
Figure 4a illustrates a basic icon showing a restaurant to the left and that an imminent turn-off is necessary;
Figure 4b illustrates a dash display with an enhanced icon showing a restaurant to the left and that an imminent turn-off is necessary in accordance with exemplary embodiments of the present invention;
Figure 5 illustrates a door panel display in accordance with exemplary embodiments of the present invention;
Figure 6 illustrates a passenger vehicle command screen in accordance with exemplary embodiments of the present invention;
Figure 7 illustrates a conductor screen with the vehicle icon selected in accordance with exemplary embodiments of the present invention;
Figure 8 illustrates a conducting request indicated on the conductor screen in accordance with exemplary embodiments of the present invention;
Figure 9 illustrates a full dashboard display paused mid-pulse in accordance with exemplary embodiments of the present invention;
Figures 10a-10d illustrate a dashboard display showing the progression of a pulse cycle in accordance with exemplary embodiments of the present invention;
Figures 11a-11d illustrate a heads-up display (HUD) showing the stages of a pulse cycle in accordance with exemplary embodiments of the present invention;
Figure 12 illustrates a menu display screen superimposed over intelligent particles in accordance with exemplary embodiments of the present invention;
Figure 13a illustrates an intelligent particle control display with a standard white background in accordance with exemplary embodiments of the present invention;
Figure 13b illustrates an intelligent particle display with varying background color activated in accordance with exemplary embodiments of the present invention;
Figure 13c illustrates an intelligent particle display with a black background in accordance with exemplary embodiments of the present invention;
Figure 14 illustrates an intelligent particle display activated by proximity and touch in accordance with exemplary embodiments of the present invention;
Figure 15 illustrates a selection of the second concentric ring of the intelligent particle system in accordance with exemplary embodiments of the present invention;
Figure 16 illustrates the "effect distance" of the third level selected for communicating upcoming vehicle maneuvers of the intelligent particle system in accordance with exemplary embodiments of the present invention;
Figure 17 illustrates the "effect distance" of the outer circle selected for sensing and far away vehicles of the intelligent particle system in accordance with exemplary embodiments of the present invention; and
Figure 18 illustrates an intelligent particle display of an impending left turn as shown on front left door panel display in accordance with exemplary embodiments of the present invention.
DETAILED DESCRIPTION OF THE DRAWINGS
The autonomous vehicle employed as part of the present invention includes a number of cameras and sensing devices arranged inside and outside of the vehicle. Those located inside are used to detect movements and eye gazes of the vehicle occupants. Those located outside of the vehicle are designed to perform the same tasks typically performed by a driver's hands, eyes, and brain. These tasks normally performed by a driver are replaced with autonomous driving features, such as adaptive cruise control, pedestrian detection, lane keeping assist, and the like to allow a vehicle to drive safely down the road without the necessity of a driver.
In the description below the conventional terms "driver" and "passenger" have meanings deviating from how these terms are conventionally used in connection with vehicles. Because the present invention is directed to autonomous vehicles, the term driver is used for the person who can currently control the vehicle, which, unlike in the conventional sense, is not limited to a person located in a particular seating position. Instead, the driver, also referred to herein as the conductor, can be located in any seating position within the vehicle, and the role of driver or conductor can be freely assigned to any other vehicle passenger. The vehicle may or may not include a conventional steering wheel and/or acceleration and brake pedals (or other types of acceleration and braking mechanisms).
Although certain embodiments are described in connection with a display on a dashboard or vehicle door, these displays can also be provided on a smartphone, tablet, wearable, and/or a laptop.
I. Conducted Driving
Vehicles have typically been controlled by a single driver from an assigned seat utilizing a steering wheel. Exemplary embodiments of the present invention are directed to a new paradigm in which vehicle control does not necessarily require a steering wheel. Instead, vehicle control is facilitated by interactive screens mounted within and around the interior of the vehicle cabin. The description that follows assumes that a passenger or passengers have already entered the vehicle and have entered a destination.
Exemplary embodiments of the present invention utilize known autonomous vehicle driving schemes and provide the driver and passengers of a vehicle with a novel way in which to depict and control the driving style of the vehicle.
Figure 1a illustrates an exemplary control display on the dashboard of the vehicle in accordance with the present invention.
In an autonomous vehicle the steering and speed control is dictated by GPS and a variety of sensors that, among other things, take account of terrain, traffic, pedestrians and
road shape. Although much of the travel of the vehicle from beginning to destination is automatically controlled, the passenger(s) of the vehicle control how they get to their destination.
The display of Figure 1a provides a simplified layout because it is no longer necessary for passengers to always be presented with a speedometer, tachometer, temperature gauge, etc. Instead, the present display provides an interface that can be controlled by touch or gesture to manipulate the driving style of the vehicle on a range between "dynamic", which is more aggressive, and "relaxed", all while staying within the confines of the law and what are considered acceptable driving dynamics.
To vary the driving style of the vehicle, a passenger can either simply touch and drag the vehicle icon to a desired position between the two extremes noted on the bar (i.e., dynamic or relaxed), or can rely on multiple cameras that sense the hand position of the "driver" (the person in the seat who is presently in control of the vehicle). The driver can indicate with a gesture, such as a double air-tap motion with an open palm, that they wish to "grab" the vehicle icon, i.e., select the icon for further manipulation. The driver can then move their hand up or down, depending on their desired driving style, to move the icon, and double air-tap again to release the icon in place. Hand position sensors are mounted for potential vehicle control in each of the seating positions of the vehicle.
As an enhanced alternative, Figure 1a can be the first display in a sequence of displays that provide intuitive control information to the driver. In this preferred embodiment, Figure 1a would be the standard dashboard display that simply shows the driver-chosen driving style of the vehicle. Figure 1a would be shown when the driver isn't focusing on the dashboard and is, for example, facing the rear of the vehicle to converse with other passengers. In an autonomous vehicle this could be facilitated by a fully swiveling captain's chair.
If the driver is facing forward and looking at the dashboard display, the display in Figure 1b could be provided, on which the circular portion surrounding the displayed vehicle is brightened compared to that of Figure 1a.
Eye/gaze following technology can be deployed within the cabin of the vehicle to detect when the driver is looking at the dashboard display in order to provide the enhanced brightness by automatically switching the display reproduction from Figure 1a to Figure 1b. This not only provides easier readability for the driver but also allows the vehicle to consume less energy when the driver is not viewing the display.
If the driver is interested in altering the driving style of the vehicle with gesture controls, the driver must first raise their hand for the previously mentioned sensors to recognize it in a manner as is known in the art. Upon the driver raising their hand, the display will change from Figure 1b to Figure 1c, which shows a further enhanced ring: this time a solid inner ring appears to indicate recognition of the sensed hand.
Finally, still utilizing hand gestures, the driver can perform the aforementioned double air-tap gesture (or any other alternate gesture to gain the desired control result) to "grab" the car icon. To signify that the icon has indeed been "grabbed", Figure 1c would then be replaced by Figure 1d.
Figure 1d shows a highlighted and slightly enlarged car icon. With the driver's hand still in the air, the driver can change the position of the car icon to their desired drive style. Once the desired driving style is selected the driver can, for example, double air-tap again to deselect the icon, causing the icon to remain in the selected position while returning to its earlier size and losing its highlighting. Upon lowering of the hand, the enhanced solid inner ring of Figure 1c will disappear, and if the gaze of the driver is no longer on the display, the brighter display of the ring will revert back to the display shown in Figure 1a.
If touch control is desired as opposed to gesture control, upon the driver placing their finger on the screen the display would change directly from Figure 1b to Figure 1d.
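The sequence of displays described in connection with Figures 1a-1d can be summarized, purely as an illustrative and non-limiting sketch (the state and event names are hypothetical), as a small state machine:

```python
# Hypothetical state machine for the display sequence of Figures 1a-1d.
# State and event names are illustrative assumptions only.
TRANSITIONS = {
    ("1a", "gaze_on"): "1b",        # gaze detected: brighten the ring
    ("1b", "gaze_off"): "1a",       # gaze lost: revert to standard display
    ("1b", "hand_raised"): "1c",    # hand sensed: show solid inner ring
    ("1b", "touch"): "1d",          # touch skips straight to the grabbed icon
    ("1c", "double_air_tap"): "1d", # gesture "grabs" the car icon
    ("1d", "double_air_tap"): "1c", # release the icon in place
    ("1c", "hand_lowered"): "1b",   # inner ring disappears
}

def next_display(state, event):
    # Events with no defined transition leave the display unchanged.
    return TRANSITIONS.get((state, event), state)
```

For instance, the gaze-on, hand-raised, double-air-tap sequence moves the display from Figure 1a through 1b and 1c to 1d, while a touch from Figure 1b moves directly to Figure 1d.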
To stop the vehicle using gesture controls, a separate gesture can be used to facilitate this action, or upon "grabbing" the vehicle icon the user can hold the icon over the "hand up" image for a selected amount of time (e.g. 5 seconds) to direct the vehicle to pull over as soon as is safe to do so.
In addition to control of the driving style, the dashboard display can also provide information to the driver while the vehicle traverses the road such as how near points of interest, scenic routes, exits, etc. are.
As the vehicle moves down the road, icons signifying various tasks and points of interest will come into view on the display in real time. The closer and more pronounced the icons become on the display, the closer those points of interest, or their exits, will be. The points of interest can be a restaurant, an exit for a scenic route, and in the event of a more slowly moving car ahead of the autonomous vehicle in the present invention, an icon depicting the option of passing the more slowly moving vehicle. The points of interest can be personalized to the passengers of the vehicle in the event that they are usually more interested in a specific coffee shop, gas station, restaurant, etc.
Figures 2a-2g show selected snapshots of the display while the vehicle moves toward a point of interest, in this case a scenic route. These figures are only a selection of what the display shows, as the display is a continuous animation.
Upon first approaching a potential point of interest (POI), e.g. within 5 miles of such a POI or its exit, the display will show a small icon that gradually comes into focus as depicted in Figures 2a-2c. Specifically, Figures 2a and 2b illustrate the POI as a generic out-of-focus circle in front of the vehicle.
Figure 2c shows that this particular POI is actually a scenic route. The display can show multiple POIs at once at their respective positions relative to the vehicle.
As the vehicle continues to approach the turn/exit for the scenic route, the icon becomes larger and also moves to the side of the vehicle where the vehicle will need to turn or exit as shown in Figures 2d and 2e.
When the exit for the scenic route is closest to the vehicle the icon is shown in its largest form as can be seen in Figure 2f. Additionally, if at any time the driver wants more information relating to the POI they can either touch the display or utilize a specific gesture to activate the display to provide more information such as what is shown in Figure 2g. In this case the display provides the scenic route highway number and the time that will be added to an estimated time of arrival should this alternate route be selected.
While moving in the vehicle and approaching POIs, the POIs move at a fairly slow and manageable rate on the display to provide the passengers with enough time to decide if they would like to take a detour or stop at a particular POI. Once the POI or its exit has passed, however, the icon quickly shrinks in size and follows the path of the ring down until it exits out of the bottom center of the display, again as a small and out-of-focus circle similar to the depiction shown in Figure 2a but at the bottom of the display.
As can be appreciated, many other types of icons can be utilized. Figures 3a and 3b show the basic icon and enhanced icon for passing a more slowly moving vehicle. Again activation of this passing action can be facilitated by a specific gesture as is known in the art and as can be programmed for the individual driver or vehicle in general. Note that in the passing icon scenario the vehicle of Figures 3a and 3b would pass the approaching vehicle on the left, as indicated by the icon as well as the position of the icon on the left side of the ring.
Figures 4a and 4b illustrate displays similar to those above but illustrate the basic and enhanced versions of making a stop at a restaurant or coffee shop.
Figure 5 provides a display that would be available on interior side (door) panels of the vehicle for what would be traditionally considered passenger positions of the vehicle. It should be recognized that Figure 5 illustrates the display on a door panel in the rear left seat of the
vehicle in a left hand-drive vehicle. Mirrored images would be provided to the passenger that sits in the rear right seat. The front right passenger would see a similar display to those shown in the figure sets of Figures 1 -4 on the dashboard, or would also see such a mirrored image as that shown in Figure 5 on their door panel. In any event, the display is configured so that the front of the vehicle in the image would be pointed toward the front of the actual vehicle where the "Dynamic" drive style option is toward the front and the "Relaxed" drive style is toward the rear of the vehicle.
The rear door passenger display in Figure 5 can be controlled using the same control and gesture mechanics described above to activate a desired drive style that most closely fits with the "driver's" drive style. The display also provides icons to depict various POIs and gesture/touch activation of the icons can result in gaining more information about the POI such as is shown in Figures 2g, 3b and 4b.
II. Commander Module
As discussed above, control of the vehicle need not be fixed to a single vehicle position but instead can be assigned to any one of the passengers, who can then manage the destination and driving style of the vehicle. Exemplary embodiments of the invention include a commander module that provides an effective method of sending or requesting command of the vehicle to and from the various passenger seats of an autonomous vehicle.
Figure 6 illustrates a display screen that may be available to any passenger in the vehicle. The illustrated display is oriented for a screen mounted on the rear left door panel, as it depicts a vehicle moving forward from left to right. A similar display could be shown on a screen mounted on the right hand doors of a four door vehicle, but mirrored so that the vehicle would move forward from right to left, given that the left of the screen would be toward the front of the vehicle. This display could also be shown on a "smart" table mounted within the vehicle, or transferred or transmitted to a consumer electronic device such as a smartphone, tablet, wearable, or laptop.
The display screen of Figure 6 further depicts a ring around a vehicle icon on a variable bar meter that can be manipulated with touch or gesture controls to change the driving style of the vehicle in the manner described above. As the vehicle travels down a path it will approach and pass points of interest (POIs) such as restaurants, alternate routes, and the option to pass slower moving vehicles, among other types of vehicle action. This interface can be considered the standard view of the display, and in its current form in Figure 6 the vehicle is approaching a scenic route on the right, has the option to pass a slower moving vehicle by going around it on the left, and the current vehicle controller has chosen a slightly more dynamic or aggressive drive style.
Though the display is depicted here in static form, it is a continual, real-time animation that is affected by the speed and location of the vehicle as well as the vehicle's real-time awareness of its immediate and upcoming surroundings.
The display of Figure 6 is similar to that of Figure 5 with the addition of an indication icon in the top left corner that provides verification of who is conducting or controlling the vehicle. In the example above, given that this display would be shown on the screen mounted on the interior of the rear left door of the vehicle, it would mean that the rear left passenger would be conducting the vehicle. The term "conducting" means that this passenger would be able to control the driving style of the vehicle, as well as whether the vehicle takes an alternate route to a destination or stops at a POI. As discussed above, the act of driving to a destination would be controlled by the autonomous controls of the vehicle; the "driver" or conductor would not need a physical element such as a steering wheel to steer the vehicle. The conductor would instead use the commander module to digitally control how aggressively or cautiously the vehicle drives, maneuvers, brakes and accelerates using the techniques described above in Section I.
The current conductor of the vehicle is able to pass control of the vehicle to any other passenger that has requested control. Figure 7 provides the same standard screen as Figure 6 but now the car icon has become enlarged and enhanced to show the individual seats within
the vehicle. This enhanced icon is displayed in response to a tap on the vehicle icon of Figure 6, resulting in the display of Figure 7. Gesture controls could also be utilized, such as a double air-tap motion with an open palm or any other well-known gesture control motion.
If the current conductor activates the enhanced vehicle icon, they will see that their current conducting seat is highlighted as shown in Figure 7. If the current conductor would like to pass control to another passenger, they can simply tap or gesture to a seat position, which will then cause the display to highlight the selected seat. For instance, if the current conductor in the front left seat double taps and activates the enhanced vehicle icon then taps on the rear left seat in the icon, the rear left seat will be highlighted in a pulsating manner on their display. The rear left seat display, as mounted on the interior door panel or an unattached consumer device, will then query if that passenger would like to accept conducting control privileges of the vehicle. If the rear passenger accepts, control immediately transfers to that passenger and the front left seat in the enhanced vehicle icon will be gray instead of highlighted and the rear left seat will be highlighted in a solid manner.
A non-conducting passenger can also request control. In this scenario if the front left passenger was the current conductor and the rear left passenger wanted to be the conductor, they would utilize their commander module display screen by activating the enhanced vehicle icon and tapping or gesturing on their seat position. The front left passenger would be notified as shown in Figure 8 that the passenger in the rear left seat position would like to conduct the car. This notification could be presented as a pulsating highlight of the rear left seat position in the enhanced vehicle icon of the front left passenger's display. Additionally, as shown in Figure 8, the front left passenger can be presented with a further "Conducting Request" indication icon.
The front left conductor or controller of the vehicle can now choose whether or not to give up control of the vehicle by selecting (again by touch or by gesturing to) either the "X" to the right of the "Conducting Request" indication icon to deny the request or the check mark to pass control of the vehicle to the rear left passenger. If the current controller rejects the conducting request then their display reverts to that shown in Figure 6, where the display continues to indicate that the front left passenger is conducting. If the front left conductor accepts the conducting request from the rear left requestor then the screen mounted on the interior front left door will display the same display as Figure 6 but without the "Conducting" indication icon in the top left corner.
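The request/accept flow described above can be sketched in software. The following is a hypothetical Python illustration only; the class, seat names, and method signatures are assumptions for clarity and do not appear in the specification.

```python
from dataclasses import dataclass

SEATS = ("front_left", "front_right", "rear_left", "rear_right")

@dataclass
class CommanderModule:
    """Tracks which seat conducts the vehicle and mediates transfer requests."""
    conductor: str = "front_left"      # seat currently conducting
    pending_request: str = ""          # seat that has asked to conduct, if any

    def request_control(self, seat: str) -> None:
        # A non-conducting passenger taps their seat in the enhanced icon;
        # the conductor's display then pulses that seat as a pending request.
        if seat in SEATS and seat != self.conductor:
            self.pending_request = seat

    def resolve(self, accept: bool) -> str:
        # Conductor taps the check mark (accept) or the "X" (deny).
        if accept and self.pending_request:
            self.conductor = self.pending_request
        self.pending_request = ""
        return self.conductor

module = CommanderModule(conductor="front_left")
module.request_control("rear_left")
print(module.resolve(accept=True))   # control passes to the rear left seat
```

The same object could back every seat's display, with each screen rendering the highlight state from `conductor` and `pending_request`.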
III. VEHICLE DISPLAY WITH EXTENDED SENSE PULSE
Because there are only passengers in autonomous vehicles it is necessary to provide an informative and intuitive display method that gives the passengers the feeling that they are safe and that they can feel comfortable relinquishing control of the vehicle. As discussed below, this can be achieved by providing the display with an extended sense pulse to show the passengers what the vehicle "sees" in a real-time, continually streaming format.
Figure 9 illustrates an exemplary dashboard display in accordance with the present invention. The vehicle is depicted in the center of the display, the immediate background shows the continually sensed topographical surroundings of the vehicle, and the far background includes light dots representing sensed vehicles. As sensed vehicles come closer to the vehicle, the depiction in the display is no longer a dot but the outline of the vehicle. The surroundings of the vehicle in the immediate background are shown in a pulsed fashion that cycles through the displays illustrated in Figures 10a-10d and/or 11a-11d. The reproduction of any particular display within those illustrated in Figures 10a-10d or 11a-11d corresponds to the travel of a sensing pulse sent from the vehicle to detect its surroundings. The pulse cycle can be set to occur every two seconds, and can be adjusted based on passenger preference. The pulse can also be completely switched off so that only surrounding moving objects such as cars, pedestrians, and bicyclists are depicted on the screen. It should be recognized that the displays of Figures 9, 10a-10d, and 11a-11d are from the perspective of the vehicle's direction of travel.
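The pulse cycle can be thought of as a clock that selects which of the four frames of Figures 10a-10d to render at any instant. The sketch below is a hypothetical Python illustration; the function name, the frame labels, and the default two-second period (the only value taken from the description above) are otherwise assumptions.

```python
def pulse_frame(elapsed_s: float, period_s: float = 2.0, enabled: bool = True):
    """Return which of the four pulse frames (Figures 10a-10d) to show.

    elapsed_s: seconds since the display started animating
    period_s:  passenger-adjustable pulse cycle length (default two seconds)
    enabled:   when False the pulse is switched off and only moving objects render
    """
    if not enabled:
        return None                               # draw only cars, pedestrians, bicyclists
    frames = ("10a", "10b", "10c", "10d")
    phase = (elapsed_s % period_s) / period_s     # 0.0 .. <1.0 within one cycle
    return frames[int(phase * len(frames))]

# One full two-second cycle steps through all four frames in order.
print([pulse_frame(t) for t in (0.0, 0.5, 1.0, 1.5)])   # ['10a', '10b', '10c', '10d']
```

Lengthening `period_s` slows the apparent travel of the sensing pulse, which is how a passenger-preference adjustment could be realized.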
In the foreground of the display are exterior and interior elements of the vehicle that can be controlled by the passengers of the vehicle or provide useful information to the passengers. From left to right these are: drive style preference control; sound control; speedometer; interior temperature control; and a counter displaying the amount of time or percentage of time passengers use the vehicle for activities such as working, socializing, entertainment, or relaxation. These activities are charted based on the utilities available within the vehicle, which include connectivity to social applications, video and music streaming, video teleconferencing capabilities, etc.
The dots along the top of the display represent a compass, with the larger, brighter dot (which can be green) indicating the current direction, while the other dots indicate other compass points (which can be gray). As the vehicle turns, the dots rotate in real time with the vehicle's directional change. The dots show eight directions (N, NE, E, SE, S, SW, W, NW) and run across all screens within the car (not all shown) that are set up in a 360° configuration across the interior dash, doors and rear of the vehicle. Control elements are represented with rings around them. These elements can be controlled using either touch controls or gesture controls that are well known in the art, as well as the control methods described above and below.
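Selecting which of the eight dots to brighten reduces to quantizing the vehicle heading into 45° sectors. The following is a minimal sketch under that assumption; the function and its conventions (0° = north, clockwise positive) are hypothetical and not taken from the specification.

```python
COMPASS = ("N", "NE", "E", "SE", "S", "SW", "W", "NW")   # eight dots, 45 degrees apart

def highlighted_dot(heading_deg: float) -> str:
    """Return which of the eight compass dots is the bright (e.g. green) one.

    heading_deg: vehicle heading in degrees, 0 = north, increasing clockwise.
    Each dot covers a 45-degree sector centred on its direction.
    """
    index = round(heading_deg % 360 / 45) % 8
    return COMPASS[index]

print(highlighted_dot(0))     # N
print(highlighted_dot(100))   # E (100 degrees falls in the east sector)
```

As the vehicle turns, the same heading value would also rotate the gray dots across the 360° screen arrangement, keeping the bright dot aligned with the true direction of travel.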
Figure 10a shows the display at the end of the last pulse cycle, which depicts only the sensed vehicles in the vicinity. The sensed vehicles may be recognized by GPS beacon or by a car2x or car2car communication signal. If pedestrians are in the sensed vicinity of the vehicle, they will also be depicted as light dots.
Figure 10b shows the display upon the pulse initially emanating from the vehicle. The immediate foreground now begins to transform into a wireframe depiction of the topography and landscape around the vehicle beginning with that "closest" to the display.
Figure 10c shows the pulse as it begins to travel further away from the vehicle. In particular, other vehicles within the vicinity of, or in front of, the present autonomous vehicle now begin to come into view. The topography and landscape continue to move forward as well, while that in the immediate background begins to fade away.
Figure 10d depicts the display near the end of the pulse cycle. The immediate background has all but disappeared except for the topography of the buildings well ahead of the vehicle of the invention, and the outlines of the vehicles in the vicinity of the vehicle of the invention are also fading away.
Figures 10a-10d are only snapshots of what is a continual animation of the sensed surroundings. The surroundings are sensed by a combination of known devices such as 3D image sensing, LIDAR, optical cameras, infrared proximity sensors, ultrasonic sensors, etc. and communicated to electronic control units (ECUs) where they are processed to control the vehicle and provide the display of the present invention.
Additionally, a head-up display (HUD) can be provided from the first person perspective shown in Figures 11a-11d, as opposed to the third person perspective of Figures 10a-10d.
The first person perspective of the HUD provides the passengers with a street level view of the pulsed display discussed above but with augmented reality depicting points of interest (POIs) which can be interactive, similar to those discussed in Sections I and II above.
The points of interest can be restaurants, shops, gas stations, etc., and a vehicle occupant can interact with them utilizing gestures known in the field, such as those described above in connection with Sections I and II above. Passengers can utilize these gestures to direct the vehicle to any particular POI desired.
The display methods described in connection with Figures 9, 10a-10d, and 11a-11d provide a way to inform the passengers of the vehicle that the vehicle sees all potential obstacles, and provide assurance that the passengers are safe. The information is provided in a hierarchical manner so as not to overwhelm the passengers. Vehicle interior information is always provided, as are POIs, but similar to how the human eye scans the landscape for potential issues or danger, the vehicle provides a pulsed display that mimics this scanning action. The passenger(s) can decide whether or not to have the pulse or POIs displayed. Additionally, a passenger could choose which particular POIs they would like to see, such as only restaurants, scenic views, hardware stores or gas stations. While the pulse can be turned on or off, it can also be adjusted to utilize eye or head tracking of a passenger whereby, depending on where a particular passenger is looking, only that section of the display will be pulsed. The display could also keep that viewed section highlighted longer for the eye- or head-tracked passenger to be able to fully investigate the particular viewing area.
IV. INTELLIGENT PARTICLES
In a traditional left-hand drive vehicle such as a four door sedan, there are normally 5-6 seats available for passengers, and most often these seats face the same direction as the vehicle is driven. In an autonomous vehicle, however, there is the possibility that the traditional seating arrangement does not need to be used because it is no longer necessary for all or any of the passengers to face the front of the vehicle. Thus, an autonomous vehicle provides freedom for the design of the interior to be one that is more communal and conversational. Moreover, the interior layout design could be more versatile depending on the needs of the passengers. The seating layout could be such that, in a 4-door autonomous sedan, a rear bench for three passengers or two bucket seats could face the front of the vehicle while the front seats could be captain's chairs that can face forward and can fully swivel 360° to allow the front passengers to face the rear of the vehicle, or more precisely, face the rear passengers.
In an interior vehicle configuration where e.g. four passengers face each other, this perspective would be considered normal for the rear passengers who would face the front of the vehicle, but the front passengers could become discomfited by the sometimes disorienting perspective of having one's back to the direction of the travel of the vehicle. In an effort to alleviate motion sickness or disorientation, exemplary embodiments of the present invention continually provide passengers with a frame of reference within the vehicle to easily determine
surroundings, direction of travel, and upcoming maneuvers to be made by the vehicle. Thus, the present invention intuitively displays vehicle motion and surroundings to passengers of an autonomous vehicle.
In the contemplated autonomous vehicle of the present invention, instead of only the traditional display screens being available to the passengers, such as on the dashboard and perhaps another device such as e.g. a head unit, there may now be a multitude of display screens around the circumference of the interior of the vehicle, including on the interior door panels, across the rear of the interior either below, above, or over where traditionally a rear window would be located, as well as on other surfaces such as a table centrally located within the interior, or even pushed to mobile connected devices within the vehicle such as passengers' tablets, smartphones, or wearable devices. There can also be a large dashboard display showing information potentially different from the present invention's display, such as the displays described in Sections I-III above. However, the present invention can also use the dashboard display.
Figure 12 shows an embodiment of the menu screen that would be available on, for instance, the interior door panel of the front left door. In a traditionally driven vehicle this would be considered the driver door for a left-hand drive vehicle. The menu provides options that may be controllable via touch or gesture control, which are, from left to right: guided path, particle control, virtual world, games, connected devices, and conducted driving. The present invention is concerned only with the particle control option, but to provide context, the conducted driving option is described above in Section I. The guided path option provides passengers with the ability to scroll through menu information about points of interest (POI) on the present path of the vehicle. The virtual world option allows passengers to display different landscapes on the display screens other than what is actually being passed in reality. The games option would allow passengers to display and play hardcoded, cloud or connected device games on the display screens, and the connected devices option would allow passengers to access other information from their smartphones, tablets, wearables, computers, etc., such as music, movies or work files, to in effect turn the autonomous vehicle into anything from a mobile cinema to a mobile office.
The particles of Figure 12 are not stationary but instead are a continually moving general representation of objects that the vehicle senses or detects utilizing a multitude of microphones, cameras, and/or proximity sensors mounted on the interior and about the circumference of the exterior of the vehicle. It is not necessary that the particles have a 1:1 ratio with objects that are detected. In other words, there can be more or fewer particles displayed at any given time compared to how many objects are detected. The particles are also displayed in three-dimensional space on the display screens as well as across three-dimensional space within the cabin of the vehicle. Within the display, larger particles represent objects that are closer to the vehicle while smaller particles represent objects that are farther away from the vehicle.
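The near-objects-are-larger rule above amounts to a simple inverse mapping from detected distance to rendered particle size. The sketch below is a hypothetical Python illustration; every numeric parameter (range limit, pixel sizes) is invented for the example and not taken from the specification.

```python
def particle_radius(distance_m: float,
                    max_distance_m: float = 50.0,
                    min_px: float = 2.0,
                    max_px: float = 14.0) -> float:
    """Map a detected object's distance to a particle radius in pixels.

    Closer objects yield larger particles; objects at or beyond
    max_distance_m collapse to the minimum size.  All parameter values
    are illustrative, not taken from the specification.
    """
    closeness = max(0.0, 1.0 - distance_m / max_distance_m)   # 1 near .. 0 far
    return min_px + closeness * (max_px - min_px)

print(particle_radius(0.0))    # nearest object: 14.0 px
print(particle_radius(50.0))   # at the range limit: 2.0 px
```

Because the particle count need not match the object count 1:1, such a mapping would be applied per particle rather than per detected object.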
As the particles represent actual detected objects outside of the vehicle, the particle representation on the left side displays will be different than the particle representation on the right side displays and the rear display. If the vehicle moves in a forward direction, the particles will stream from the front and side displays to the rear side and rear displays. Consider, for example, the perspective of a passenger sitting in the rear left seat of the vehicle. As the vehicle moves in a straight forward direction, if the passenger were to look directly forward they would see in their peripheral vision, on both their left and right sides, particles appearing to stream "around" them on the interior door panel displays. If the vehicle were to change lanes to the right, this same passenger would notice that most of the particles in the left displays would get smaller because the vehicle would be further away from the detected objects. Likewise, the particles on the right displays would become larger because those detected and depicted objects would now be closer to the vehicle. During the lane change some detected objects may actually move from being to the right of the vehicle to being to the left of the vehicle. The best example to consider in this case would be a vehicle in front of the present autonomous vehicle. As the forward vehicle shifts from being to the right of the autonomous vehicle to being to the left of it, the representative particle will actually shift from the right display to the left display. In this way the multiple interior display screens depict a real-time three-dimensional space to inform the passengers of their entire surroundings in a subtle and intuitive manner. Depending on equipment, driving situation and relative movement of the vehicle, the particles move around and among the two, three or more displays arranged in the vehicle.
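Routing a particle to the correct display, and handing it off from the right display to the left during the lane-change example above, can be reduced to quantizing the detected object's bearing relative to the vehicle nose. This is a hypothetical sketch; the sector boundaries and panel names are assumptions.

```python
def panel_for_bearing(bearing_deg: float) -> str:
    """Select the interior display that renders an object.

    bearing_deg: object's bearing relative to the vehicle nose
    (0 = dead ahead, clockwise positive).  Each panel covers a
    90-degree sector; the boundaries are illustrative.
    """
    b = bearing_deg % 360
    if b < 45 or b >= 315:
        return "front"
    if b < 135:
        return "right"
    if b < 225:
        return "rear"
    return "left"

# A forward vehicle drifting from our right to our left during a lane change:
print(panel_for_bearing(30))    # front-right area: still the front display
print(panel_for_bearing(60))    # now rendered on the right display
print(panel_for_bearing(300))   # after the change, on the left display
```

Re-evaluating the bearing each frame makes the representative particle "shift" panels exactly as the detected vehicle moves around the autonomous vehicle.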
Figure 13 shows the intelligent particles control display. In this display, which can be touch or gesture controlled, the color of the background and particles can be changed using the slider on the right. The button near the top of the display that reads "Ambient Intelligence", when pressed or gestured to, allows a passenger to return to the main menu screen of Figure 12.
In the position of Figure 13a the background is white and the particles are colored varying shades on a blue gradient. As the "Color" slider is moved to the left the background goes through the varying shades of the blue gradient as shown in Figure 13b until it is completely black as shown in Figure 13c. At the same time, as the slider is moved to the left, the particles transition from a blue gradient into a white-gray gradient. As an example, a particle that was dark blue in Figure 13a would transition to a brighter white particle in Figure 13c. As can best be seen in Figure 13c, the color cloud surrounding the color slider shows the passengers the currently selected color gradient, which can be further changed utilizing a settings menu. This settings menu (not shown) can be activated by double tapping or double gesturing on the icon to the left of the "Color" slider indicator or a dedicated settings icon (not shown) in a corner of the display screen. The passenger can then select any desired color gradient available in the standard visible spectrum, which would change the background color gradient of at least one but ideally all of the display screens.
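The simultaneous background and particle transitions driven by the "Color" slider behave like two linear interpolations keyed to one slider position. The sketch below is a hypothetical Python illustration; the endpoint RGB values are invented stand-ins for the blue and white-gray gradients described above.

```python
def blend(c0: tuple, c1: tuple, t: float) -> tuple:
    """Linearly interpolate two RGB colors; t = 0 gives c0, t = 1 gives c1."""
    return tuple(round(a + (b - a) * t) for a, b in zip(c0, c1))

def slider_colors(t: float) -> dict:
    """Background and particle colors for slider position t (1 = right, 0 = left).

    At the right end (Figure 13a) the background is white and particles are
    blue; at the left end (Figure 13c) the background is black and particles
    are white-gray.  The endpoint RGB values are illustrative.
    """
    return {
        "background": blend((0, 0, 0), (255, 255, 255), t),      # black -> white
        "particle":   blend((220, 220, 220), (30, 80, 200), t),  # white-gray -> blue
    }

print(slider_colors(1.0))   # white background, blue particles (Figure 13a)
print(slider_colors(0.0))   # black background, white-gray particles (Figure 13c)
```

Swapping the endpoint pairs for a different gradient chosen in the settings menu would change all displays without altering the interpolation itself.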
The reactivity of the particles can also be controlled using the left slider, called "Effect Distance," which allows a passenger to alter which detected objects the particles react to or at least generally represent. There are four separate levels of reactivity, as represented by the four concentric circles of the "Effect Distance" slider. The larger the radius selected on the "Effect Distance" slider, the more represented the outside world will become on the internal display screens to the passengers of the vehicle, in the form of the particles and background color(s). Each next level, from small radius to large radius on the slider, means that the display screens will show additional information further and further away from the vehicle. The features of the first level, or of the smallest concentric circle, continue to be displayed and utilized in the subsequent levels, thus providing a feature scheme that builds and is inclusive from one level to the next, from the smallest concentric circle to the largest.
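The inclusive, building feature scheme can be summarized as a cumulative list keyed to the selected ring. A minimal hypothetical sketch (the feature wording is paraphrased from the levels described in this section):

```python
FEATURES_BY_LEVEL = (
    "react to passengers inside the cabin",           # level 1, smallest ring
    "depict objects close to the vehicle exterior",   # level 2
    "mimic exterior colors in the background",        # level 3
    "depict vehicles from a very wide radius",        # level 4, outermost ring
)

def active_features(level: int) -> list:
    """Levels are inclusive: selecting ring N activates features 1..N."""
    return list(FEATURES_BY_LEVEL[:level])

print(active_features(2))   # ring 2 keeps ring 1's behavior and adds its own
```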
If the first concentric circle is selected, the particles will react to passengers detected within the vehicle. Detection is facilitated by a combination of motion and proximity sensors, as well as sound-activated microphones. For instance, if a passenger moves or speaks within the vehicle, the particles will gently and subtly drift in the aforementioned three-dimensional space towards the moving or speaking party. The interior of the vehicle is outfitted with a multitude of detectors and sensors to identify the location of the source of movement or sound and make the particles react accordingly. In this way the vehicle is able to communicate to the passenger(s) that the vehicle is aware of the passenger presence, that they are detected or sensed. In the past only person-to-vehicle interaction was available, but the present invention opens a method of vehicle-to-person communication.
Further, as shown in Figure 14, as a passenger's hand or finger approaches the display screen to control a slider, the particles react to the passenger's close proximity by moving towards the source of proximate movement.
When detecting and reacting to this movement the particles can become darker, based on the selected color gradient, as they follow the detected object. The movement of the particles may only be facilitated when the detected hand is very close to the screen, as this anticipates a passenger touching the screen or activating a slider with gesture controls. Once the touch or gesture control is activated, the particles will become their deepest gradient shade and follow the controlling finger or gesturing hand until control is released. Upon release, the particles will drift back to their original positions (if the vehicle is stationary) as shown in Figures 12 and 13. If the vehicle is moving, the particles will continue streaming by unless activated by touch or gesture, when they will be "attracted" to the activation and remain there until released, at which point they will fall back into the stream of particles passing by.
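The attract-on-touch, drift-back-on-release behavior described above is essentially a per-frame easing toward whichever target is active. The following hypothetical Python sketch illustrates one particle in 2-D; the gain values are invented, not taken from the specification.

```python
def step_particle(pos, home, touch, attraction=0.3, spring=0.1):
    """Advance one particle by one animation frame (2-D positions as tuples).

    While a touch/gesture point is active the particle drifts toward it;
    once released it springs back toward its home position.  The gain
    values are illustrative.
    """
    target = touch if touch is not None else home
    gain = attraction if touch is not None else spring
    return tuple(p + (t - p) * gain for p, t in zip(pos, target))

pos = (0.0, 0.0)
for _ in range(3):                      # finger held at (10, 10): drift toward it
    pos = step_particle(pos, home=(0.0, 0.0), touch=(10.0, 10.0))
print(pos)
for _ in range(3):                      # released: spring back toward home
    pos = step_particle(pos, home=(0.0, 0.0), touch=None)
print(pos)
```

For a moving vehicle, `home` would itself travel with the particle stream, so a released particle falls back into the passing stream rather than to a fixed point.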
The second level of the intelligent particle display as depicted by the second smallest concentric circle of the "Effect Distance" controller shown in Figure 13a reacts to objects that are in a close proximity to the exterior of the vehicle. The concentric circle that is chosen is highlighted with a colored ring that is the same shade as the color cloud of the color slider discussed above.
When the second innermost concentric circle is selected on the "Effect Distance" slider, as shown in Figure 15, if the vehicle is stationary the particles will, in addition to reacting to passengers as discussed above, also react to and depict passing pedestrians and/or vehicles as detected by various cameras and proximity sensors mounted on the exterior of the vehicle. If the vehicle is in motion, the particles will begin to stream by on the displays as discussed above, but if, for instance, a vehicle is passed or a vehicle passes the vehicle of the present invention, a subtle disturbance or wave will be depicted within the displayed particles to indicate to the passengers what is happening in the near surroundings of the exterior of the vehicle. Using the same example of a vehicle passing the vehicle of the present invention, the depiction can be done by showing a single particle moving in the opposite direction of the stream while, at the same time, particles that are passed by the particle depicting the passing vehicle move slightly away from it in a reactionary sense.
Selecting the second ring also activates a feature where the particles communicate specific upcoming maneuvers of the vehicle to the passengers. This method of maneuver communication is further described below in Section V.
By moving the "Effect Distance" slider to the third position or third concentric ring from the inside as shown in Figure 16, the passenger is further activating a feature that
automatically alters the background color and gradient displays.
Instead of a user-selected background color gradient, the background colors of the display would then mimic the colors of detected exterior objects using specific cameras that are activated when the third ring or position is selected. The color detection cameras transfer the detected exterior palette data to the displays and create a real-time shifting background. Outlines of specific elements would not necessarily be shown, but more of an abstraction of the detected elements through color communication such as an out of focus image or impressionist-style display that continually streams by and changes with the detected exterior. For instance, if the third position is selected, if the vehicle were to pass a green-space with grass and trees, the background would be of a green gradient with interspersed shades of brown to represent detected tree trunks and possibly a few other shades such as yellows and purples depending on wild flowers growing in the green-space. If the vehicle were to then get beyond the green-space and begin detecting deep yellows, oranges and purples due to a detected sunset, the background colors and gradient would transition to mimicking these shades and transition away from the greens and browns of the green-space.
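Reducing a camera frame to an ambient background color, as in the green-space and sunset examples above, can be as simple as averaging the detected palette. The sketch below is a hypothetical Python illustration; a real implementation would stream a blurred, impressionist-style abstraction rather than a single color, and the sample pixel data is invented.

```python
def palette_from_frame(pixels) -> tuple:
    """Reduce a camera frame (iterable of RGB tuples) to one background color.

    Averaging is enough to make a green-space frame yield a green-dominant
    background and a sunset frame an orange one; it stands in here for the
    richer per-region color transfer described in the text.
    """
    n = 0
    totals = [0, 0, 0]
    for r, g, b in pixels:
        totals[0] += r
        totals[1] += g
        totals[2] += b
        n += 1
    return tuple(t // n for t in totals)

# Invented sample: two grass pixels and one tree-trunk pixel.
green_space = [(40, 160, 50), (60, 180, 70), (90, 70, 40)]
print(palette_from_frame(green_space))   # green-dominant average
```

Running this per camera (left, right, front, rear) and feeding each result to the corresponding screen gives the direction-respecting 360° background described next.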
As there are color-detecting cameras on all sides of the vehicle, the 360° display would not be entirely uniform but would depict the real-time detected passage of colors in all directions and display them on the corresponding interior screens respective of screen direction. For instance, if a green-space was on the left of the vehicle and a sunset was on the right of the vehicle, the left mounted screens would depict the green-space and the right mounted screens would depict the sunset. Corresponding transition background colors would
be shown on the rear and front screens according to their respective cameras' detected colors. This display feature may or may not be utilized for the front dashboard display.
By moving the "Effect Distance" slider to the fourth position, or outermost concentric ring, as shown in Figure 17, the passenger is selecting the option that the particles depict vehicles that are detected from a very wide radius around the vehicle. The particles in this instance still do not represent an exact 1:1 ratio of detected vehicles, meaning there may be more or fewer particles on the display than vehicles actually detected; the vehicles can be detected utilizing cameras, proximity sensors, car2x communication protocols or other vehicle intercommunication systems. In the dashboard display (not shown) there is a continual depiction of vehicles in the surrounding vicinity of the autonomous vehicle of the present invention, as more fully described above in Section III. When the outermost ring is selected, the side and rear displays will provide a more visible connection to the dots in the dashboard representing vehicular traffic, where the particles of the side and rear displays can be seen as an extension of these dots, particularly when the autonomous vehicle is moving. If the autonomous vehicle is not moving, the particles of the side and rear displays may simply float or slightly move on the display screens in a benign and generally calming manner.
As can be seen in Figure 17, the depth of the particle display appears greater than that shown in the preceding figures. Additionally, Figure 17 shows a more flattened perspective in which the particles mimic the landscape surrounding the vehicle, on which the detected surrounding vehicles are depicted. The detected vehicles are those within multiple city blocks of the vehicle of the present invention. In this way the passengers can gain a sense of traffic patterns in the general vicinity of the vehicle.
The animated transition from Figure 16 to Figure 17 is one where the particles appear to swirl around the display to finally end in the more flattened configuration of Figure 17. This transition is effected to better indicate to the passengers of the vehicle that the display is
making a dramatic shift from closely detected items, obstacles and vehicles, to showing vehicles that are far afield of their vehicle.
The particles as shown can be ever-present on the display. Regardless of the "Effect Distance" selected, the particles can be the only depiction shown on the display or they can have other display elements superimposed over them such as a video teleconference feed, navigation information, or connected device menus. In all instances the particles are not meant to be entirely stationary even if the vehicle is. Some detected elements will likely be in constant motion and can be represented as such. Other detected elements may simply be stationary and can be represented with stationary particles.
The present display is meant to be an abstraction of the surrounding elements as detected by the sensing and optical devices mounted on and throughout the vehicle. The different levels of "Effect Distance" discussed above can help orient passengers, regardless of their seating position or the direction they are facing, to understand generally what is happening around the vehicle at any given time and to give them confidence that the vehicle is safe and understands what to do for passengers to maintain this confidence. Some passengers may suffer from motion sickness particularly if the vehicle is dark inside and they are not aware of upcoming changes in direction. It is necessary for the passenger to be able to synchronize their visual understanding of the movement of the vehicle with the physical and kinetic movement of their body to relieve symptoms of motion sickness such as disorientation, anxiety, and nausea and this steady stream or alleyway of passing particles, subtle and subdued, can help to relieve these symptoms.
V. AMBIENT BLINKING BASED ON VEHICLE MANEUVERS
As discussed in Section IV above in connection with Figure 15, selecting the second ring of the "Effect Distance" controller activates a feature where the particles communicate specific upcoming maneuvers of the vehicle to the passengers. As also discussed above in
connection with Figure 12, the particle control option is reached by selecting the corresponding menu item in Figure 12.
The second concentric circle level activates a feature of the vehicle that provides communication by the vehicle to the passengers of the vehicle, particularly of impending or imminent movement to be made by the vehicle. To facilitate this communication, for example as the vehicle is about to make a left turn, the front left and optionally rear left portions of the interior displays alter their color palette to mimic a traditional turn signal shade which is often of an amber hue. This maneuver communication feature can also present in the subsequent third and fourth concentric circle levels of the "Effect Distance" slider control.
While the particles continue to travel by on the displays in the aforementioned steady stream representing the movement of the autonomous vehicle through detected elements, Figure 18 shows the moment that an external turn signal has been activated on the autonomous vehicle, which is additionally represented on the displays by the amber colored particles. Figure 18 is an example of the front left door interior panel display when the autonomous vehicle signals an upcoming left turn. To best represent that a left turn is imminent, the continually streaming particles transition their color palette between amber and blue (or any other chosen particle color palette) depending on their proximity to the front left side of the vehicle. As the particles continue to stream toward the rear of the interior display, those particles on the left side of the display can then transition back to amber to mimic the presence of the left turn signal at the rear exterior of the vehicle. When the turn signal is activated, the amber hue of the particles is temporary and pulsed, given the blinking nature of turn signals. In this example, then, as the particles stream along the left display, their color palette will both mimic the blinking of the exterior turn signal(s) and continually transition color depending on their location in the multiple display system.
The background gradient or tint can also automatically change or pulse to better accent the pulsating effect of the mimicked turn signal. For example, when the turn signal of the vehicle is activated and the signal light begins to blink, the mimicked turn signal on the display would show the front left particles in a pulsating amber hue while the background becomes a pulsating darker gradient of blue to better contrast with the amber color. At the same time, the particles near the center of the display could transition or pulse to a white-gray gradient as the background pulses to dark blue, so that all features and elements remain properly visible at any given time.
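The color-transition behavior described above can be sketched as a simple function of a particle's position on a side display and the turn-signal blink phase. This is an illustrative sketch only: the helper names (`particle_color`, `blend`), the region boundaries, and the palette values are hypothetical assumptions, not taken from any actual implementation.

```python
import math

# Hypothetical palette values for illustration.
AMBER = (255, 191, 0)   # signal hue
BLUE = (40, 90, 200)    # base palette (any non-amber palette could be chosen)

def blend(c1, c2, t):
    """Linear blend between two RGB colors, t in [0, 1]."""
    return tuple(round(a + (b - a) * t) for a, b in zip(c1, c2))

def particle_color(x, display_width, blink_phase):
    """Color a particle on the left-side display during a left turn.

    x           -- particle position: 0 = front edge, display_width = rear edge
    blink_phase -- 0..1 cycle driven by the exterior turn-signal timer
    """
    # Pulse between 0 and 1 to mimic the blinking exterior signal.
    pulse = 0.5 * (1 + math.sin(2 * math.pi * blink_phase))
    pos = x / display_width
    if pos < 0.25:
        # Front-left region: pulse toward amber, mimicking the front signal.
        return blend(BLUE, AMBER, pulse)
    elif pos > 0.75:
        # Rear-left region: transition back to amber, mimicking the rear signal.
        return blend(BLUE, AMBER, pulse)
    else:
        # Center region: base palette.
        return BLUE
```

Because `pulse` peaks at the top of each blink cycle, particles in the front and rear regions reach full amber only momentarily, reproducing the temporary, pulsed hue described above.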
Continuing with the same left turn scenario, representing a turn signal with a particle color transition is only a portion of the communication method. It is still necessary for the display to represent visually what is occurring physically to the passengers so that they can maintain orientation. To do this, the displayed particles must properly represent the relative speed of objects detected in the vicinity of the vehicle as it maneuvers through traffic and turns. In the given left turn scenario, relatively speaking, the detected objects to the left of the vehicle will move more slowly, as perceived by a passenger of the vehicle, than the detected objects to the right of the vehicle. To best depict this relative motion phenomenon, as the vehicle makes a left turn the display will show the stream of particles moving more slowly on the left interior displays and more quickly on the right interior displays. For a right turn the opposite is true: the particles on the right interior displays stream slowly and the particles on the left interior displays stream quickly to mimic the views and detected objects around the vehicle. In this manner, regardless of what a passenger can or cannot see outside the vehicle, their eyes will be able to anticipate, and thus their bodies can physically prepare for, maneuvers made by the vehicle.
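The relative-motion rule above reduces to a per-side speed multiplier: the displays on the inside of the turn slow their particle stream and the displays on the outside speed it up. The sketch below is a hedged illustration of that rule; `stream_speed_factors` and its parameters are hypothetical names, not drawn from any actual system.

```python
def stream_speed_factors(turn_direction, base_speed=1.0, contrast=0.5):
    """Return per-side particle speed multipliers for a maneuver.

    turn_direction -- "left", "right", or "straight"
    contrast       -- how strongly the inside of the turn slows (0..1, assumed)
    """
    if turn_direction == "left":
        # Objects on the inside (left) of a left turn appear to move slowly,
        # those on the outside (right) quickly.
        return {"left": base_speed * (1 - contrast),
                "right": base_speed * (1 + contrast)}
    if turn_direction == "right":
        # Mirror image for a right turn.
        return {"left": base_speed * (1 + contrast),
                "right": base_speed * (1 - contrast)}
    # Straight-line travel: both sides stream at the same rate.
    return {"left": base_speed, "right": base_speed}
```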
This display and communication method is best understood in the context of a 90° turn made by the vehicle, but it can be applied to varying degrees to properly and continually depict the actual autonomous driving experience. The displays can seamlessly transition among straight-line movement, lane changes, 90° turns, U-turns, and all variations in between, depicting the proper degree of relative motion difference between the left and right internal displays as well as their combination within the rear internal display of the autonomous vehicle. This subtle visual effect can be especially helpful to passengers in the front seats of the vehicle, who may have the ability to turn their seats around to face the rear of the vehicle. Such a lighting effect can help them maintain orientation and visual communication with the vehicle even while the vehicle travels in the direction opposite to the one they are facing and most likely looking. This visual communication method can also be used in combination with audio signals, such as a soft series of chimes representing different types of maneuvers, or automated spoken words.
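One way to realize the seamless transition among lane changes, 90° turns, and U-turns is to scale the left/right speed difference continuously with the severity of the maneuver, for example from the vehicle's yaw rate. The following is a minimal sketch under that assumption; the function name, the sign convention (positive yaw rate = left turn), and the saturation threshold are all hypothetical.

```python
def side_speed_factors(yaw_rate, max_yaw=0.5, base_speed=1.0, contrast=0.5):
    """Continuously scale left/right display speeds from the yaw rate.

    yaw_rate -- rad/s; positive is assumed to mean a left turn
    max_yaw  -- yaw rate at which the effect saturates (assumed value)
    """
    # Normalize to [-1, 1]: 0 for straight-line travel, +/-1 for a hard turn.
    turn = max(-1.0, min(1.0, yaw_rate / max_yaw))
    return {"left": base_speed * (1 - contrast * turn),
            "right": base_speed * (1 + contrast * turn)}
```

A gentle lane change thus produces only a slight left/right difference, while a U-turn drives the effect to its full contrast, matching the varying degrees described above.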
The foregoing disclosure has been set forth merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the spirit and substance of the invention may occur to persons skilled in the art, the invention should be construed to include everything within the scope of the appended claims and equivalents thereof.