GB2588639A - Method and system for automatically adapting drive mode in a vehicle - Google Patents

Method and system for automatically adapting drive mode in a vehicle

Info

Publication number
GB2588639A
GB2588639A GB1915751.0A GB201915751A GB2588639A GB 2588639 A GB2588639 A GB 2588639A GB 201915751 A GB201915751 A GB 201915751A GB 2588639 A GB2588639 A GB 2588639A
Authority
GB
United Kingdom
Prior art keywords
vehicle
information
mode
driver
learning model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1915751.0A
Other versions
GB201915751D0 (en)
Inventor
Li Lichi
Michael Krell Mario
Voigt Martin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mercedes Benz Group AG
Original Assignee
Daimler AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Daimler AG filed Critical Daimler AG
Priority to GB1915751.0A priority Critical patent/GB2588639A/en
Publication of GB201915751D0 publication Critical patent/GB201915751D0/en
Publication of GB2588639A publication Critical patent/GB2588639A/en
Withdrawn legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0098Details of control systems ensuring comfort, safety or stability not otherwise provided for
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/18Propelling the vehicle
    • B60W30/182Selecting between different operative modes, e.g. comfort and performance modes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/082Selecting or switching between different modes of propelling
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/10Interpretation of driver requests or demands
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0062Adapting control system settings
    • B60W2050/0075Automatic parameter input, automatic initialising or calibrating means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/043Identity of occupants
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/049Number of occupants
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00Input parameters relating to infrastructure
    • B60W2552/05Type of road, e.g. motorways, local streets, paved or unpaved roads
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00Input parameters relating to data
    • B60W2556/10Historical data
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2710/00Output or target parameters relating to a particular sub-units
    • B60W2710/18Braking system
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2710/00Output or target parameters relating to a particular sub-units
    • B60W2710/20Steering systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2710/00Output or target parameters relating to a particular sub-units
    • B60W2710/22Suspension systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

A method of automatically adapting a drive mode in a vehicle (101), the method comprising receiving information associated with at least one of the vehicle (101), a driver (103), and geographical details; determining a type of a trip segment for an upcoming trip for navigating the vehicle (101) based on the information; providing the information and the type of the trip segment as an input to a first trained learning model; and adapting the drive mode from one or more drive modes (104) and values of one or more parameters (104) of the vehicle (101) based on an output generated by the first trained learning model. Therefore, a driving experience based on preferences of the driver (103) is achieved.

Description

FORM 2 THE PATENTS ACT, 1970 (39 of 1970) The Patents Rules, 2003
COMPLETE SPECIFICATION
(See section 10 and rule 13)
TITLE OF THE INVENTION
METHOD AND SYSTEM FOR AUTOMATICALLY ADAPTING DRIVE MODE IN A VEHICLE
[0001] PREAMBLE TO THE DESCRIPTION:
[0002] The following specification particularly describes the invention and the manner in which it is to be performed:
[0003] DESCRIPTION OF THE INVENTION:
[0004] Technical field
[0005] The present disclosure relates to the field of automobiles. More particularly, but not specifically, the present disclosure relates to automatically adapting a drive mode in a vehicle.
[0006] Background of the disclosure
[0007] Generally, modern vehicles have an in-vehicle user interface which provides a driver with numerous control actions. For example, the user interface may be used to choose a variety of control actions such as listening to music, making a phone call, adjusting the temperature in the vehicle, selecting a drive mode (e.g. normal, economical, sport, and the like), and adjusting individual parameters of the vehicle (e.g. throttle, brake response, etc.). Generally, as the user interface is a screen of small dimensions, not all control actions are displayed at once. Hence, the control options may be grouped under different categories. For example, the drive mode and the individual parameters may be placed under a main menu of vehicle controls. Therefore, the driver needs to traverse through various menus to select a desired control action, leading to inconvenience and distraction while driving. Several efforts have been made to increase the usability of user interfaces.
[0008] US20160321545A1 discloses a method for predicting an interface control action of a user. The user interface collects and stores vehicle data and user data based on the interactions of the driver with the user interface. Further, a likelihood for the control actions is determined and, based on the likelihood, the most likely control action is predicted and provided to the driver via the user interface. However, this conventional art is limited to providing the control actions for selection to the driver via the user interface.
[0009] US20170297586A1 discloses an autonomous vehicle system to navigate the vehicle using historic preferences or control actions of the driver. In a learning mode, the autonomous vehicle system stores or updates the preferences of the driver based on the measurements from a plurality of sensors. An autonomous mode operates the vehicle according to the stored preferences of the driver. In a manual mode, the system neither updates the preferences of the driver nor operates the vehicle autonomously. However, this conventional art is limited to the inputs from the driver for updating the preferences of the user. Further, the conventional art is limited to updating the preferences of the driver only in the learning mode.
[0010] The present disclosure is directed to overcome one or more limitations stated above or any other limitations associated with the prior art.
[0011] The information disclosed in this background of the disclosure section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
[0012] SUMMARY OF THE DISCLOSURE
[0013] In an embodiment, the present disclosure relates to a method for automatically adapting a drive mode in a vehicle, the method includes receiving information associated with at least one of the vehicle, a driver, and geographical details. Further, the method includes determining a type of a trip segment for an upcoming trip for navigating the vehicle, based on the information, where the information and the type of the trip segment is provided as an input to a first trained learning model. Finally, the method includes adapting the drive mode from one or more drive modes and values of one or more parameters of the vehicle based on an output generated by the first trained learning model.
[0014] In an embodiment, the present disclosure discloses an Electronic Control Unit (ECU) of a vehicle for automatically adapting a drive mode in the vehicle, the ECU includes a processor and a memory, communicatively coupled to the processor, storing processor executable instructions, which, on execution causes the processor to receive information associated with at least one of the vehicle, a driver, and geographical details. Further, the processor is configured to determine a type of a trip segment for an upcoming trip for navigating the vehicle, based on the information, wherein the information and the type of the trip segment is provided as an input to a first trained learning model. Finally, the processor is configured to adapt the drive mode from one or more drive modes and values of one or more parameters of the vehicle based on an output generated by the first trained learning model.
[0015] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
[0016] BRIEF DESCRIPTION OF DRAWINGS
[0017] The novel features and characteristics of the disclosure are set forth in the appended claims. The disclosure itself, however, as well as a preferred mode of use, further objectives and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying figures. One or more embodiments are now described, by way of example only, with reference to the accompanying figures, wherein like reference numerals represent like elements and in which:
[0018] Figure 1 shows an exemplary environment for automatically adapting a drive mode in a vehicle in accordance with some embodiments of the present disclosure; and
[0019] Figure 2 shows an exemplary flow chart illustrating method steps for automatically adapting a drive mode in a vehicle in accordance with some embodiments of the present disclosure.
[0020] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
[0021] DETAILED DESCRIPTION
[0022] In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
[0023] While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
[0024] The terms "comprises", "includes", "comprising", "including", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, device or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus preceded by "comprises... a" or "includes... a" does not, without more constraints, preclude the existence of other elements or additional elements in the system or apparatus.
[0025] Embodiments of the present disclosure relate to a method of automatically adapting a drive mode in a vehicle. The vehicle comprises one or more drive modes and one or more parameters which can be tuned or adjusted to suit the requirements of a driver. An Electronic Control Unit (ECU) of the vehicle receives information regarding the vehicle and geographical details from one or more sensors associated with the vehicle, and receives driver inputs regarding the one or more parameters. A first trained learning model is provided with the information to learn and generalize the driver requirements for each trip segment. Further, the ECU automatically adapts the drive mode and the one or more parameters based on the output of the first trained learning model.
[0026] Figure 1 shows an exemplary environment for automatically adapting a drive mode in a vehicle (101). The vehicle (101) navigating on a road (102) includes one or more drive modes (104) and one or more parameters (104) that are tunable or adjustable for providing a personalized driving experience to the driver (103). A drive mode from the one or more drive modes (104) is selected by the driver (103) to obtain a desired driving experience. The one or more drive modes (104) include at least one of a normal mode, a sports mode, an economical mode, an adaptive mode, and the like. For example, in the economical mode the vehicle speed may be limited to save fuel and the vehicle may not reach maximum acceleration, whereas in another drive mode the vehicle may reach maximum acceleration. Further, the driver (103) may tune or adapt values of the one or more parameters (104) to obtain a desired driving experience. The one or more parameters (104) include at least one of a response from a brake (108), a ride height, a steering response, and the like. An Electronic Control Unit (ECU) (105) in the vehicle (101) modifies a behavior of the vehicle (101) based on the one or more drive modes (104) and the values of the one or more parameters (104) selected by the driver (103), to provide the desired driving experience to the driver (103). In existing vehicles, the one or more drive modes (104) are selected by the driver (103). The present disclosure provides an option of the adaptive mode. In an embodiment, a separate icon may be presented on a user interface of the vehicle (101). Upon selecting the adaptive mode, the vehicle (101) may automatically change between the available one or more drive modes based on vehicle details, geographical details or driver inputs. The adaptive mode provides the desired driving experience to the driver (103) by eliminating manual intervention of the driver (103) to select or adapt the drive mode and to tune or adapt the one or more parameters (104). In an embodiment, the ECU (105) may use a first trained learning model for automatically selecting a drive mode from the one or more drive modes.
[0027] The ECU (105) receives information associated with at least one of the vehicle (101), the driver (103) and geographical details while the vehicle (101) is navigating on the road (102). For example, the information may be an amount of fuel drawn by the engine of the vehicle (101), a current drive mode, a temperature of an environment surrounding the vehicle (101), location information, traffic light (109) information, a street name, and the like. The information may be received from at least one of one or more sensors (106) associated with the vehicle (101), a database (107) associated with the vehicle (101), and the driver (103) via a user interface (not shown in Figure) housed in the vehicle (101). The database (107) may be communicatively connected to the ECU (105) via a communication network (not shown in Figure) using a wired or a wireless interface. In an embodiment, the received information may be stored in a memory (not shown in Figure 1) in the ECU (105) and processed by a processor (not shown in Figure 1) in the ECU (105).
[0028] The ECU (105) determines a type of trip segment for an upcoming trip based on the information using a second trained learning model. The trip segment is indicative of a short distance navigated (or to be navigated) by the vehicle (101). For example, stopping at a traffic light (109) by applying the brake (108) from a distance of 50 meters, where the interval from the application of the brake (108) until the vehicle (101) stops, may be considered as the trip segment. The type of the trip segment may include a parameter from the one or more parameters associated with the vehicle (101) for traversing the trip segment. For example, to stop the vehicle (101), the parameter is an amount of braking applied by the driver (103) in an interval of time. The second trained learning model may implement a clustering technique, for example a K-Means algorithm, a vector quantization algorithm, and the like, to determine the type of trip segment.
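As a rough illustration of how such a second learning model could work, the sketch below clusters hand-made trip-segment features with scikit-learn's K-Means; the feature choice, number of clusters and values are assumptions for illustration, not the patent's implementation.
```python
# Minimal sketch (not the patent's implementation): clustering trip segments
# with K-Means as one example of the "second trained learning model".
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Each row is one trip segment: [mean brake %, mean throttle %, mean speed km/h, duration s]
segments = np.array([
    [80.0,  5.0, 20.0,  6.0],   # hard braking towards a stop
    [ 2.0, 75.0, 90.0,  5.0],   # strong acceleration / overtaking
    [10.0, 30.0, 50.0, 60.0],   # steady cruising
    [70.0, 10.0, 25.0,  8.0],
    [ 5.0, 70.0, 85.0,  4.0],
])

scaler = StandardScaler()
X = scaler.fit_transform(segments)

# Fit K-Means; each cluster index serves as the "type of trip segment".
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

# Classify a new, upcoming segment (e.g. approaching a red traffic light).
new_segment = scaler.transform([[75.0, 8.0, 22.0, 7.0]])
segment_type = int(kmeans.predict(new_segment)[0])
print("type of trip segment:", segment_type)
```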
[0029] The ECU (105) provides the information and the type of trip segment to the first trained learning model. The first trained learning model includes a classification technique and a regression technique, for example a hierarchical Bayesian model and the like. The classification model determines the drive mode among the one or more drive modes (104) based on the information and the type of trip segment. Further, the regression technique generates the values of the one or more parameters (104) as an output based on the information and the type of trip segment. In an embodiment, the first trained learning model and the second trained learning model may be trained over a plurality of trips while the driver (103) manually selects the drive mode and adjusts the one or more parameters. The first and second trained learning models may receive the information during the plurality of trips as inputs during a training phase. In an embodiment, during the training phase, the ECU (105) may not automatically select the drive mode. In a further embodiment, in real-time, the ECU (105) may automatically select the drive mode and adjust the one or more parameters while still learning from every trip to increase the accuracy of the selection of the drive mode and the adjustment of the one or more parameters. Further, in real-time, the ECU (105) automatically adapts the drive mode and the values of the one or more parameters (104) based on the output generated by the first trained learning model. For example, considering the vehicle (101) navigating in the sports mode, the ECU (105) may detect a traffic light (109) (with a "RED" light) and may automatically adapt to the normal mode and increase the response of the brake (108), enabling the driver (103) to stop the vehicle (101) with application of the brake (108). In this example, the ECU (105) may determine the terrain type, evaluate traffic data and change the drive mode from the sports mode to the normal mode. In an embodiment, the present disclosure also provides safety by automatically changing modes. For example, if a driver has selected the sports mode on a freeway and enters a city, the driver may have to immediately change drive modes while still concentrating on the driving. The present disclosure provides a solution to this problem by changing the mode while the driver is concentrating on the driving.
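The overall real-time flow described above might look roughly like the following sketch, in which every helper function is a hypothetical stand-in for a vehicle subsystem or trained model rather than an API defined by the patent.
```python
# Hedged sketch of one iteration of the adaptive mode; all helpers are
# invented stand-ins for sensors, the two trained models, and the actuators.
from dataclasses import dataclass

@dataclass
class DriveCommand:
    drive_mode: str          # e.g. "normal", "sports", "economical"
    brake_response: float    # 0.0 .. 1.0
    ride_height: float       # 0.0 .. 1.0

def read_sensors_and_maps() -> dict:
    # Stand-in for information from sensors (106), database (107) and driver input.
    return {"speed_kmh": 72.0, "traffic_light": "RED", "rain": False}

def segment_type(info: dict) -> int:
    # Stand-in for the second trained learning model (clustering).
    return 0 if info["traffic_light"] == "RED" else 1

def first_model_predict(info: dict, seg_type: int) -> DriveCommand:
    # Stand-in for the first trained learning model (classification + regression).
    if seg_type == 0:
        return DriveCommand("normal", brake_response=0.75, ride_height=0.5)
    return DriveCommand("sports", brake_response=0.6, ride_height=0.4)

def apply(cmd: DriveCommand) -> None:
    print(f"adapting to {cmd.drive_mode}, brake response {cmd.brake_response:.2f}")

# Receive information, determine the trip segment type, predict, then actuate.
info = read_sensors_and_maps()
cmd = first_model_predict(info, segment_type(info))
apply(cmd)
```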
[0030] Consider a scenario where the driver (103) is navigating the vehicle (101) on the road (102). The ECU (105), upon detecting a traffic jam on the road (102) using the information from the one or more sensors (106) and the geographical details, automatically increases the seat (110) height of the driver (103) based on the output of the first trained learning model as per the historic preference of the driver (103). Therefore, the ECU (105) provides a personalized driving experience to the driver (103). The adjustment may be made to other parameters as well, for example brake response, torque, or suspension. A person skilled in the art will appreciate that the above description is equally applicable to the other parameters (examples mentioned above), and should not be construed as a limitation.
[0031] Figure 2 shows an exemplary flow chart illustrating method steps for automatically adapting a drive mode in the vehicle (101) in accordance with some embodiments of the present disclosure.
[0032] As illustrated in Figure 2, the method 200 may comprise one or more steps for automatically adapting a drive mode in a vehicle (101) in accordance with some embodiments of the present disclosure. The method 200 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions or implement particular abstract data types.
[0033] The order in which the method 200 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
[0034] As shown in Figure 2, at the step 201, the ECU (105) receives information associated with at least one of the vehicle (101), the driver (103), and geographical details. The information is received from at least one of, one or more sensors (106) associated with the vehicle (101), a database (107) associated with the vehicle (101), and the driver (103).
[0035] The information associated with the vehicle (101) is received from the one or more sensors (106) (for example, an image capturing sensor, a weight sensor, a fuel injection sensor, Light Detection and Ranging (LiDAR), a brake sensor, an accelerometer, ultrasonic sensors, infrared sensors, and the like). For example, the information associated with the vehicle (101) may be an amount of fuel used per combustion cycle, an amount of throttle provided by the driver (103), an amount of brake (108) applied by the driver (103), stiffness of the suspension, proximity to other vehicles or obstacles around the vehicle (101), one or more road signs, traffic lights, presence and height of a speed breaker, number of passengers in the vehicle (101), steering action performed by the driver (103), and the like. The information associated with the driver (103) is received from the user interface housed in the vehicle (101). For example, the information associated with the driver (103) may be the drive mode selected by the driver (103), values of the one or more parameters (104) set by the driver (103), and the like. The information associated with the geographical details may be received using offline or online maps and the database (107). For example, the information associated with the geographical details may be a location of the vehicle (101), road (102) topography, speed limit, street name, date and time, traffic details, and weather conditions (e.g. precipitation, humidity, temperature, rain, hail, storm, snow, and the like). In one embodiment, the one or more sensors (106) may also provide the geographical details. For example, a temperature sensor may provide the temperature around the vehicle (101).
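For illustration only, the disparate inputs enumerated above could be collected into a single record before being passed to the learning models; the field names and values below are assumptions, not taken from the patent.
```python
# Illustrative grouping of the received information for step 201.
# All keys and example values are hypothetical.
vehicle_info = {
    "fuel_per_cycle_mg": 14.2,     # from the fuel injection sensor
    "throttle_pct": 35.0,          # driver throttle input
    "brake_pct": 0.0,              # driver brake input
    "passenger_count": 1,          # e.g. from weight sensors
    "obstacle_distance_m": 42.0,   # e.g. from LiDAR / ultrasonic sensors
}
driver_info = {
    "selected_drive_mode": "economical",
    "ride_height_pct": 60.0,
}
geo_info = {
    "road_type": "motorway",
    "speed_limit_kmh": 120,
    "weather": "rain",
    "timestamp": "2019-10-30T08:15:00",
}
information = {**vehicle_info, **driver_info, **geo_info}
print(information)
```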
[0036] At the step 202, the ECU (105) determines a type of a trip segment for an upcoming trip for navigating the vehicle (101), based on the information, where the information and the type of the trip segment is provided as an input to the first trained learning model. The ECU (105) segregates the information using a time series model, where the time series model includes historic information. The type of the trip segment is then determined from the segregated information and the historic information using a second trained learning model.
[0037] The ECU (105) stores the information as a time series model denoted as "Y", Y = {Y_t : t ∈ T}, where "T" is the index set comprising date and time, and Y_t denotes the information at the time instant "t". The index set "T", comprising date and time, acts as a label to the time series data "Y_t". For example, "Y_t" may indicate the change of the drive mode from the economical mode to the sports mode at the time instant "t". The time series data is segregated into a sequence of segments, where each segment is denoted as the trip segment. The time series data is segregated based on a change-point detection. For example, the change-point may be an application of the brake (108) by the driver (103), an increased application of throttle by the driver (103), a change of the drive mode from a sports mode to a slippery mode, a modification of the ride height, and the like. The trip segment is indicative of a short-distance journey navigated (or to be navigated) by the vehicle (101). The type of the trip segment is determined using the second trained learning model. The second trained learning model implements a clustering technique, for example a K-Means algorithm, a vector quantization algorithm, and the like. The second trained learning model identifies the parameter from the one or more parameters associated with the vehicle (101) for traversing the trip segment. The parameter is indicative of the one or more types of information received from the vehicle (101) using the one or more sensors (106). For example, consider the driver (103) increasing the acceleration of the vehicle (101) for a duration of 5 seconds to overtake other vehicles ahead of the vehicle (101). The distance travelled during the 5 seconds is segregated as the trip segment and the type (or logical component) of the trip segment is determined as the amount of throttle applied by the driver (103). In an embodiment, the second trained learning model is trained using historical information collected from one or more vehicles navigated by different drivers and stored in a memory associated with the ECU (105). In another embodiment, the second trained learning model may be trained in real-time by the ECU (105) of the vehicle (101) using the information.
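A very simple way to realize the change-point based segregation described above is sketched below; production systems would likely use a dedicated change-point detection algorithm, and the brake signal and threshold here are illustrative assumptions.
```python
# Minimal sketch of segregating the time series Y = {Y_t : t in T} into trip
# segments via a simple change-point rule.
import numpy as np

# Brake pedal position sampled once per second (per cent), invented data.
brake = np.array([0, 0, 0, 5, 60, 70, 65, 0, 0, 0, 0, 55, 50, 0], dtype=float)

# Declare a change point wherever the signal jumps by more than a threshold.
threshold = 30.0
change_points = np.where(np.abs(np.diff(brake)) > threshold)[0] + 1

# Split the series at the change points; each piece is one trip segment.
segments = np.split(brake, change_points)
for i, seg in enumerate(segments):
    print(f"segment {i}: {seg}")
```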
[0038] The type of the trip segment and the information are provided to the first trained learning model. The first trained learning model includes the classification model and the regression model, for example a hierarchical Bayesian model, a logistic regression, and the like. The hierarchical Bayesian model is a statistical model including sub-models at multiple levels arranged in a hierarchical form. The hierarchical Bayesian model estimates parameters of a posterior distribution using the Bayesian method. The posterior distribution is the output of the hierarchical Bayesian model. The hierarchical Bayesian modelling uses hyperparameters (i.e. parameters of the prior distribution) and hyperpriors (i.e. distributions of hyperparameters) for deriving the posterior distribution.
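For reference, a generic two-level hierarchical Bayesian model of the kind mentioned above can be written as follows, where y stands for the observed driver and vehicle data, θ for the model parameters, and φ for the hyperparameters with hyperprior p(φ); this is standard textbook notation, not a formula taken from the patent.
```latex
\begin{aligned}
p(\theta, \phi \mid y) &\propto p(y \mid \theta)\, p(\theta \mid \phi)\, p(\phi) \\
p(\theta \mid y) &= \int p(\theta, \phi \mid y)\, d\phi
\end{aligned}
```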
[0039] The classification model of the first trained learning model is used to adapt the drive mode among the one or more drive modes (104) based on the information and the type of the trip segment. The regression model of the first trained learning model is used to adapt the values of the one or more parameters (104) based on the information and the type of the trip segment. For example, if the type of the upcoming trip segment ahead of the vehicle (101) is determined as a downhill road trajectory, and the information from the one or more sensors (106) and the geographical details indicates the presence of rain, the classification model of the first trained learning model, based on the historical preferences of the driver (103), generates an output indicating a change of the drive mode from the sports mode to a slippery mode. The regression model of the first trained learning model, based on the historical preferences of the driver (103), generates an output indicating an increase in the response of the brake (108) from 60% to 75%.
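As a hedged sketch of this classification/regression split, the example below substitutes ordinary scikit-learn estimators for the hierarchical Bayesian model; the training data, feature encoding and targets are invented purely to mirror the downhill-in-rain example above.
```python
# Illustrative only: a drive-mode classifier plus a brake-response regressor
# standing in for the "first trained learning model".
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

# Features per segment: [segment type id, rain (0/1), downhill (0/1), speed km/h]
X = np.array([
    [0, 0, 0,  60.0],
    [1, 1, 1,  45.0],
    [1, 1, 1,  50.0],
    [2, 0, 0, 110.0],
    [0, 0, 1,  40.0],
])
modes = np.array(["normal", "slippery", "slippery", "sports", "normal"])  # classification target
brake_response = np.array([0.60, 0.75, 0.74, 0.55, 0.65])                 # regression target

clf = LogisticRegression(max_iter=1000).fit(X, modes)
reg = LinearRegression().fit(X, brake_response)

# Upcoming segment: downhill trajectory in rain at 48 km/h.
upcoming = np.array([[1, 1, 1, 48.0]])
print("drive mode:", clf.predict(upcoming)[0])
print("brake response:", round(float(reg.predict(upcoming)[0]), 2))
```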
[0040] At the step 203, the ECU (105) adapts the drive mode from the one or more drive modes (104) and the values of the one or more parameters (104) of the vehicle (101) based on the output generated by the first trained learning model. The one or more drive modes (104) includes at least one of a normal mode, an economical mode, a sports mode, a comfort mode, a sports plus mode, an up-hill mode, and a slippery mode. The one or more parameters (104) includes at least one of suspension, steering response, transmission shift, brake (108) response, throttle, fuel supply, ride height, seat (110) height of the driver (103) and volume of exhaust.
[0041] The driver (103) navigating the vehicle (101) selects the adaptive mode among the one or more drive modes (104) for automatically adapting the drive mode and the values of the one or more parameters (104). The ECU (105), based on the type of the trip segment and the information, automatically adapts the drive mode of the vehicle (101) and the values of the one or more parameters (104) to suit the historical preference of the driver (103) and provides a personalized driving experience to the driver (103). The adaptation of the drive mode and the values of the one or more parameters (104) is indicated to the driver (103) via the user interface housed in the vehicle (101). For example, a change from the economical mode to the sports mode and a change in the stiffness of the suspension from 10% to 45% is indicated to the driver (103) via the user interface.
[0042] In a first example, consider the driver (103) navigating the vehicle (101) in the economical mode. At an intersection, while waiting for the traffic light (109) to turn "Green", the driver (103) switches the drive mode from the economical mode to the sports mode for achieving quick acceleration. Further, the driver (103) upon reaching the speed limit of the road (102) switches the drive mode from the sports mode to the economical mode. When the driver (103) selects the adaptive mode as the drive mode among the one or more drive modes (104), the ECU (105) imitates the behaviour of the driver (103). When the vehicle (101) stops at an intersection, the ECU (105) automatically adapts the drive mode (from economical mode to sports mode and vice versa upon reaching the speed limit) to provide an optimal driving experience to the driver (103), eliminating the need of manual change of the drive mode by the driver (103).
[0043] In a second example, consider the driver (103) navigating the vehicle (101) on a freeway. The driver (103), in the absence of passengers in the vehicle (101), modifies the one or more parameters (104) (i.e. increases the ride height and the seat (110) height of the driver (103)) to get an overview of the traffic and to pass through a series of manoeuvres. The driver (103), after the series of manoeuvres, changes the drive mode to the sports plus mode and lowers the ride height and the seat (110) height of the driver (103). In the presence of passengers in the vehicle (101), the driver (103) navigates the vehicle (101) in the economical mode with the ride height at maximum and the exhaust of the vehicle (101) kept silent. The ECU (105) in the adaptive mode imitates the behaviour of the driver (103) based on the detection of the presence or absence of the passengers in the vehicle (101) to provide an optimal driving experience to the driver (103), eliminating the need of manual change of the drive mode by the driver (103).
[0044] Further, the ECU (105) trains the first trained learning model using a feedback from the driver (103) to adapt the drive mode and the values of the one or more parameters (104). For example, the ECU (105) changes the drive mode from the sports mode to the economical mode, but the driver (103) is in a hurry to reach a destination and changes the drive mode back to the sports mode from the economical mode. The ECU (105) then uses this override as feedback to further train the first trained learning model.
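One plausible, though by no means the only, way to fold such driver feedback back into the mode classifier is an online learner updated with partial_fit, as sketched below; the features and labels are again illustrative assumptions, not the patent's training procedure.
```python
# Sketch of incorporating driver feedback as a corrected label and updating an
# online classifier (same illustrative features as in the previous sketch).
import numpy as np
from sklearn.linear_model import SGDClassifier

classes = np.array(["economical", "normal", "sports"])
clf = SGDClassifier(random_state=0)

# Initial training pass over logged trips.
X_init = np.array([[0, 0, 0, 60.0], [2, 0, 0, 110.0], [1, 1, 1, 45.0]])
y_init = np.array(["normal", "sports", "economical"])
clf.partial_fit(X_init, y_init, classes=classes)

# The ECU predicted "economical", but the driver switched back to "sports":
# treat the driver's choice as the corrected label and update the model.
x_feedback = np.array([[2, 0, 0, 105.0]])
clf.partial_fit(x_feedback, np.array(["sports"]))
print("next prediction:", clf.predict(x_feedback)[0])
```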
[0045] The adaptive drive mode in the vehicle (101) automatically adapts the drive mode and the values of the one or more parameters (104) based on the historic preference of the driver (103). The adaptive drive mode enables the driver (103) to concentrate on navigating the vehicle (101) without distractions. Further, the safety of the passengers in the vehicle (101) is increased by the adaptive drive mode. The adaptive drive mode eliminates the need of manual interaction with the user interface of the vehicle (101).
[0046] The described operations may be implemented as a method, system or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof. The described operations may be implemented as code maintained in a "non-transitory computer readable medium", where a processor may read and execute the code from the computer readable medium. The processor is at least one of a microprocessor and a processor capable of processing and executing the queries. A non-transitory computer readable medium may comprise media such as magnetic storage medium (e.g., hard disk drives, floppy disks, tape, etc.), optical storage (CD-ROMs, DVDs, optical disks, etc.), volatile and non-volatile memory devices (e.g., EEPROMs, ROMs, PROMs, RAMs, DRAMs, SRAMs, Flash Memory, firmware, programmable logic, etc.), etc. Further, non-transitory computer-readable media comprise all computer-readable media except for a transitory signal. The code implementing the described operations may further be implemented in hardware logic (e.g., an integrated circuit chip, Programmable Gate Array (PGA), Application Specific Integrated Circuit (ASIC), etc.).
[0047] Still further, the code implementing the described operations may be implemented in "transmission signals", where transmission signals may propagate through space or through a transmission media, such as an optical fiber, copper wire, etc. The transmission signals in which the code or logic is encoded may further comprise a wireless signal, satellite transmission, radio waves, infrared signals, Bluetooth, etc. The transmission signals in which the code or logic is encoded are capable of being transmitted by a transmitting station and received by a receiving station, where the code or logic encoded in the transmission signal may be decoded and stored in hardware or a non-transitory computer readable medium at the receiving and transmitting stations or devices. An "article of manufacture" comprises non-transitory computer readable medium, hardware logic, and/or transmission signals in which code may be implemented. A device in which the code implementing the described embodiments of operations is encoded may comprise a computer readable medium or hardware logic. Of course, those skilled in the art will recognize that many modifications may be made to this configuration without departing from the scope of the invention, and that the article of manufacture may comprise suitable information bearing medium known in the art.
[0048] The terms "an embodiment", "embodiment", "embodiments", "the embodiment", "the embodiments", "one or more embodiments", "some embodiments", and "one embodiment" mean "one or more (but not all) embodiments of the invention(s)" unless expressly specified otherwise.
[0049] The terms "including", "comprising", "having" and variations thereof mean "including but not limited to", unless expressly specified otherwise.
[0050] The enumerated listing of items does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise.
[0051] The terms "a", "an", and "the" mean "one or more", unless expressly specified otherwise. A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
[0052] When a single device or article is described herein, it will be readily apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be readily apparent that a single device/article may be used in place of the more than one device or article or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
[0053] The illustrated operations of Figure 2 show certain events occurring in a certain order. In alternative embodiments, certain operations may be performed in a different order, modified or removed. Moreover, steps may be added to the above described logic and still conform to the described embodiments. Further, operations described herein may occur sequentially or certain operations may be processed in parallel. Yet further, operations may be performed by a single processing unit or by distributed processing units.
[0054] Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments of the invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
[0055] While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope being indicated by the following claims.
REFERRAL NUMERALS:
Reference number Description
101 Vehicle
102 Road
103 Driver
104 One or more drive modes and one or more parameters
105 Electronic Control Unit (ECU)
106 Sensors
107 Database
108 Brake
109 Traffic Light
110 Seat

Claims (10)

  1. A method for automatically adapting a drive mode in a vehicle (101), the method comprising: receiving, by an Electronic Control Unit (ECU) (105) of the vehicle (101), information associated with at least one of the vehicle (101), a driver (103), and geographical details; determining, by the ECU (105), a type of a trip segment for an upcoming trip for navigating the vehicle (101), based on the information, wherein the information and the type of the trip segment is provided as an input to a first trained learning model; and adapting, by the ECU (105), the drive mode from one or more drive modes (104) and values of one or more parameters (104) of the vehicle (101) based on an output generated by the first trained learning model.
  2. The method as claimed in claim 1, wherein the information is received from at least one of, one or more sensors (106) associated with the vehicle (101), a database (107) associated with the vehicle (101), and the driver (103).
  3. The method as claimed in claim 1, wherein determining the type of the trip segment comprises: segregating the information using a time series model, wherein the time series model comprises historic information; and determining the type of the trip segment based on the segregated information and the historic information using a second trained learning model.
  4. The method as claimed in claim 1, wherein the one or more drive modes (104) comprises at least one of a normal mode, an economical mode, a sports mode, a comfort mode, a sports plus mode, an up-hill mode, and a slippery mode.
  5. The method as claimed in claim 1, wherein the one or more parameters (104) comprises at least one of suspension, steering response, transmission shift, brake (108) response, throttle, fuel supply, ride height, and volume of exhaust.
  6. The method as claimed in claim 1, wherein the first trained learning model is further trained using a feedback from the driver (103) to adapt the drive mode and the values of the one or more parameters (104).
  7. An Electronic Control Unit (ECU) (105) of a vehicle (101) for automatically adapting a drive mode in the vehicle (101), comprising: a processor; and a memory, communicatively coupled to the processor, storing processor executable instructions, which, on execution causes the processor to: receive information associated with at least one of the vehicle (101), a driver (103), and geographical details; determine a type of a trip segment for an upcoming trip for navigating the vehicle (101), based on the information, wherein the information and the type of the trip segment is provided as an input to a first trained learning model; and adapt the drive mode from one or more drive modes (104) and values of one or more parameters (104) of the vehicle (101) based on an output generated by the first trained learning model.
  8. The ECU (105) as claimed in claim 7, wherein the processor is configured to receive the information from at least one of, one or more sensors (106) associated with the vehicle (101), a database (107) associated with the vehicle (101), and the driver (103).
  9. The ECU (105) as claimed in claim 7, wherein the processor is configured to determine the type of the trip segment by: segregating the information using a time series model, wherein the time series model comprises historic information; and determining the type of the trip segment based on the segregated information and the historic information using a second trained learning model.
  10. A method for automatically adapting a drive mode in a vehicle (101), the method comprising: receiving, by an Electronic Control Unit (ECU) (105) of the vehicle (101), information associated with at least one of the vehicle (101), a driver (103), and geographical details, wherein the information is received from at least one of, one or more sensors (106) associated with the vehicle (101), a database (107) associated with the vehicle (101), and the driver (103); determining, by the ECU (105), a type of a trip segment for an upcoming trip for navigating the vehicle (101), based on the information, wherein the information and the type of the trip segment is provided as an input to a first trained learning model, wherein determining the type of the trip segment comprises segregating the information using a time series model, wherein the time series model comprises historic information and determining the type of the trip segment based on the segregated information and the historic information using a second trained learning model; and adapting, by the ECU (105), the drive mode from one or more drive modes (104) and values of one or more parameters (104) of the vehicle (101) based on an output generated by the first trained learning model, wherein the one or more drive modes (104) comprises at least one of a normal mode, an economical mode, a sports mode, a comfort mode, a sports plus mode, an uphill mode, and a slippery mode, wherein the one or more parameters (104) comprises at least one of suspension, steering response, transmission shift, brake (108) response, throttle, fuel supply, ride height, and volume of exhaust, wherein the first trained learning model is further trained using a feedback from the driver (103) to adapt the drive mode and the values of the one or more parameters (104).
GB1915751.0A 2019-10-30 2019-10-30 Method and system for automatically adapting drive mode in a vehicle Withdrawn GB2588639A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1915751.0A GB2588639A (en) 2019-10-30 2019-10-30 Method and system for automatically adapting drive mode in a vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1915751.0A GB2588639A (en) 2019-10-30 2019-10-30 Method and system for automatically adapting drive mode in a vehicle

Publications (2)

Publication Number Publication Date
GB201915751D0 GB201915751D0 (en) 2019-12-11
GB2588639A true GB2588639A (en) 2021-05-05

Family

ID=68768944

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1915751.0A Withdrawn GB2588639A (en) 2019-10-30 2019-10-30 Method and system for automatically adapting drive mode in a vehicle

Country Status (1)

Country Link
GB (1) GB2588639A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102022000185A1 (en) 2022-01-18 2023-07-20 Mercedes-Benz Group AG Method for determining a user-specific driving profile for an automated driving of a vehicle

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111103871A (en) * 2020-01-03 2020-05-05 圣点世纪科技股份有限公司 Automobile auxiliary driving control method based on finger vein recognition
CN114461281B (en) * 2021-12-30 2023-08-22 惠州华阳通用智慧车载系统开发有限公司 Car machine mode switching method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160252903A1 (en) * 2014-12-07 2016-09-01 Toyota Motor Engineering & Manufacturing North America, Inc. Mixed autonomous and manual control of autonomous vehicles
US20160321545A1 (en) 2013-12-19 2016-11-03 Daimler Ag Predicting an interface control action of a user with an in-vehicle user interface
US20170297586A1 (en) 2016-04-13 2017-10-19 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for driver preferences for autonomous vehicles
US20180196433A1 (en) * 2016-04-01 2018-07-12 Uber Technologies, Inc. Configuring an autonomous vehicle for an upcoming rider
DE102018202146A1 (en) * 2018-02-12 2019-08-14 Volkswagen Aktiengesellschaft Method for selecting a driving profile of a motor vehicle, driver assistance system and motor vehicle

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160321545A1 (en) 2013-12-19 2016-11-03 Daimler Ag Predicting an interface control action of a user with an in-vehicle user interface
US20160252903A1 (en) * 2014-12-07 2016-09-01 Toyota Motor Engineering & Manufacturing North America, Inc. Mixed autonomous and manual control of autonomous vehicles
US20180196433A1 (en) * 2016-04-01 2018-07-12 Uber Technologies, Inc. Configuring an autonomous vehicle for an upcoming rider
US20170297586A1 (en) 2016-04-13 2017-10-19 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for driver preferences for autonomous vehicles
DE102018202146A1 (en) * 2018-02-12 2019-08-14 Volkswagen Aktiengesellschaft Method for selecting a driving profile of a motor vehicle, driver assistance system and motor vehicle

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102022000185A1 (en) 2022-01-18 2023-07-20 Mercedes-Benz Group AG Method for determining a user-specific driving profile for an automated driving of a vehicle
WO2023138826A1 (en) 2022-01-18 2023-07-27 Mercedes-Benz Group AG Method for determining a user-specific driving profile for an automated journey of a vehicle

Also Published As

Publication number Publication date
GB201915751D0 (en) 2019-12-11

Similar Documents

Publication Publication Date Title
US11643086B2 (en) Method and system for human-like vehicle control prediction in autonomous driving vehicles
US20220185295A1 (en) Method and system for personalized driving lane planning in autonomous driving vehicles
US10203031B2 (en) System and method for changing driving modes using navigation and control
CN105320128B (en) Crowdsourcing for automated vehicle controls switching strategy
US9459107B2 (en) System and method for providing adaptive vehicle settings based on a known route
JP4781104B2 (en) Driving action estimation device and driving support device
US8788113B2 (en) Vehicle driver advisory system and method
US10787174B2 (en) Automatic vehicle driving mode system
US9507413B2 (en) Tailoring vehicle human machine interface
KR101601890B1 (en) Method and module for determining of at least one reference value for a vehicle control system
EP1780089B1 (en) Vehicle control system and vehicle control method
GB2588639A (en) Method and system for automatically adapting drive mode in a vehicle
CN111465824A (en) Method and system for personalized self-aware path planning in autonomous vehicles
KR101604063B1 (en) Method and module for determining of at least one reference value for a vehicle control system
US11027764B2 (en) Vehicle driving mode safety system
CN111433101A (en) Method and system for personalized motion planning in autonomous vehicles
JP5800381B2 (en) Other vehicle information provision device
CN111433103A (en) Method and system for adaptive motion planning in autonomous vehicles based on occupant reaction to vehicle motion
CN108803596A (en) Notice system for motor vehicles
US20230174077A1 (en) Method for operating a driving dynamics system facility of a motor vehicle while driving with the motor vehicle, control device, and motor vehicle
CN111433566A (en) Method and system for human-like driving lane planning in an autonomous vehicle
JP6443323B2 (en) Driving assistance device
CN111433565A (en) Method and system for self-performance aware path planning in autonomous vehicles
CN116691701A (en) Method for estimating vehicle speed curve and navigation device

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)