DE102012222972A1 - Method for determining trajectory of driving maneuver, involves inputting symbol on touch-sensitive display device by user, where target pose is recognized depending on input symbol - Google Patents

Method for determining trajectory of driving maneuver, involves inputting symbol on touch-sensitive display device by user, where target pose is recognized depending on input symbol

Info

Publication number
DE102012222972A1
Authority
DE
Germany
Prior art keywords
trajectory
vehicle
touch-sensitive display device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
DE201210222972
Other languages
German (de)
Inventor
Christian Heigele
Holger Mielenz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Priority to DE201210222972 priority Critical patent/DE102012222972A1/en
Publication of DE102012222972A1 publication Critical patent/DE102012222972A1/en
Application status: Pending

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/10Path keeping
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/14Adaptive cruise control
    • B60W30/143Speed control
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D15/00Steering not otherwise provided for
    • B62D15/02Steering position indicators ; Steering position determination; Steering aids
    • B62D15/027Parking aids, e.g. instruction means
    • B62D15/028Guided parking by providing commands to the driver, e.g. acoustically or optically
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D15/00Steering not otherwise provided for
    • B62D15/02Steering position indicators ; Steering position determination; Steering aids
    • B62D15/029Steering assistants using warnings or proposing actions to the driver without influencing the steering system
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in preceding groups G01C1/00-G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in preceding groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3605Destination input or retrieval
    • G01C21/3614Destination input or retrieval through interaction with a road map, e.g. selecting a POI icon on a road map
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in preceding groups G01C1/00-G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in preceding groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3664Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders, dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for entering handwritten data, e.g. gestures, text
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/10Input devices or features thereof
    • B60K2370/12Input devices or input features
    • B60K2370/143Touch sensitive input devices
    • B60K2370/1438Touch screens
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2550/00Input parameters relating to exterior conditions
    • B60W2550/40Involving external transmission of data to or from the vehicle
    • B60W2550/402Involving external transmission of data to or from the vehicle for navigation systems

Abstract

The method involves inputting a symbol (5) on a touch-sensitive display device (1) by a user. A target pose is recognized depending on the input symbol. A trajectory from a current position of the vehicle to the target pose is determined. A default trajectory is also determined depending on the input symbol. A degree of compliance with the default trajectory when determining the trajectory is set depending on the input speed of the symbol. Independent claims are included for the following: (1) a computer program for executing the driving maneuver trajectory determining method; and (2) a device for determining a trajectory of a driving maneuver.

Description

  • State of the art
  • The invention relates to a method and a device for determining a trajectory of a driving maneuver.
  • The invention also relates to a computer program, which is set up in particular for carrying out the method.
  • Driver assistance systems are known which offer the user assistance functions in a defined field of application, for example parking assistants. Here, parking spaces are measured and a path, a so-called trajectory, into the parking space is planned. Each field of application, for example perpendicular or parallel parking spaces, requires its own design of the driver assistance system and its own algorithms. By restricting them to strictly defined situations, the path-planning algorithms can be optimized and at the same time realized on particularly low-cost control units.
  • Systems are desirable which offer increasing automation of driver assistance and are capable of more complex interpretations of the vehicle environment.
  • DE 10 2009 046 726 A1 shows a method for detecting parking spaces and for selecting a parking space, wherein the parking space is selected by simply tapping a parking space shown on a touch-sensitive screen. In this case, the driver selects a target position to be reached that has previously been determined fully automatically by the system.
  • From DE 10 2010 030 436 A1, a method for assisting a driver of a motor vehicle in a driving maneuver is known in which a desired end position of the vehicle is input by the driver. The desired end position is input via a touch-sensitive screen: the image of the vehicle to be maneuvered is marked by pressing a finger on the touch-sensitive screen, and as the finger moves toward the desired end position, the image of the vehicle to be maneuvered follows the finger. To enter the desired orientation of the vehicle to be maneuvered, its image is rotated with a second finger while the image remains marked by pressure with the first finger. Lifting the fingers from the screen confirms the input.
  • Disclosure of the invention
  • In the method according to the invention, it is provided that a user, for example the driver of the vehicle, inputs a character on a touch-sensitive display device, a target pose is recognized on the basis of the input character, and a trajectory from a current position of the vehicle to the target pose is determined.
  • The method according to the invention makes it possible for the driver of a vehicle to specify, on a touch-sensitive display device, in particular a so-called touchscreen display, a target position and an orientation of the vehicle at the target position. Within the scope of the invention, the combination of the target position and the orientation of the vehicle at the target position is referred to as the target pose.
  • With the proliferation of smartphones and tablet PCs, users are becoming increasingly used to the presence and use of touchscreens. Advantageously, the invention provides a new field of application for such touchscreens in the vehicle by being used for setting the target pose in a driver assistance system. The input on the touchscreen is quick and easy to perform, which increases the attractiveness of the driver assistance system.
  • In general, specifying the position and orientation at the desired destination requires explicit input of three degrees of freedom in two-dimensional space. The target pose is preferably determined on the basis of a single character entered via the touch-sensitive display device. Based on the entered character, the target pose is recognized and the trajectory is derived.
  • The entered character is a gesture input with the finger on the touch-sensitive display and may be interpreted as a straight line, a so-called "swipe", or a drawing. The gestures have in common that each has a start position and an end position of a finger of the user. The target position of the vehicle can be derived by reading the start position or by reading the end position of the finger; preferably, it is determined by reading the end position. It may also be provided to compare the start position and the end position with respect to the current position of the vehicle and to select, as the target position, the point that is farther away from the current position of the vehicle, as the sketch below illustrates. The target position may be defined, for example, with reference to the center of the rear axle, the center of the front axle, or the vehicle center.
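  • By way of illustration only (not part of the patent text), the endpoint comparison could be sketched as follows in Python; the function and parameter names are assumptions:

```python
import math

def pick_target_position(start, end, vehicle_ref):
    """Return whichever gesture endpoint lies farther from the vehicle
    reference point (e.g. the vehicle center or rear-axle center)."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return end if dist(end, vehicle_ref) >= dist(start, vehicle_ref) else start

# Example: a gesture drawn away from a vehicle whose reference point is the origin.
print(pick_target_position((0.5, 0.2), (4.0, 6.0), (0.0, 0.0)))  # -> (4.0, 6.0)
```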
  • In the case of the straight line, the orientation of the vehicle at the target position is determined from the angle of the line with respect to a vehicle axis. In some embodiments of the invention, provision is made for the orientation of the vehicle at the target position to be corrected on the basis of kinematic boundary conditions of the vehicle during a virtual traversal of the drawn trajectory.
  • When swiping, the user swipes in the direction of the target position that he desires. This functionality focuses primarily on specifying a destination. Therefore, according to some embodiments, the orientation at the end point is not exactly specified by the user but is determined subsequently by the system. In particular, an alignment can take place, for example, with surrounding objects, detected parking markings, or the like.
  • Alternatively, the entered character may also be a drawing. The orientation can then be calculated from an angle that results from the last section of the drawing near the target position, for example from a defined number of final support points, as in the sketch below.
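  • A minimal sketch of this orientation estimate, assuming the support points are available as 2D coordinates (all names illustrative):

```python
import math

def orientation_from_tail(points, k=3):
    """Estimate the target orientation from the last k support points of a
    drawn gesture: the heading of the chord through the first and last of
    those points (a simple stand-in for a proper line fit)."""
    tail = points[-k:] if len(points) >= k else points
    (x0, y0), (x1, y1) = tail[0], tail[-1]
    return math.atan2(y1 - y0, x1 - x0)  # radians, relative to the x-axis

points = [(0, 0), (1, 0.2), (2, 0.8), (3, 1.8), (4, 3.0)]
print(math.degrees(orientation_from_tail(points)))  # heading of the final segment
```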
  • If the target pose is known, path-planning algorithms can be used to determine the path, i.e. the trajectory, along which the user is then guided by the system. The system preferably determines a valid, drivable and collision-free trajectory and, depending on the system characteristics, offers the driver support by lateral and possibly also by longitudinal guidance.
  • The measures listed in the dependent claims make advantageous refinements and improvements of the method specified in the independent claim possible.
  • According to one embodiment, it is provided that a default trajectory, or usable parts of default trajectories, is furthermore determined on the basis of the entered character. The entered character can thus contain, as extractable information, parts of a trajectory, a coarse trajectory or an exact trajectory leading to the target pose. The drawing and the swipe can be used as a so-called first guess and integrated into the trajectory planning. In addition, an environment-based adaptation of the trajectory can take place, for example by means of a potential-field approach, according to which the vehicle is guided in the immediate vicinity of the trajectory predefined by the driver but is deflected from it only far enough that no collisions with environment objects can occur; a sketch of this idea follows.
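  • The following sketch illustrates one possible potential-field deflection of a single trajectory point; the repulsion model, radii and gains are assumptions, not values from the patent:

```python
import math

def deflect_point(p, obstacles, influence=1.5, gain=0.3, max_shift=0.5):
    """Shift one trajectory point away from nearby obstacles.  Points outside
    the influence radius stay on the driver's default trajectory; the shift
    is capped so the vehicle remains in its immediate vicinity."""
    px, py = p
    fx = fy = 0.0
    for ox, oy in obstacles:
        dx, dy = px - ox, py - oy
        d = math.hypot(dx, dy)
        if 1e-6 < d < influence:
            f = gain * (influence - d) / d   # repulsion grows as d shrinks
            fx += f * dx
            fy += f * dy
    norm = math.hypot(fx, fy)
    if norm > max_shift:                     # cap the deflection
        fx, fy = fx * max_shift / norm, fy * max_shift / norm
    return (px + fx, py + fy)

# A point 0.2 m from an obstacle is pushed away from it.
print(deflect_point((1.0, 1.0), [(1.2, 1.0)]))
```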
  • According to one embodiment, it is determined on the basis of the input speed of the character whether a straight line, a swipe or a drawing is present. It may also be provided to decide, on the basis of the input speed, whether the input contains the entire trajectory, parts of the trajectory or only the target pose; a sketch of such a classification follows below. It can also be provided to determine, on the basis of the input speed, a degree of compliance with the default trajectory when determining the trajectory. Since the drawn trajectory is generally inaccurate due to hand trembling and sampling errors, it must be smoothed, filtered and optimized for drivability by the system. According to one embodiment, the input speed may also be interpreted as the speed at which the trajectory is to be driven. The user thus has the opportunity to give the system a default for the speed of longitudinal driving maneuvers.
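  • One conceivable classification by input speed and straightness; the thresholds (screen metres per second, chord-to-path ratio) are purely illustrative:

```python
import math

def classify_gesture(points, timestamps, swipe_speed=0.30, straightness=0.98):
    """Classify a touch gesture as 'swipe', 'line' or 'drawing' from its
    input speed and straightness."""
    path = sum(math.hypot(q[0] - p[0], q[1] - p[1])
               for p, q in zip(points, points[1:]))
    chord = math.hypot(points[-1][0] - points[0][0],
                       points[-1][1] - points[0][1])
    speed = path / max(timestamps[-1] - timestamps[0], 1e-6)  # length / time
    if speed >= swipe_speed:
        return "swipe"    # fast: only the target position is meant
    if chord / max(path, 1e-9) >= straightness:
        return "line"     # slow and straight: target pose from the angle
    return "drawing"      # slow and curved: use it as a default trajectory

pts = [(0.00, 0.00), (0.02, 0.01), (0.05, 0.04), (0.06, 0.09)]
print(classify_gesture(pts, [0.0, 0.1, 0.2, 0.3]))
```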
  • To determine the trajectory, the default trajectory is preferably divided into equidistant interpolation points. Advantageously, this compensates for jitter of the user's hand during input and for sampling errors of the touch-sensitive display device.
  • The interpolation points are preferably weighted on the basis of system variables, properties of the vehicle environment and/or their position within the trajectory. With a touchscreen, a certain inaccuracy of the input must be assumed, which may mean that the first support points generally have a certain lateral offset to the current position of the vehicle. This is not desired by the user and would lead to unpleasant system behavior. For example, the target position may be treated as a hard constraint, while the first support points, which lie near the current position of the vehicle, are weighted less heavily, as in the sketch below.
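  • A possible weighting scheme along these lines; the ramp length and weight values are assumptions:

```python
def support_point_weights(n, soft_start=5):
    """Weights for n support points: the first few (near the vehicle, where
    touch input typically shows a lateral offset) are down-weighted, and the
    target point gets a very large weight, i.e. an effectively hard
    constraint."""
    weights = [1.0] * n
    for i in range(min(soft_start, n - 1)):
        weights[i] = (i + 1) / (soft_start + 1)   # ramp up from a low weight
    weights[-1] = 1e6                             # hard target specification
    return weights

print(support_point_weights(10))
```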
  • According to a preferred embodiment, the target pose and, in some embodiments, the determined trajectory are displayed on the touch-sensitive display device, for example in a top view. The user is thereby shown how his input is interpreted by the system, by displaying a vehicle icon, i.e. a so-called avatar, for example a transparent vehicle, placed at the target pose interpreted by the system. Preferably, the current position of the vehicle is also displayed on the touch-sensitive display device.
  • Preferably, correction options are provided for the user. For example, it can be provided that the user taps on a point of the environment map, at which a vehicle symbol then appears, or on a vehicle symbol shown at the determined target pose. The user can redefine the orientation of the vehicle by interacting with the touch-sensitive screen. This can be done, for example, by pressing the vehicle icon for a period of time, such as a few seconds, and then rotating it until the user lifts the finger from the touch-sensitive display, which is interpreted as confirmation of the input. Alternatively, a circle may appear around the vehicle symbol, and the orientation can be determined by tapping a specific position on that circle. If a multi-touch display is present, i.e. a touch-sensitive display device that can distinguish between two simultaneous touches, the corrected vehicle orientation can also be indicated with a second finger of the user. Another possibility is to correct the entire trajectory by interacting again with the touch-sensitive display device.
  • In one embodiment, the method can be carried out when the vehicle is at a standstill. Alternatively, it can also be provided that the method can be carried out when the vehicle is moving below a speed of about 30 km/h, preferably below a speed of 8 km/h, particularly preferably below a speed of 6 km/h. Swiping is useful in particular when the vehicle is moving slowly, with the user swiping in the direction of the target position he desires.
  • According to the invention, a computer program is also proposed, according to which one of the methods described herein is performed when the computer program is executed on a programmable computer device. The computer program can be, for example, a module for implementing a driver assistance system or a subsystem thereof in a vehicle, or an application for driver assistance functions executable on a portable device such as a smartphone or a tablet PC. The computer program may be stored on a machine-readable storage medium, for example on a permanent or rewritable storage medium, in association with a computing device, or on a removable and/or portable storage medium such as a CD-ROM, DVD, Blu-ray disc, flash disc or memory card, for example an SD card, or a USB stick. Additionally or alternatively, the computer program may be provided for download on a computing device, such as a server or a cloud system, for example over a data network such as the Internet or a communication link such as a telephone line or wireless connection.
  • According to a further aspect of the invention, a device for determining a trajectory of a driving maneuver comprises a touch-sensitive display device, a unit for determining a target pose based on a character input on the touch-sensitive display device and a unit for determining a trajectory from a current position of the vehicle to the target pose.
  • A preferred embodiment results in a driver assistance system which has an environment detection device with a geometric detection area for detecting a region that can be traversed by the vehicle in the future, for example a stereo video system. The driver assistance system also advantageously comprises components configured to display the environment detected by the environment detection device on the touch-sensitive display device, for example to generate a top view, i.e. a plan view, of the detected environment.
  • According to a preferred embodiment, the driver assistance system furthermore comprises an integration unit which is set up to merge data recorded by the environment detection device at successive times into historically validated environment data, since the detection area of the environment detection device is often restricted and complex driving situations can only be described by a combination of temporally successive measurements.
  • Advantages of the invention
  • A way has been described to offer the user an inspiring and intuitive function. The vehicle is guided along the trajectory that is intuitively specified by the driver by means of a character, for example a line, a swipe or a detailed drawing, and that is optimized for drivability and acceptance. Optionally, in addition, an alignment with surrounding objects, parking lots, ramps or the like can take place. The advantages from a system point of view are above all that no overly complex situation interpretation must be carried out. This combination offers a highly attractive feature for the user that does not require expensive computing units.
  • Brief description of the drawings
  • Embodiments of the invention are illustrated in the drawings and explained in more detail in the following description.
  • The figures show:
  • Fig. 1 a situation of inputting a character on a touch-sensitive display device,
  • Fig. 2 a plan view of a situation with a vehicle in a vehicle environment and
  • Fig. 3 to 5 further plan views of the situation with the vehicle in the vehicle environment with displayed user input.
  • Embodiments of the invention
  • Fig. 1 shows a touch-sensitive display device 1 with an input field 2 and a housing 3. A hand 4 of a user is shown in a situation of input on the touch-sensitive display device 1. Here, by way of example, the user has entered a character 5 on the touch-sensitive display device. The display device 1 can, for example, be a display device 1 permanently installed in a vehicle or be part of a mobile device. The display device 1 can also serve as the display device of other driver assistance systems, for example of a parking assistant or a navigation system.
  • Fig. 2 shows, in a plan view displayed on an input field 2 of a touch-sensitive display 1, a situation with a vehicle 6 in an environment. In the illustrated embodiment, the vehicle 6 has environment detection units, namely a front camera 7 and ultrasonic sensors 8. The front camera 7 has, for example, a detection area 10 with a field of view (FOV) of 40°. The front camera can be a monocular camera or a stereo video camera system and can form part of a camera system that includes further cameras, in particular rear cameras, BSD (Blind Spot Detection) cameras, SVA (Side View Assistant) cameras and/or SVS (Surround View System) cameras, which may also be used by other driver assistance systems for other purposes. Ideally, this makes it possible to capture the entire environment around the motor vehicle. In this embodiment, the front camera 7 is positioned so that its detection area 10 is arranged symmetrically about a central axis determined by the vehicle axis 11. Detected objects 12 are located in the detection area 10.
  • The environment detection device of the vehicle 6 may include further environment detection sensors, such as ultrasonic sensors, radar sensors, infrared sensors and/or lidar sensors; four ultrasonic sensors 8 are shown by way of example. Further detected objects 12 are located in the detection range of one of the ultrasonic sensors 8 to the side of the vehicle 6.
  • Fig. 3 shows the situation of Fig. 2 after the user has manually entered a character 5 on the input field 2 of the touch-sensitive display. In this case, the character 5 is a straight line 15, which has a start point 17 and an end point 16. In the illustrated embodiment, the end point 16 of the straight line 15 is farther away from a reference point 18 of the vehicle 6 than the start point 17 of the straight line 15. The reference point 18 of the vehicle 6 can be determined, for example, by the position of the front camera 7, by the center of the front of the vehicle 6 or by the center of the vehicle 6; in the illustrated embodiment, it is the center of the vehicle 6. It is also conceivable that the user makes an input in which the start point 17 lies closer to the reference point 18 of the vehicle 6 than the end point 16. In both cases, the system recognizes the user request and determines a target pose based on the user's input 14. The target pose is defined by a target position 20 and by an angle 19 with reference to the vehicle axis 11. In other embodiments, the target pose may also be defined by the target position 20 and an orientation given in world-fixed coordinates, wherein the orientation may be determined, for example, by means of a compass or GPS information. The system determines a trajectory 9 from the current position 21 to the target position 20. In preferred embodiments, the target pose 19, 20 is displayed and can be corrected by the user.
  • Fig. 4 shows another character 5, which was entered by a user, for example, on the input field 2 of the touch-sensitive display 1 in the situation of Fig. 2. The character 5 is a swipe 22, which again has a start point 17 and an end point 16. With the swipe 22, the user has merely indicated the target position and paid less attention to the orientation of the vehicle at the target position. Nevertheless, based on the entered character 5, the driver assistance system recognizes the target pose, which comprises the target position and the orientation of the vehicle at the target position. The target position 20 can be determined, for example, as a central position 13 between the detected environment objects 12 surrounding the end point 16 of the swipe 22. The orientation at the target position can here also be determined, for example, by alignment with the surrounding objects 12. Based on the swipe 22, a default trajectory 14 can also be determined, for example by calculating a center line from the start point 17 to the end point 16. Path planning, i.e. determination of the trajectory 9, can then use the swipe 22 as a first estimate.
  • Fig. 5 shows a further embodiment of the invention, in which the character 5 entered on the touch-sensitive display device 1 is a drawing 23. The drawing is likewise recognized as a default trajectory 14. The drawing 23 can be distinguished from the swipe 22 described with reference to Fig. 4, for example, by the fact that its input speed is lower. The input speed can be determined, for example, as the quotient of the length of the entered character 5 and the time difference between the start point 17 and the end point 16. The input speed can be used, for example, to define a degree of compliance with the default trajectory 14 or a driving speed along the trajectory.
  • In the following, a possible method for determining the trajectory based on the drawing 23 is described. From the input speed and the sampling time of the touch-sensitive display 1, a number n of support points {24a, 24b, 24c, ..., 24n} results, which can be characterized, for example, by two-dimensional coordinates {(x_1, y_1), (x_2, y_2), ..., (x_n, y_n)}. By evaluating environment data that can be provided, for example, by the environment detection units, in particular by evaluating camera data, the course of the road surface can be determined. From the n two-dimensional coordinates, three-dimensional coordinates {(x_1, y_1, z_1), (x_2, y_2, z_2), ..., (x_n, y_n, z_n)} can be determined, which describe a three-dimensional trajectory on the road surface.
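  • As a sketch, the lift from 2D touch coordinates to 3D road-surface coordinates might look as follows; road_height is a hypothetical stand-in for the height profile extracted from the environment model:

```python
def lift_to_road(points_2d, road_height):
    """Map the n two-dimensional touch coordinates to 3D points on the
    road surface.  road_height(x, y) stands in for the environment model
    (e.g. evaluated camera data); a flat road is assumed in the demo call."""
    return [(x, y, road_height(x, y)) for (x, y) in points_2d]

support_points_3d = lift_to_road([(0, 0), (1, 2), (2, 5)],
                                 road_height=lambda x, y: 0.0)
print(support_points_3d)  # [(0, 0, 0.0), (1, 2, 0.0), (2, 5, 0.0)]
```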
  • The n three-dimensional support points are uniformly discretized so that the course of the trajectory does not depend, or depends as little as possible, on the resolution of the touchscreen. For this purpose, the n 3D support points are supplemented by a further m 3D interpolation points, so that with a distance measure d, d((x_i, y_i, z_i), (x_{i+1}, y_{i+1}, z_{i+1})) < d_max holds for i = 1..n+m-1. The distance d_max can be determined on the basis of vehicle system variables, for example the speed, on the basis of environment variables, for example by defining a minimum distance to detected obstacles, and on the basis of the trajectory profile, for example the implicitly specified curvature. The additional interpolation points can be determined, for example, via linear interpolation between two original support points whose distance is greater than d_max. There are now n+m 3D support points S_disc = {(x_i, y_i, z_i), i = 1..n+m}, each of which has at most the distance d_max to its neighboring support points. One of the additional support points, with reference numeral 25a, is shown by way of example between the support points 24b and 24c.
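  • A minimal sketch of this uniform discretization via linear interpolation; the vehicle- and environment-dependent choice of d_max is omitted here:

```python
import math

def densify(points, d_max):
    """Insert linearly interpolated 3D points so that consecutive support
    points are at most d_max apart (the uniformly discretized set S_disc)."""
    out = [points[0]]
    for p, q in zip(points, points[1:]):
        extra = max(int(math.ceil(math.dist(p, q) / d_max)) - 1, 0)
        for j in range(1, extra + 1):
            t = j / (extra + 1)
            out.append(tuple(a + t * (b - a) for a, b in zip(p, q)))
        out.append(q)
    return out

# Two extra points are inserted on the 3 m segment for d_max = 1 m.
print(densify([(0.0, 0.0, 0.0), (3.0, 0.0, 0.0)], d_max=1.0))
```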
  • Due to inaccurate and non-uniform sampling by the touchscreen and possibly additional trembling of the user's hand, the entered trajectory very often has a very rough course. This roughness is undesirable in the driving behavior and is therefore preferably smoothed out by means of known methods. The result is n+m 3D support points S_smooth = {(x_i, y_i, z_i), i = 1..n+m}. To each of the n+m three-dimensional support points {24a, 24b, 24c, ..., 24n+m}, an angle {19a, 19b, ..., 19n+m} relative to the reference axis 11 of the vehicle 6 or relative to a world coordinate system can also be assigned, which is shown in Fig. 5 by way of example at the support point 24g as angle 19g. Alternatively or additionally, angles 26 between support points and their preceding and/or subsequent support points relative to the reference axis 11 of the vehicle 6 or relative to a world coordinate system may be used, which is shown in Fig. 5 by way of example at the support points 24f and 24g as angle 26g.
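  • The smoothing and the per-point angles could be sketched as follows, using a simple moving average as one of the "known methods"; the window size and axis convention are assumptions:

```python
import math

def smooth(points, window=3):
    """Moving-average smoothing of the 3D support points."""
    n, out = len(points), []
    for i in range(n):
        lo, hi = max(0, i - window // 2), min(n, i + window // 2 + 1)
        seg = points[lo:hi]
        out.append(tuple(sum(c) / len(seg) for c in zip(*seg)))
    return out

def headings(points):
    """Angle of each support point toward its successor, here measured
    against the x-axis as a stand-in for the vehicle reference axis."""
    return [math.atan2(q[1] - p[1], q[0] - p[0])
            for p, q in zip(points, points[1:])]

pts = smooth([(0, 0, 0), (1, 0.4, 0), (2, 0.2, 0), (3, 0.9, 0)])
print([round(math.degrees(a), 1) for a in headings(pts)])
```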
  • Subsequently, a vehicle model x' = f(x, u, constraints) is applied, which takes into account the system states stored in the state vector x, for example a speed and a yaw rate, and/or the restrictions specified in the constraint vector, such as a maximum curvature, a maximum curvature change and/or an accepted lateral acceleration. The model, with virtual manipulated variables u, then attempts to follow the trajectory step by step by steering toward the next support point or the next k support points.
  • This can be realized, for example, by setting the steering angle γ ∈ u to the angle that lies between the current vehicle position and the desired intermediate target pose among the next k support points. If this steering angle is greater than the maximum steering angle, the maximum possible steering angle is used. Such restrictions are contained in the constraint vector.
  • The vehicle model is applied to the support points until the specified target position (x_{n+m}, y_{n+m}, z_{n+m}) is approached closely enough, as in the sketch below. In this way, the entered trajectory is not only tested for physical drivability via the vehicle model; it is also expanded at each point by the curvature and yaw-angle variables required by a lateral controller. Optimal control behavior is achieved when the models used for the drivability analysis and for the controller design are matched, for example by using the same dynamics model for the vehicle.
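  • A compact sketch of such a model-based traversal, using a kinematic single-track ("bicycle") model with a clamped steering angle as a stand-in for the patent's vehicle model x' = f(x, u, constraints); wheelbase, speed and limits are illustrative:

```python
import math

def follow(points, wheelbase=2.7, v=1.0, dt=0.1, max_steer=0.6, tol=0.3):
    """Steer toward the next support point, clamp at the maximum steering
    angle, and stop once the target point is approached closely enough."""
    x, y, psi = points[0][0], points[0][1], 0.0
    idx, trace = 1, []
    for _ in range(5000):                      # safety bound on steps
        tx, ty = points[idx][0], points[idx][1]
        if math.hypot(tx - x, ty - y) < tol:
            if idx == len(points) - 1:
                break                          # target reached
            idx += 1
            continue
        err = math.atan2(ty - y, tx - x) - psi
        err = math.atan2(math.sin(err), math.cos(err))   # wrap to [-pi, pi]
        steer = max(-max_steer, min(max_steer, err))     # constraint clamp
        x += v * math.cos(psi) * dt
        y += v * math.sin(psi) * dt
        psi += v / wheelbase * math.tan(steer) * dt
        trace.append((x, y, psi))
    return trace

path = follow([(0.0, 0.0, 0.0), (2.0, 0.5, 0.0), (4.0, 2.0, 0.0)])
print(len(path), path[-1])
```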
  • In addition to merely steering toward the interpolation points, individual regions of the trajectory can be weighted differently. For example, the target position can be used as a hard specification of the target pose via the angle that can be calculated from the last two interpolation points, for example according to ψ = arctan2(y_{n+m} - y_{n+m-1}, x_{n+m} - x_{n+m-1}), where numerically stable calculation methods for the angle are preferred (see the sketch below). It can also be provided, for example, to determine the straight line through the two points in a position relative to the ego vehicle axis or relative to the world coordinate system. Alternatively or additionally, for example, the first interpolation points can be weighted less heavily.
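  • The target-heading formula in a numerically stable form, as a one-line sketch:

```python
import math

def target_heading(p_last, p_prev):
    """psi = arctan2(y_{n+m} - y_{n+m-1}, x_{n+m} - x_{n+m-1}); the
    two-argument arctangent avoids division by a possibly zero
    x-difference."""
    return math.atan2(p_last[1] - p_prev[1], p_last[0] - p_prev[0])

print(math.degrees(target_heading((5.0, 3.0), (4.0, 2.0))))  # 45.0
```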
  • The invention is not limited to the embodiments described herein and the aspects highlighted therein. Rather, a variety of modifications are possible within the scope given by the claims, all of which lie within the scope of the activity of a person skilled in the art.
  • CITATIONS INCLUDED IN THE DESCRIPTION
  • This list of documents cited by the applicant was generated automatically and is included solely for the reader's information. The list is not part of the German patent or utility model application. The DPMA assumes no liability for any errors or omissions.
  • Cited patent literature
    • DE 102009046726 A1 [0005]
    • DE 102010030436 A1 [0006]

Claims (10)

  1. A method for determining a trajectory of a driving maneuver, wherein a user inputs a character (5) on a touch-sensitive display device (1), a target pose (19, 20) is recognized on the basis of the entered character (5), and a trajectory (9) from a current position (21) of the vehicle (6) to the target pose (19, 20) is determined.
  2. Method according to claim 1, characterized in that a default trajectory (14) is furthermore determined on the basis of the entered character (5).
  3. Method according to claim 2, characterized in that a degree of compliance with the default trajectory (14) in the determination of the trajectory (9) is determined on the basis of an input speed of the character (5).
  4. Method according to claim 2 or 3, characterized in that, for determining the trajectory (9), the default trajectory (14) is divided into equidistant interpolation points (24, 25).
  5. Method according to claim 4, characterized in that the equidistant interpolation points (24, 25) are weighted on the basis of system variables, properties of the vehicle environment and/or the position within the default trajectory (14).
  6. Method according to one of the preceding claims, characterized in that the target pose (19, 20) is displayed on the touch-sensitive display device (1) and correction options are provided for the user.
  7. Method according to one of the preceding claims, characterized in that the current position (21) of the vehicle (6) is displayed on the touch-sensitive display device (1).
  8. Method according to one of the preceding claims, characterized in that the method can be carried out at a travel speed of below 30 km/h.
  9. A computer program for performing one of the methods according to any one of claims 1 to 8 when the computer program is executed on a programmable computer device.
  10. Device for determining a trajectory of a driving maneuver, comprising a touch-sensitive display device (1), a unit for determining a target pose (19, 20) on the basis of a character (5) entered on the touch-sensitive display device (1), and a unit for determining a trajectory (9) from a current position (21) of the vehicle (6) to the target pose (19, 20).
DE201210222972 2012-12-12 2012-12-12 Method for determining trajectory of driving maneuver, involves inputting symbol on touch-sensitive display device by user, where target pose is recognized depending on input symbol Pending DE102012222972A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE201210222972 DE102012222972A1 (en) 2012-12-12 2012-12-12 Method for determining trajectory of driving maneuver, involves inputting symbol on touch-sensitive display device by user, where target pose is recognized depending on input symbol

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE201210222972 DE102012222972A1 (en) 2012-12-12 2012-12-12 Method for determining trajectory of driving maneuver, involves inputting symbol on touch-sensitive display device by user, where target pose is recognized depending on input symbol

Publications (1)

Publication Number Publication Date
DE102012222972A1 true DE102012222972A1 (en) 2014-06-12

Family

ID=50778205

Family Applications (1)

Application Number Title Priority Date Filing Date
DE201210222972 Pending DE102012222972A1 (en) 2012-12-12 2012-12-12 Method for determining trajectory of driving maneuver, involves inputting symbol on touch-sensitive display device by user, where target pose is recognized depending on input symbol

Country Status (1)

Country Link
DE (1) DE102012222972A1 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014107302A1 (en) * 2014-05-23 2015-11-26 Valeo Schalter Und Sensoren Gmbh Method for the at least semi-autonomous maneuvering of a motor vehicle along a user-definable driving trajectory and driver assistance device and motor vehicle
DE102015014614A1 (en) 2015-11-12 2016-05-12 Daimler Ag Method for carrying out an autonomous drive of a vehicle
DE102014018108A1 (en) 2014-12-06 2016-06-09 Daimler Ag Method for determining a driving maneuver of a vehicle
DE102016003308B3 (en) * 2016-03-17 2017-09-21 Audi Ag Method for operating a driver assistance system of a motor vehicle and motor vehicle
WO2018077647A1 (en) * 2016-10-26 2018-05-03 Volkswagen Aktiengesellschaft Method and system for the external control of an autonomous vehicle
US10217297B2 (en) 2017-04-19 2019-02-26 Ford Global Technologies, Llc Control module activation to monitor vehicles in a key-off state
US10232673B1 (en) 2018-06-01 2019-03-19 Ford Global Technologies, Llc Tire pressure monitoring with vehicle park-assist
US10234868B2 (en) 2017-06-16 2019-03-19 Ford Global Technologies, Llc Mobile device initiation of vehicle remote-parking
US10281921B2 (en) 2017-10-02 2019-05-07 Ford Global Technologies, Llc Autonomous parking of vehicles in perpendicular parking spots
US10336320B2 (en) 2017-11-22 2019-07-02 Ford Global Technologies, Llc Monitoring of communication for vehicle remote park-assist
US10369988B2 (en) 2017-01-13 2019-08-06 Ford Global Technologies, Llc Autonomous parking of vehicles inperpendicular parking spots
US10378919B2 (en) 2017-04-19 2019-08-13 Ford Global Technologies, Llc Control module activation of vehicles in a key-off state to determine driving routes
US10384605B1 (en) 2018-09-04 2019-08-20 Ford Global Technologies, Llc Methods and apparatus to facilitate pedestrian detection during remote-controlled maneuvers
DE102018207964A1 (en) * 2018-05-22 2019-11-28 Zf Friedrichshafen Ag Method and control device for controlling a vehicle

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1756522B1 (en) * 2004-05-19 2011-07-13 Honda Motor Co., Ltd. System and method for displaying information
DE102009046726A1 (en) 2009-11-16 2011-05-19 Robert Bosch Gmbh Method for detecting and selecting e.g. longitudinal parking spaces, for aiding driver of car for transporting persons or materials, involves displaying surfaces as possible parking spaces, and selecting suitable parking spaces
DE102010030436A1 (en) 2010-06-23 2011-12-29 Thyssenkrupp Elevator Ag Elevator system
DE102010030463A1 (en) * 2010-06-24 2011-12-29 Robert Bosch Gmbh Method for assisting a driver of a motor vehicle
DE102011086215A1 (en) * 2011-11-11 2013-05-16 Robert Bosch Gmbh Method for assisting driver of motor car, involves determining trajectory for reaching end position e.g. parking space and maneuvering statements to follow around trajectory or automatic maneuver along trajectory into end position


Similar Documents

Publication Publication Date Title
US9605971B2 (en) Method and device for assisting a driver in lane guidance of a vehicle on a roadway
JP5261554B2 (en) Human-machine interface for vehicles based on fingertip pointing and gestures
EP1926004A2 (en) Method, apparatus, and medium for controlling mobile device based on image of real space including the mobile device
US9346471B2 (en) System and method for controlling a vehicle user interface based on gesture angle
CN102589565B (en) Display over the entire windshield autonomous vehicle and a vehicle operating control system
JP5210497B2 (en) Navigation device
US20120274550A1 (en) Gesture mapping for display device
EP2258587A1 (en) Operation input device for vehicle
KR20110117966A (en) Apparatus and method of user interface for manipulating multimedia contents in vehicle
US20130204457A1 (en) Interacting with vehicle controls through gesture recognition
EP2124139A1 (en) User interface device
CN102239068B (en) Display input device
US9477400B2 (en) Method and apparatus for navigation system for selecting icons and application area by hand drawing on map image
JP4274997B2 (en) Operation input device and operation input method
EP1998996B1 (en) Interactive operating device and method for operating the interactive operating device
US9656690B2 (en) System and method for using gestures in autonomous parking
US20130063336A1 (en) Vehicle user interface system
KR20150022436A (en) Apparatus, method and system for parking control
JP2009205191A (en) Parking space recognition system
CN103162691A (en) Display system, display method and display program supporting a lane change in a vehicle
US20150301727A1 (en) Map information display device, map information display method and program
EP2080668A1 (en) Driving assist device, method and computer program product for a vehicle
DE102010017931A1 (en) Gesture-actuated information systems and methods for interesting details
CN102540468A (en) Virtual cursor for road scene object lelection on full windshield head-up display
EP1840522B1 (en) Navigation device and method for operating a navigation device

Legal Events

Date Code Title Description
R163 Identified publications notified
R084 Declaration of willingness to licence