GB2481536A - Trajectory assistance for driving manoeuvre - Google Patents
- Publication number
- GB2481536A (application GB1110751.3A / GB201110751A)
- Authority
- GB
- United Kingdom
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D15/00—Steering not otherwise provided for
- B62D15/02—Steering position indicators; Steering position determination; Steering aids
- B62D15/027—Parking aids, e.g. instruction means
- B62D15/0285—Parking performed automatically
Abstract
The invention relates to a method for supporting a driver of a motor vehicle 11 in a driving manoeuvre, especially parking. In a first step, the environment of the motor vehicle 11 is captured and a two-dimensional representation of it is indicated. In a second step, a desired end position 45 of the vehicle 11 is input by the driver, the end position 45 being marked in the two-dimensional representation. In a concluding step, a trajectory 49 for attaining the end position 45 is determined, and either instructions are output to the driver to follow the trajectory 49, or an automatic driving manoeuvre is performed in which the vehicle is manoeuvred along the trajectory 49 into the end position 45. To aid definition, the driver's fingers may translate and rotate an image on a touch screen.
Description
Title
Method for supporting a driver of a motor vehicle
Prior art
The invention relates to a method for supporting a driver of a motor vehicle in a driving manoeuvre.
Methods for supporting a driver of a motor vehicle comprise, for example, methods that support the driver in complex shunting manoeuvres, for example in parking in a parking space. In this case, parking in a parking space can be effected either forwards or in reverse. Usually, reverse parking operations are supported.
In the case of the systems known from the prior art that support the driver in driving manoeuvres, a distinction is made between those that merely indicate to the driver the distance in relation to objects in the environment of the vehicle, the indication generally being effected optically and/or acoustically, and those that indicate to the driver either necessary interventions or, also, automatically assume the steering. In addition, the longitudinal guiding of the vehicle can also be assumed.
Even in the case of systems in which the steering and, if appropriate, additionally the longitudinal guiding is assumed by the assistance system, it is advantageous to indicate to the driver distances in relation to objects in the environment of the vehicle. The indication in this case is usually effected acoustically, through repeating signal tones, the pause between two tones decreasing as a distance in relation to an object likewise decreases. If a predefined minimum distance is not maintained, a continuous tone sounds to prompt the driver to stop the vehicle.
Alternatively, or additionally, an optical indication can also be effected, for example by means of LEDs. In that case, usually, the number of LEDs that are illuminated increases as a distance decreases. Furthermore, if a predefined distance is not maintained, it is also possible to use differently coloured LEDs. Also possible, in addition to the indication by LEDs, is a two-dimensional, plan-view representation on an indicating unit of an on-board computer, for example on a monitor.
To enable the driver to be supported in the driving manoeuvre, it is necessary for the driver to communicate the planned driving manoeuvre to the assistance system. At present, this is effected, for example, by confirming to the assistance system an identified parking space.
US-A 2009/0118900 describes a method in which a suitable parking space is selected from a plurality of parking spaces by the driver directing his view towards the parking space. Selection of the parking space, however, requires an elaborate system for detecting the driver's direction of view, from which the destination position is determined.
Disclosure of the invention
Advantages of the invention

A method, according to the invention, for supporting a driver of a motor vehicle in a driving manoeuvre comprises the following steps:
(a) capturing the environment of the motor vehicle, and indicating a two-dimensional representation of the environment of the motor vehicle,
(b) inputting of a desired end position of the vehicle by the driver, the end position being marked in the two-dimensional representation,
(c) determining a trajectory for attaining the end position, and outputting instructions to the driver to follow the trajectory, or performing an automatic driving manoeuvre, in which the vehicle is manoeuvred along the trajectory into the end position.
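Steps (a) to (c) can be sketched end-to-end in miniature. The following is an illustration only: the `Pose` type, the function names and the naive straight-line interpolation are assumptions, not the patented method, which requires a kinematically feasible, obstacle-aware trajectory.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Planar vehicle pose: position in metres, heading in radians (hypothetical type)."""
    x: float
    y: float
    heading: float

def plan_trajectory(start: Pose, end: Pose, n_points: int = 5) -> list[Pose]:
    # Step (c) in caricature: interpolate linearly between the current pose
    # and the driver-selected end pose.  A real assistance system would plan
    # a drivable path (steering limits, obstacles) instead of a straight line.
    steps = n_points - 1
    return [
        Pose(
            start.x + (end.x - start.x) * i / steps,
            start.y + (end.y - start.y) * i / steps,
            start.heading + (end.heading - start.heading) * i / steps,
        )
        for i in range(n_points)
    ]

# Start at the origin facing along x; driver marks an end position ahead and to the left.
waypoints = plan_trajectory(Pose(0.0, 0.0, 0.0), Pose(4.0, 2.0, 1.57))
```

The sketch only captures the interface idea: the driver supplies a single end pose, and the system returns a sequence of intermediate poses to follow or to drive automatically.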
The method according to the invention enables the driver to easily select one from a plurality of possible destination positions, for which the necessary trajectory, along which the vehicle must be moved in order to attain this position, is then calculated. Thus, with an assistance system used to perform the method, the driver can easily communicate the end position of the vehicle, both in respect of the position and in respect of the alignment of the vehicle.
To enable the desired end position to be input, it is first necessary for the current position of the vehicle and the environment of the vehicle to be represented on a suitable indicating device. To enable the environment to be represented, it must first be captured. Any methods suitable for environment capture can be used to capture the environment. Thus, the environment can be captured, for example, by means of sensors mounted on the vehicle.
Suitable sensors that can be used for this purpose are, for example, ultrasound sensors, radar sensors, LIDAR sensors, capacitive sensors or video-based sensors. An advantage of the use of video-based sensors is that these can also be used to capture carriageway markings, in addition to objects. For this reason, video-based sensors are particularly preferred.
In addition to sensors attached to the vehicle, it is also possible, for the purpose of capturing the environment of the vehicle, to use information from digital maps, from positioning sensors for determining the current position of the vehicle, for example GPS sensor or inertial sensors, information from a data memory, in which, for example, environment data is stored, or information via telemedia services, for example communication means having transmitters in the environment.
The individual items of information can each be used singly or in combination. Particularly preferred is the combination of data captured by means of sensors attached to the vehicle, and additional information, for example from digital maps or from data stored in a data memory and captured, for example, during journeys with the vehicle.
In order that both carriageway markings and objects that constitute obstacles can be indicated in a two-dimensional representation, it is preferred to first generate a spatial image of the environment from the maps. The generation of a spatial image makes it possible to distinguish clearly between markings, objects that do not constitute an obstacle, for example kerb edges, and objects that do constitute obstacles, for example parked vehicles, moving objects, such as other road users, walls or plants. The spatial image can then be transformed into a two-dimensional representation. In order to indicate clearly to the driver which objects in the two-dimensional representation are obstacles and which objects can be driven over, it is possible, for example, to indicate differing objects in differing colours. In the two-dimensional representation indicated to the driver, both objects that constitute obstacles and objects that do not constitute obstacles are then each indicated in relation to the driver's vehicle. The indication in this case is effected, for example, via a monitor, for example a monitor of a navigation device. However, any other monitor on which the environment is indicated, and that is within the field of view of the driver, can also be used. It is preferred, however, to use only one monitor, on which any indications can be shown, as specified by the driver, for example the environment of the vehicle, in order to perform an assisted driving manoeuvre, or also representations of the navigation system, or any other data.
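The colour-coded plan view described above can be sketched as a simple classification pass over the spatial image. The object records, field names and colour choices below are invented for illustration; only the obstacle/drivable distinction comes from the text.

```python
# Hypothetical spatial-image records: a plan-view footprint rectangle
# (x, y, width, depth) plus a flag for whether the object can be driven over.
spatial_image = [
    {"name": "kerb edge",  "rect": (0.0, 5.0, 12.0, 0.2), "drivable": True},
    {"name": "parked car", "rect": (2.0, 6.0, 4.5, 1.8),  "drivable": False},
]

def plan_view(objects):
    # Collapse the spatial image into 2-D drawing instructions, indicating
    # obstacles and drivable objects in differing colours as the text suggests.
    return [
        {"rect": o["rect"], "colour": "grey" if o["drivable"] else "red"}
        for o in objects
    ]

shapes = plan_view(spatial_image)
```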
To enable a desired end position of the vehicle to be input in an unambiguous manner by the driver, it is particularly preferred if the inputting of the desired end position is effected via a touch-sensitive screen. In this case, the touch-sensitive screen also serves simultaneously as an indicating device for the two-dimensional representation of the environment. Besides the use of a touch-sensitive screen, however, it is also possible to use any other pointing device by which a desired end position can be input in a clear manner. Thus, for example, a touch-pad, a track-ball, a mouse, a multifunction pushbutton, or a gesture interpretation device can also be used. The use of a touch-sensitive screen, track-ball or multifunction pushbutton is particularly suitable in a vehicle.
If the inputting is not effected via a touch-sensitive screen, it is furthermore advantageous to provide a pointer, by which points in the indication can be marked.
The movement of the pointer is then effected via the respective input means, such as a touch-pad, track-ball, mouse or gesture interpretation device. Besides the said input devices, a voice-controlled input is clearly also conceivable.
In order to make an input relating to the desired end position of the vehicle, in particular in respect of the position of the vehicle and the alignment of the vehicle, it is possible, for example, to mark first a point at a corner on the image of the vehicle to be manoeuvred, and then a point that, after attainment of the end position, is adjacent to the vehicle to be manoeuvred. The point marked on the image of the vehicle to be manoeuvred is, for example, a front or rear corner of the vehicle. The marking of a point that is adjacent after attainment of the end position makes clear whether, in a parking operation, the vehicle is to park, for example, forwards or in reverse, and which parking space is selected.
Alternatively, besides the marking of a point on the image of the vehicle to be manoeuvred and a point that, after attainment of the end position, is adjacent to the vehicle to be manoeuvred, it is also possible for the image of the vehicle to be manoeuvred in the two-dimensional representation to be brought into the end position through being displaced in the indication. In order for the image of the vehicle to be manoeuvred to be displaced to the end position, it is possible, for example, first to mark the image of the vehicle to be manoeuvred. This can be effected, for example, by pressing a pushbutton, or also through pressure by a finger on the touch-pad or on the touch-sensitive screen. Then, in the case of a touch-pad or a touch-sensitive monitor, the finger is moved in the direction of the desired end position. In this case, the image of the vehicle to be manoeuvred follows the finger.
If another input means is used, after the image of the vehicle to be manoeuvred has been marked, it is possible, for example by moving the track-ball or by pressing the multifunction pushbutton, for the image of the vehicle to be manoeuvred to be moved in the corresponding direction. Alternatively, for example, a pointer can also be moved to the end position, and the end position marked by pressing a key-button. It is preferred, however, to move the image of the vehicle to be manoeuvred. In order that the desired alignment of the vehicle to be manoeuvred can also be input, it is possible, in the case of a touch-pad or a touch-sensitive screen, to use a second finger to rotate the image of the vehicle to be manoeuvred, while the image continues to remain marked by pressure by the first finger. Also conceivable, however, besides this design, is any other design by which the image of the vehicle to be manoeuvred can be rotated. Thus, it is also possible, for example, to mark the image of the vehicle to be manoeuvred by tapping on the touch-pad or on the touch-sensitive monitor. Once the image is marked, it is possible, for example, for the image of the vehicle to be manoeuvred to be brought to the desired end position through being moved by a finger, even without additional pressure. A movement by two fingers, for example rotation by two fingers, then also enables a rotational movement to be represented. The marking can then be undone by further tapping. The assistance system can also thereby be activated at the same time, in order to calculate suitable trajectories for attaining the selected end position.
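One plausible reading of the two-finger interaction — hold with the first finger, rotate with the second — is to interpret the change in the second finger's bearing about the first finger as a rotation of the vehicle image. The gesture mapping below is an assumption for illustration, not taken from the patent's figures.

```python
import math

def two_finger_rotate(heading, anchor, finger2_from, finger2_to):
    # The first finger holds the vehicle image at `anchor`; the change in the
    # second finger's bearing about that anchor rotates the image's heading.
    before = math.atan2(finger2_from[1] - anchor[1], finger2_from[0] - anchor[0])
    after = math.atan2(finger2_to[1] - anchor[1], finger2_to[0] - anchor[0])
    return heading + (after - before)

# Second finger sweeps a quarter turn anticlockwise around the holding finger.
new_heading = two_finger_rotate(0.0, (0.0, 0.0), (1.0, 0.0), (0.0, 1.0))
```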
An input that the image of the vehicle to be manoeuvred has been displaced to the desired end position can also be effected, however, through any other input, for example pressing of a key-button or similar.
To make it easier for the driver to displace the image of the vehicle to be manoeuvred, it is possible to rasterize the translation possibilities, i.e. the motion of the vehicle. In this case, moving of the image of the vehicle to be manoeuvred causes the centre point in each case to be moved from one raster point to an adjacent raster point.
The desired rasterization is preferably defined by the user.
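Such a rasterization amounts to snapping the centre point of the vehicle image to the nearest raster point. The function name and the grid spacing below are illustrative assumptions; the patent requires only that the centre point move between adjacent raster points.

```python
def snap_to_raster(centre, spacing):
    # Snap the vehicle image's centre point onto the nearest raster point,
    # so a drag always lands the image on the user-defined grid.
    return tuple(round(c / spacing) * spacing for c in centre)

snapped = snap_to_raster((1.3, 2.6), 0.5)  # with an assumed 0.5 m raster
```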
Since the system always allows the driver to define an end position that cannot be attained, it is preferred, in an embodiment of the invention, for information to be output to the driver if it is not possible to drive to the desired end position.
In this case, it is possible, for example, for the system to continue also to propose alternative end positions.
Otherwise it is necessary for the driver to consider an alternative end position and to repeat the input.
A device, according to the invention, for executing the method comprises means for capturing the environment of the motor vehicle, means for indicating a two-dimensional representation of the environment of the motor vehicle, means for the inputting of a desired end position of the vehicle by the driver, the end position being marked in the two-dimensional representation, and means for determining a trajectory for attaining the end position and for outputting instructions to the driver to follow the trajectory, or for performing an automatic driving manoeuvre, in which the vehicle is manoeuvred along the trajectory into the end position.
Brief description of the drawings
Exemplary embodiments of the invention are represented in the figures, and are explained more fully in the following description.
In the figures:
Figure 1 shows a flow diagram of the method according to the invention,
Figure 2 shows a three-dimensional representation of a situation in which the method according to the invention can be used,
Figure 3 shows a simplified three-dimensional representation of the situation represented in Figure 2,
Figure 4 shows a two-dimensional representation of the situation shown in Figure 2,
Figure 5 shows a two-dimensional representation of the situation according to Figure 2, with selected corners for the purpose of marking the desired end position of the vehicle,
Figure 6 shows a two-dimensional representation showing the vehicle to be manoeuvred in the selected end position,
Figure 7 shows a two-dimensional representation of the end position of the vehicle at an alternative end position,
Figure 8 shows the image of the vehicle to be manoeuvred being displaced in the two-dimensional representation by a finger,
Figure 9 shows the vehicle to be manoeuvred being rotated in the two-dimensional representation through use of two fingers,
Figure 10 shows a two-dimensional representation of the start position and end position, and a calculated trajectory for attaining the end position.
Exemplary embodiments of the invention

Figure 1 shows a flow diagram of the method according to the invention.
In a first step 1, the environment of a vehicle to be manoeuvred is captured. The capturing of the environment in this case is effected by means of sensors 1.1. The sensors in this case are usually disposed in the front region and in the rear region of the vehicle, and record an image of the environment of the vehicle. Generally, for this purpose, a signal is emitted by the sensor, and an echo of the signal, which is reflected, for example, on obstacles, is received. The distance in relation to an object can be calculated from the signal propagation time between transmission of the signal and receiving of the echo. Suitable sensors used for capturing the environment are, for example, ultrasound sensors, infrared sensors, radar sensors, LIDAR sensors or, also, video sensors.
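For the ultrasound case, the distance calculation from the propagation time is the standard pulse-echo relation: the signal travels out and back, so the one-way distance is half the round-trip time multiplied by the wave speed. The speed-of-sound constant below is an assumed value for air at roughly 20 °C.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 °C (assumed constant)

def echo_distance(round_trip_seconds):
    # The emitted signal travels to the obstacle and back, so the one-way
    # distance is half the propagation time multiplied by the wave speed.
    return SPEED_OF_SOUND * round_trip_seconds / 2.0

# A 10 ms round trip corresponds to an obstacle roughly 1.7 m away.
distance = echo_distance(0.010)
```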
Unlike ultrasound sensors, infrared sensors, radar sensors and LIDAR sensors, in the case of video sensors an image of the environment is recorded and the environment is captured through image processing. The use of such sensors for capturing the environment is already known from commercially available driving assistance systems, for example parking assistance systems.
Besides the use of sensors for capturing the environment, it is also possible, alternatively, to make use of digital maps. This happens in step 1.2. If, in step 1.2, use is made of digital maps, these are generally stored in a memory in the vehicle. In this case, for example, the memory of a navigation system is used. Alternatively, however, it is also possible for the digital maps also to be stored in any other memory medium that can be used by the driving assistance system by which the method according to the invention is executed.
In a step 1.3, use can also be made of self-positioning for the purpose of capturing the environment. For example, GPS data or inertial sensor systems can be used for self-positioning. If a self-positioning 1.3 is performed, it is generally in combination with at least one further step for capturing the environment. Thus, for example, a self-positioning 1.3 can be performed in combination with use of digital maps 1.2 in order to capture the environment.
Furthermore, it is also possible to make use of stored data in step 1.4. For example, environment data captured during a drive-past by the vehicle can be stored in this case. It is also possible, for example, to store data already captured earlier, and to make use of this data during further travel.
Besides the capturing of the environment by means of sensors, the use of digital maps, the use of stored data and the self-positioning, it is also possible to make use of telemedia data. Telemedia data is transmitted, for example, by suitable transmitters in the environment of the vehicle. Thus, for example, positioning data can be transmitted from other vehicles to a receiver on the vehicle. It is also possible, for example, for parking garages to be equipped with corresponding transmitters, which can communicate information to the vehicle to be manoeuvred. A spatial image of the environment is compiled, through an environment interpretation 3, from the environment data captured in step 1; as described above, this data can be captured by means of sensors, obtained from digital maps or from stored data, or received through the use of telemedia services, and the capturing of the environment can comprise one or more of these steps. This image can be indicated to the driver via a suitable indicating element, for example a screen. A suitable screen is, for example, a multifunction screen that can indicate differing data according to function.
Alternatively, a separate screen for the driving assistance system is also conceivable. The representation of the environment is effected in a third step, step 5. The representation of the environment in this case can be effected either three-dimensionally or two-dimensionally.
A two-dimensional, plan-view representation is preferred, the vehicle to be manoeuvred being shown, in the representation, in the captured environment.
In a fourth step, step 7, the driver of the vehicle selects an end position to be attained through the driving manoeuvre to be performed. In a fifth step, step 9, at least one suitable trajectory is calculated by the driving assistance system on the basis of the start position that has been captured together with the capture of the environment, and on the basis of the selected end position.
The trajectory can then likewise be indicated, for example, in the representation of the environment. In order to attain the end position from the start position, in a first embodiment the control of the vehicle is then assumed by the driving assistance system. In this case, the driving assistance system controls both the steering settings and the longitudinal guiding of the vehicle. Alternatively, it is also possible for only the steering settings to be assumed by the driving assistance system, and for the longitudinal guiding, i.e. accelerating, maintaining speed and braking of the vehicle, to continue to be performed by the driver. Alternatively, it is also possible for the driver himself to effect the steering settings and longitudinal guiding, the necessary steering interventions being indicated on a suitable indicating device. The steering interventions in this case can be indicated, for example, optically, acoustically and/or haptically.
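One way the steering prompts in the driver-guided variant could be generated is by thresholding the signed curvature of the upcoming trajectory segment. The dead-band value, the sign convention and the prompt wording below are invented for illustration; the patent leaves the form of the optical, acoustic or haptic indication open.

```python
def steering_instruction(curvature, dead_band=0.01):
    # Signed curvature in 1/m, positive for a left turn (assumed convention).
    # Small curvatures inside the dead band are treated as driving straight.
    if curvature > dead_band:
        return "steer left"
    if curvature < -dead_band:
        return "steer right"
    return "hold straight"

prompts = [steering_instruction(k) for k in (0.2, 0.0, -0.2)]
```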
Shown in Figure 2 is a three-dimensional representation of a situation in which the method according to the invention can be used.
A vehicle 11 to be manoeuvred moves along a road 12. The road is bounded on one side by transverse parking spaces 13 and, on the other side, by longitudinal parking spaces 15.
Some of the transverse parking spaces 13 and longitudinal parking spaces 15 are occupied by vehicles 17. Between or adjacent to each of the vehicles 17, however, there are also unoccupied single parking spaces or, also, a plurality of parking spaces. A driver of the vehicle 11 to be manoeuvred, wishing to park the vehicle in a free transverse parking space 13 or free longitudinal parking space 15, can now select a free parking space, in order to park in the latter. The parking manoeuvre is a driving manoeuvre in which the driver of the vehicle 11 to be manoeuvred can be supported by the method according to the invention.
In order for the driver of the vehicle 11 to be manoeuvred to be supported in the driving manoeuvre, the environment of the vehicle 11 to be manoeuvred is captured. The capturing can be effected, for example, by means of sensors. As already described above, the corresponding sensors are usually located in the front region and in the rear region of the vehicle 11 to be manoeuvred. A signal 19 is emitted by the sensors. In the situation represented in Figure 2, the signal 19 is reflected from an obstacle 21 in the form of a forklift truck. The reflected signal is received as an echo by the sensor. The distance in relation to the obstacle 21 can then be determined from the propagation time between transmitting of the signal and receiving of the echo. In addition, it is possible for the vehicle 11 to be manoeuvred to comprise a receiver for telemedia services. This receiver receives data transmitted by corresponding transmitters. In this way, communication is made possible between the vehicle 11 to be manoeuvred and the environment, and direct data from the environment can be received via the telemedia services. Thus, it is possible, for example, for the vehicle 11 to be manoeuvred to communicate with vehicles in the environment that have corresponding transmitters, and in this manner to identify where a vehicle is parked. Furthermore, as described previously, the environment can also be captured through use of digital maps or through use of stored data. It is particularly preferred if video sensors are used for capturing the environment, since these can be used to capture not only three-dimensional objects, but also, for example, the carriageway markings with which the transverse parking spaces 13 and longitudinal parking spaces 15 represented in Figure 2 are marked.
From the received data relating to the environment of the vehicle 11 to be manoeuvred, a simplified spatial image of the environment is compiled in a subsequent step 5. This is shown in Figure 3. In the simplified representation of the environment, spatial objects can be represented, for example, as simple blocks. The vehicle to be manoeuvred is also shown as a block 25. If the simplified three-dimensional representation is to be indicated to the driver, it is possible for the block 25 for the vehicle 11 to be manoeuvred to be highlighted in a different colour or labelled accordingly, in order that the driver of the vehicle 11 to be manoeuvred can distinguish between his own vehicle and objects that are obstacles.
If carriageway markings are also identified through the capturing of the environment, these markings can likewise be represented, for example as lines 27.
Represented in Figure 4 is a two-dimensional representation of the situation shown in Figure 2. In the two-dimensional representation in Figure 4, the respective objects are represented, in a simplified manner, as rectangles 29.
Here also, it is possible for the vehicle to be manoeuvred, likewise represented as a rectangle 31, to be highlighted, for example, by representation in a different colour or by labelling.
In order to define a possible end position for the vehicle 11 to be manoeuvred, it is possible, in a first embodiment, for example to mark an edge of the vehicle 33 to be manoeuvred, and to mark a point 35 on an object that, after attainment of the end position, is adjacent to the vehicle to be manoeuvred. The edge of the vehicle to be manoeuvred and the object that, after attainment of the end position, is adjacent to the vehicle to be manoeuvred, can be marked, for example, by touching the corresponding edges on a touch-sensitive screen. Alternatively, it is also possible, for example, to move a pointer to the corresponding position by means of a track-ball or a multifunction pushbutton, and to effect marking, for example, by pressing a key-button. Furthermore, it is also possible to move such a pointer, for example, by means of a touch-pad. The use of a touch-sensitive screen is preferred, however, the marking being effected directly through tapping on the touch-sensitive screen. It is advantageous in this case if the desired edge of the vehicle to be manoeuvred is touched first, and then the edge adjacent to the end position. However, marking in the reverse sequence is also possible. After the edge of the driver's vehicle and the edge of the object that, after attainment of the end position, is adjacent to the vehicle to be manoeuvred have been marked, a driving assistance system used to support the driver calculates a suitable trajectory by which the vehicle 11 to be manoeuvred can be brought into the defined end position. This is shown in Figure 6 for the markings represented in Figure 5.
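The two-point input can be sketched geometrically: place the marked vehicle corner onto the marked adjacent point and recover the implied centre of the end position. The corner-offset representation and the assumption that the end alignment matches the current one are simplifications for illustration; the patent leaves the exact geometry open.

```python
def end_centre_from_marks(corner_offset, adjacent_point):
    # `corner_offset` is the marked corner's position relative to the vehicle
    # centre (e.g. (half_length, half_width) for the right front corner);
    # the end-position centre places that corner onto the marked adjacent point.
    return (adjacent_point[0] - corner_offset[0],
            adjacent_point[1] - corner_offset[1])

# Right front corner of a 4.5 m x 1.8 m vehicle marked against a point at (10, 5).
centre = end_centre_from_marks((2.25, 0.9), (10.0, 5.0))
```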
It is advantageously possible in this case, after the marking of the edge on the vehicle 33 to be manoeuvred and of the edge 35 of the object that is adjacent after attainment of the end position, for the end position of the vehicle to be manoeuvred to be indicated to the driver on the indicating unit, as shown in Figure 6. This enables the driver to see whether the driving assistance system has correctly interpreted the driver's wish. Should this not be the case, it is possible to re-input the end position.
Otherwise, the driver can request support in the driving manoeuvre, for example by pressing a confirmation key-button.
An alternative end position is represented in Figure 7. In this case, the adjacent point is the right-hand marking of the longitudinal parking space 15. The end position of the vehicle 11 to be manoeuvred is thus located in a free longitudinal parking space 15, and not in a free transverse parking space 13, as in Figure 6. By marking the edge of the corresponding longitudinal parking space, the driver of the vehicle 11 to be manoeuvred can select his desired parking place.
Clearly, the driver of the vehicle 11 to be manoeuvred can also select any other possible end position, besides the end positions represented in Figures 6 and 7. Instead of the right front corner of the vehicle to be manoeuvred, it is also possible to select any other corner of the vehicle to be manoeuvred. In this case, the edge adjacent to the end position of the vehicle to be manoeuvred must then also be selected accordingly.
An alternative possibility for selection is represented in Figures 8 to 10.
Thus, it is possible, for example, particularly in the case of use of a touch-sensitive screen, for the vehicle 11 to be manoeuvred to be marked through pressing with a finger 37 on the touch-sensitive screen. By moving the finger 37 marking the vehicle 11 to be manoeuvred, the vehicle 11 to be manoeuvred, represented as a rectangle 31, can be moved in the representation. As represented by arrows 39, movement in this case is possible forwards, backwards, to the right and to the left. In order additionally to select the desired alignment of the vehicle, it is possible, for example through use of a second finger 41, to rotate the vehicle to be manoeuvred, represented as a rectangle. In this case, the rectangle 31 for the vehicle to be manoeuvred continues to be marked by means of the finger 37, and the second finger 41 is used to effect a rotational movement, according to which the rectangle 31 for the vehicle to be manoeuvred is rotated. The rotating is shown in Figure 9 by rotation arrows 43.
Through further displacement by the finger 37, the vehicle can then be brought, for example, into a desired end position 45, as represented in Figure 10.
Besides the end position 45, Figure 10 also shows, represented by broken lines, the start position 47 and a trajectory 49, along which the vehicle can be moved out of the start position 47, into the end position 45.
After the selection of the desired end position 45 — which, as an alternative to the example in Figures 9 and 10, can also be performed in a corresponding manner not by means of a touch-sensitive screen but, for example, on a touch-pad — the trajectory 49 is calculated, provided that it is possible to drive to the desired end position 45. Should the driver of the vehicle 11 to be manoeuvred have selected an end position that cannot be attained, this is preferably indicated to the driver.
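The calculation of the trajectory 49 and the attainability check can be sketched as follows. This is a deliberately naive straight-line sketch under assumed names: a real planner would respect the vehicle's turning circle and the sensed environment, whereas here obstacles are simply axis-aligned rectangles and the path is sampled along a line from start to end position.

```python
def plan_trajectory(start, end, obstacles, steps=50):
    """Sample a straight-line trajectory from start to end (x, y) positions.

    Returns the list of sampled points if every sample clears all obstacle
    rectangles (x0, y0, x1, y1); returns None if the end position cannot be
    attained along this path, so the system can indicate this to the driver.
    Illustrative sketch only; not the patent's actual planning method.
    """
    def blocked(p):
        return any(x0 <= p[0] <= x1 and y0 <= p[1] <= y1
                   for (x0, y0, x1, y1) in obstacles)

    trajectory = []
    for i in range(steps + 1):
        t = i / steps
        p = (start[0] + t * (end[0] - start[0]),
             start[1] + t * (end[1] - start[1]))
        if blocked(p):
            return None  # selected end position is not attainable
        trajectory.append(p)
    return trajectory
```

With no obstacles the call returns the sampled trajectory; with a parked vehicle modelled as a rectangle across the path, it returns None, which corresponds to the indication to the driver that the chosen end position cannot be reached.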
After the trajectory 49 has been calculated, the driving manoeuvre for attaining the end position 45 is then either performed by the driving assistance system or, alternatively, prompts are given to the driver of the vehicle 11 to be manoeuvred, in order that the driver, through corresponding steering instructions and, if appropriate, instructions for the longitudinal guiding of the vehicle, can move independently into the end position 45. It is also possible, for example, for the steering setting to be assumed by the driving assistance system, but for the longitudinal guiding to remain with the driver of the vehicle 11 to be manoeuvred.
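The three ways of executing the manoeuvre described above — fully automatic execution, prompts to the driver, and shared control where the system steers while the driver retains longitudinal guiding — can be sketched as a simple dispatch of the trajectory follower's commands. The mode names and return structure are assumptions for illustration only.

```python
from enum import Enum

class AssistMode(Enum):
    FULLY_AUTOMATIC = 1  # system performs steering and longitudinal guiding
    PROMPTS_ONLY = 2     # driver acts on steering and speed instructions
    STEERING_ONLY = 3    # system steers; longitudinal guiding stays with driver

def control_step(mode, steering_cmd, speed_cmd):
    """Split one step of trajectory-following commands between system and driver."""
    if mode is AssistMode.FULLY_AUTOMATIC:
        return {"system": {"steering": steering_cmd, "speed": speed_cmd},
                "driver": {}}
    if mode is AssistMode.STEERING_ONLY:
        return {"system": {"steering": steering_cmd},
                "driver": {"speed": speed_cmd}}
    # PROMPTS_ONLY: both commands are presented to the driver as instructions.
    return {"system": {},
            "driver": {"steering": steering_cmd, "speed": speed_cmd}}
```

In the shared mode, for instance, the steering command goes to the assistance system while the speed command is presented to the driver, matching the split of steering and longitudinal guiding described above.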
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102010030463A DE102010030463A1 (en) | 2010-06-24 | 2010-06-24 | Method for assisting a driver of a motor vehicle |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201110751D0 GB201110751D0 (en) | 2011-08-10 |
GB2481536A true GB2481536A (en) | 2011-12-28 |
Family
ID=44485119
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1110751.3A Withdrawn GB2481536A (en) | 2010-06-24 | 2011-06-23 | Trajectory assistance for driving manoeuvre |
Country Status (5)
Country | Link |
---|---|
JP (1) | JP2012006590A (en) |
DE (1) | DE102010030463A1 (en) |
FR (1) | FR2963915A1 (en) |
GB (1) | GB2481536A (en) |
IT (1) | ITMI20111115A1 (en) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102012207644A1 (en) * | 2012-05-08 | 2013-11-14 | Bayerische Motoren Werke Aktiengesellschaft | User interface for driver assistance system in motor vehicle, is adapted to detect movement of finger of user in vehicle transverse direction as user request, and state of automatic driving is activated and deactivated by user interface |
KR101401399B1 (en) * | 2012-10-12 | 2014-05-30 | 현대모비스 주식회사 | Parking Assist Apparatus and Parking Assist Method and Parking Assist System Using the Same |
DE102012222972A1 (en) * | 2012-12-12 | 2014-06-12 | Robert Bosch Gmbh | Method for determining trajectory of driving maneuver, involves inputting symbol on touch-sensitive display device by user, where target pose is recognized depending on input symbol |
DE102013209853A1 (en) | 2013-05-27 | 2014-11-27 | Robert Bosch Gmbh | Parking assistance system and method for performing a semi-automatic parking or Rangiervorgangs a vehicle |
DE102013221201A1 (en) | 2013-10-18 | 2015-05-07 | Robert Bosch Gmbh | Method for assisting a driver when parking |
DE102014107302A1 (en) * | 2014-05-23 | 2015-11-26 | Valeo Schalter Und Sensoren Gmbh | Method for the at least semi-autonomous maneuvering of a motor vehicle along a user-definable driving trajectory and driver assistance device and motor vehicle |
DE102016211179A1 (en) * | 2015-09-08 | 2017-03-09 | Volkswagen Aktiengesellschaft | A method and apparatus for performing automated driving of a vehicle along a provided trajectory |
DE102015121504A1 (en) * | 2015-12-10 | 2017-06-14 | Valeo Schalter Und Sensoren Gmbh | A method for detecting a longitudinal parking space for parking a motor vehicle based on a road marking, driver assistance system and motor vehicle |
DE102016003308B3 (en) * | 2016-03-17 | 2017-09-21 | Audi Ag | Method for operating a driver assistance system of a motor vehicle and motor vehicle |
JP6681604B2 (en) * | 2016-05-31 | 2020-04-15 | パナソニックIpマネジメント株式会社 | Driving support device, driving support system, driving support method, driving support program, and autonomous driving vehicle |
JP6992563B2 (en) * | 2018-02-08 | 2022-01-13 | 株式会社デンソー | Parking support device |
CN110386135B (en) * | 2018-04-16 | 2021-03-26 | 比亚迪股份有限公司 | Automatic parking control device, system, vehicle and method |
DE102018109478A1 (en) * | 2018-04-20 | 2019-10-24 | Valeo Schalter Und Sensoren Gmbh | Determining a parking position |
DE102019219021A1 (en) * | 2019-12-06 | 2021-06-10 | Robert Bosch Gmbh | Method and apparatus for exchanging maneuver information between vehicles |
DE102021131095A1 (en) | 2021-11-26 | 2023-06-01 | Bayerische Motoren Werke Aktiengesellschaft | Method and driver assistance system to support a driver when driving along a trajectory with correction moves |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040260439A1 (en) * | 2003-04-11 | 2004-12-23 | Toyota Jidosha Kabushiki Kaisha | Parking assist apparatus and parking assist method for vehicle |
US20050049767A1 (en) * | 2003-08-28 | 2005-03-03 | Tomohiko Endo | Parking assist apparatus |
US20060190147A1 (en) * | 2003-07-10 | 2006-08-24 | Wei-Chia Lee | Driver-assist device, in particular, for parking a vehicle |
US20090085771A1 (en) * | 2007-09-27 | 2009-04-02 | Jui-Hung Wu | Auto-parking device |
EP2055536A1 (en) * | 2006-12-12 | 2009-05-06 | Toyota Jidosha Kabushiki Kaisha | Parking support device |
WO2009147920A1 (en) * | 2008-06-03 | 2009-12-10 | アイシン精機株式会社 | Parking support device |
WO2010043944A1 (en) * | 2008-10-14 | 2010-04-22 | Toyota Jidosha Kabushiki Kaisha | Parking assistance apparatus and control method thereof |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1950098A1 (en) | 2005-11-17 | 2008-07-30 | Aisin Seiki Kabushiki Kaisha | Parking assisting device and parking assisting method |
2010
- 2010-06-24 DE DE102010030463A patent/DE102010030463A1/en not_active Withdrawn

2011
- 2011-06-21 IT IT001115A patent/ITMI20111115A1/en unknown
- 2011-06-22 FR FR1155481A patent/FR2963915A1/en not_active Withdrawn
- 2011-06-23 JP JP2011139714A patent/JP2012006590A/en not_active Withdrawn
- 2011-06-23 GB GB1110751.3A patent/GB2481536A/en not_active Withdrawn
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014070886A1 (en) * | 2012-10-30 | 2014-05-08 | Robert Bosch Gmbh | System and method for using gestures in autonomous parking |
US9656690B2 (en) | 2012-10-30 | 2017-05-23 | Robert Bosch Gmbh | System and method for using gestures in autonomous parking |
US10435033B2 (en) | 2015-07-31 | 2019-10-08 | Panasonic Intellectual Property Management Co., Ltd. | Driving support device, driving support system, driving support method, and automatic drive vehicle |
CN105118321A (en) * | 2015-09-30 | 2015-12-02 | 上海斐讯数据通信技术有限公司 | Intelligent extraction method and system for vehicle and vehicle |
CN105118321B (en) * | 2015-09-30 | 2017-06-16 | 上海斐讯数据通信技术有限公司 | A kind of intelligent extract method of the vehicles, system and the vehicles |
Also Published As
Publication number | Publication date |
---|---|
JP2012006590A (en) | 2012-01-12 |
ITMI20111115A1 (en) | 2011-12-25 |
GB201110751D0 (en) | 2011-08-10 |
DE102010030463A1 (en) | 2011-12-29 |
FR2963915A1 (en) | 2012-02-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
GB2481536A (en) | Trajectory assistance for driving manoeuvre | |
US11703852B2 (en) | Vehicle remote instruction system | |
US9605971B2 (en) | Method and device for assisting a driver in lane guidance of a vehicle on a roadway | |
US20110082613A1 (en) | Semiautomatic parking machine | |
KR101806619B1 (en) | Parking guide apparatus and method in vehicle | |
US9881502B2 (en) | Method for assisting a driver of a motor vehicle | |
EP3389026A1 (en) | Apparatus and method for road vehicle driver assistance | |
CN106922195A (en) | Method, driver assistance system and motor vehicles for generating the surrounding environment map of the surrounding area of motor vehicles | |
JP7121714B2 (en) | vehicle control system | |
WO2019181260A1 (en) | Parking assistance device | |
JP4566596B2 (en) | Operation instruction device | |
CN113393697B (en) | Parking information management server, parking assistance device, and parking assistance system | |
CN112124092B (en) | Parking assist system | |
CN113525337A (en) | Parking position identification system and parking auxiliary system comprising same | |
US10482667B2 (en) | Display unit and method of controlling the display unit | |
US20220308345A1 (en) | Display device | |
CN112977257B (en) | Display device and parking assistance system for vehicle | |
CN112124090A (en) | Parking assist system | |
CN112977419B (en) | Parking assist system | |
JP7492369B2 (en) | Parking assistance device and parking assistance method | |
WO2021162001A1 (en) | Parking assistance device and parking assistance method | |
US20230286495A1 (en) | Control device, control method, and computer-readable recording medium | |
US20230286526A1 (en) | Control device, control method, and computer-readable recording medium | |
CN112977417B (en) | Parking assist system | |
US20240190415A1 (en) | Parking assist device, control method, and non-transitory storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |