CN113296601A - Method for training and/or optimizing an occupant monitoring system - Google Patents
Method for training and/or optimizing an occupant monitoring system
- Publication number
- CN113296601A (application CN202110188644.8A)
- Authority
- CN
- China
- Prior art keywords
- monitoring system
- occupant monitoring
- vehicle
- person
- touch
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
- B60R21/01—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
- B60R21/015—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting the presence or position of passengers, passenger seats or child seats, and the related safety parameters therefor, e.g. speed or timing of airbag inflation in relation to occupant position or seat belt use
- B60R21/01512—Passenger detection systems
- B60R21/0153—Passenger detection systems using field detection presence sensors
- B60R21/01538—Passenger detection systems using field detection presence sensors for image processing, e.g. cameras or sensor arrays
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R16/00—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
- B60R16/02—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
- B60R16/037—Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for occupant comfort, e.g. for automatic adjustment of appliances according to personal settings, e.g. seats, mirrors, steering wheel
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
- B60R21/01—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
- B60R21/015—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting the presence or position of passengers, passenger seats or child seats, and the related safety parameters therefor, e.g. speed or timing of airbag inflation in relation to occupant position or seat belt use
- B60R21/01512—Passenger detection systems
- B60R21/01552—Passenger detection systems detecting position of specific human body parts, e.g. face, eyes or hands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
Abstract
The present invention relates to a method for training and/or optimizing an occupant monitoring system of a vehicle, and to such an occupant monitoring system, which is used for automatically recognizing the postures and/or gestures of a person in the vehicle. The posture and/or gesture of the person in the vehicle is detected by means of the occupant monitoring system. In addition, information about the actuation of an actuating element or the touching of a touch-sensitive element by the person in the vehicle is obtained, detected by a detection device different from the occupant monitoring system. Using the obtained information about the actuation or touch together with the information about the posture and/or gesture determined by means of the occupant monitoring system, where these pieces of information correspond to one another in time, the occupant monitoring system is trained and/or optimized with regard to the automated recognition of postures and/or gestures of persons in the vehicle.
Description
Technical Field
The present invention relates to a method for training and/or optimizing an occupant monitoring system of a vehicle, to a computing unit for carrying out the method, and to a computer program for carrying out the method.
Background
Modern vehicles offer so-called occupant monitoring systems, which can be used for safety-relevant functions, such as monitoring whether the driver has at least one hand on the steering wheel or checking whether a person is seated properly in the rear seat. They can also be used for comfort functions, such as gesture control in the vehicle.
Disclosure of Invention
According to the invention, a method for training and/or optimizing an occupant monitoring system, a computing unit for carrying out the method and a computer program for carrying out the method are proposed with the features of the independent patent claims. Advantageous embodiments are the subject matter of the dependent claims and the subsequent description.
The invention relates to the training and/or optimization of an occupant monitoring system of a vehicle which is used, or designed, for the automated recognition of postures and/or gestures of a person in the vehicle, as already briefly mentioned at the outset. Such systems for automatically recognizing postures or gestures of a person, or of body parts of a person such as arms or hands, usually have to be trained before use. To this end, such systems, which typically work with machine learning and, for example, (artificial) neural networks, can be provided with a number of images or video segments showing a certain posture or gesture. While different postures and gestures, or their recognition, can in principle be trained in this way, the accuracy of the system depends on the quality of the data provided for training. A further problem may be that a person in the vehicle behaves atypically, which can lead to misrecognition.
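The principle of learning from labeled examples of each posture can be illustrated with a deliberately tiny sketch. A real occupant monitoring system would train a neural network on images or video; here a nearest-centroid classifier over hand-crafted 2-D features stands in for it. All names, features and data are chosen purely for illustration and are not taken from the patent:

```python
from statistics import mean

def train_centroids(samples):
    """Compute one centroid per posture label from labeled feature vectors.

    `samples` maps a posture label to a list of feature vectors
    (here: simplified 2-D hand-position features).
    """
    centroids = {}
    for label, vectors in samples.items():
        # zip(*vectors) groups the values of each feature dimension.
        centroids[label] = tuple(mean(dim) for dim in zip(*vectors))
    return centroids

def classify(centroids, vector):
    """Assign the label whose centroid is closest (squared Euclidean distance)."""
    def sqdist(label):
        return sum((a - b) ** 2 for a, b in zip(centroids[label], vector))
    return min(centroids, key=sqdist)

# Toy training data for two hand postures (illustrative only).
data = {
    "hand_on_wheel":  [(0.9, 0.1), (1.0, 0.2), (0.8, 0.0)],
    "hand_on_button": [(0.1, 0.9), (0.2, 1.0), (0.0, 0.8)],
}
model = train_centroids(data)
print(classify(model, (0.85, 0.1)))  # → hand_on_wheel
```

As the sketch makes plain, the classifier is only as good as its labeled examples, which is exactly the data-quality dependence the description points out.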
One way to improve the accuracy of such systems is online or real-time training. In this case, however, it is difficult to find a posture or gesture that is exactly known, which is precisely what is necessary in order to improve the system by training.
The invention starts from this point and proposes that, for training, information is also used which comes from a detection device different from the occupant monitoring system and which enables an unambiguous determination of the posture or gesture, since it contains information about the actuation of an actuating element or the touching of a touch-sensitive element by a person in the vehicle.
In order to detect the posture and/or gesture of the person, the occupant monitoring system can have at least one of the following detection devices: a camera, a TOF (time-of-flight) sensor, a radar, an infrared sensor, a microphone and a thermal imager. Several identical detection devices and/or (arbitrary) combinations of these detection devices are also conceivable. The actuating element in the vehicle may be, for example, a rotary knob (e.g. for adjusting the radio volume), a pushbutton (e.g. on the radio or for a power window lifter) or a gear lever. It goes without saying that the method can be carried out in the same way for each of these actuating elements.
Using the information about the actuation of the actuating element or the touching of the touch-sensitive element determined by means of a detection device (which is different from the occupant monitoring system) together with the information about the posture and/or gesture of the person determined by means of the occupant monitoring system, where these pieces of information correspond to one another in time, the occupant monitoring system is trained and/or optimized with regard to the automated recognition of postures and/or gestures of persons in the vehicle. In particular, the algorithms of the occupant monitoring system for recognizing postures or gestures can be improved in this way. The specific manner in which such algorithms can be trained is not discussed in detail here; reference is made to the corresponding literature (for example "On-line Q-Learning Using Connectionist Systems" by Rummery & Niranjan, 1994).
A particular advantage of the invention consists in flexibly exploiting the fact that, during use of the vehicle, a person actuates or touches elements which in modern vehicles generally function electrically or electronically and thus provide corresponding signals, in order to provide the occupant monitoring system with data for training or optimization. In this respect it is also expedient that the element itself, one or more of its components, or a control unit assigned to the element is used as the detection device distinct from the occupant monitoring system.
Here, the information about the actuation or touching of the element preferably comprises one or more points in time and/or the manner and/or the amount of the actuation or touch. The mere information that such an element was actuated or touched at all is implicitly contained in the signals thereby generated, for example on a bus in the vehicle, and the point in time of the actuation or touch (for example in the case of a brief key press) can be used to assign it over time to the corresponding information or data detected by the occupant monitoring system.
In the case of a long key press, however, two points in time of the actuation or touch, at the beginning and at the end, can likewise be detected and used, for example in order to obtain more precise information. In the case of a rotary knob, for example, the direction of rotation (as the manner of actuation) and the rotated angle (as the amount of actuation) can be detected. In the case of a gear lever, for example, the direction in which the lever is moved can be detected. In this way, different information can be detected and provided to the occupant monitoring system in order to obtain information that is as meaningful as possible for the best possible training or optimization of the occupant monitoring system.
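The kinds of information mentioned, points in time, manner and amount, can be sketched as plain data structures. The signal shapes and the degrees-per-detent value below are assumptions for illustration; the patent does not prescribe any particular encoding:

```python
def summarize_button(events):
    """events: chronological (timestamp, state) pairs with state in
    {"pressed", "released"}. For a long key press, both the begin and the
    end point in time are of interest, as the description suggests."""
    t0 = next(t for t, s in events if s == "pressed")
    t1 = next(t for t, s in events if s == "released")
    return {"element": "pushbutton", "t0": t0, "t1": t1, "duration": t1 - t0}

def summarize_knob(steps, degrees_per_step=15):
    """steps: signed detent counts from a rotary encoder. The sign gives the
    direction of rotation (manner), the magnitude the rotated angle (amount).
    15 degrees per detent is an illustrative assumption."""
    total = sum(steps)
    return {"element": "knob",
            "direction": "cw" if total >= 0 else "ccw",
            "angle_deg": abs(total) * degrees_per_step}

press = summarize_button([(3.2, "pressed"), (4.0, "released")])
turn = summarize_knob([1, 1, 1, -1])  # net +2 detents, i.e. clockwise
print(press["t0"], turn["direction"])  # → 3.2 cw
```

Structured summaries like these are what the occupant monitoring system would receive alongside its own sensor data.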
It is also expedient for the occupant monitoring system to be trained and/or optimized with regard to the automated recognition of postures and/or gestures of a person in the vehicle using information about the position of the element in the vehicle and/or its position relative to a detection device (e.g. a camera) of the occupant monitoring system. This allows a better assignment of postures or gestures in the training data. Such position information does not necessarily have to be transmitted to the occupant monitoring system each time; since it usually does not change, it can already be stored in the occupant monitoring system and associated there with each element used. If, however, such an element is arranged on the steering wheel, information about the current steering-wheel angle can additionally be detected and transmitted to the occupant monitoring system in order to determine the position.
The computing unit according to the invention, for example a control unit of a motor vehicle (which, in addition to the mentioned detection device, may be part of an occupant monitoring system), is set up, in particular by programming, to carry out the method according to the invention.
The subject matter of the invention is also an occupant monitoring system having a computing unit according to the invention and a detection device, such as a camera, for detecting the posture and/or gesture of a person. It should also be noted that information about the actuation of the actuating element or the touching of the touch-sensitive element by a person in the vehicle is usually detected by the detection device different from the occupant monitoring system during an actuation or touch anyway, but has hitherto been used only for the function actually provided for it, i.e. the function of the actuating element or touch-sensitive element. According to the invention, however, this information is now also obtained or read by the occupant monitoring system, for example via a corresponding communication connection in the vehicle, such as a bus.
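The role of this communication connection can be sketched with a toy publish/subscribe bus. A real vehicle would use e.g. a CAN bus through its own software stack; the class and topic names here are purely illustrative stand-ins:

```python
class ToyBus:
    """Minimal publish/subscribe stand-in for a vehicle bus.

    The point of the sketch: the same actuation signal can be consumed
    both by the function it actually serves (the window lifter) and,
    additionally, by the occupant monitoring system for training.
    """
    def __init__(self):
        self._subscribers = {}

    def subscribe(self, topic, callback):
        self._subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers.get(topic, []):
            callback(message)

training_events = []  # collected by the occupant monitoring system
bus = ToyBus()
bus.subscribe("window_button", training_events.append)   # monitoring system
bus.subscribe("window_button", lambda msg: None)         # window-lifter function
bus.publish("window_button", {"state": "pressed", "t": 3.2})
print(training_events)  # → [{'state': 'pressed', 't': 3.2}]
```

The existing signal path is left untouched; the occupant monitoring system merely reads along.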
The implementation of the method according to the invention in the form of a computer program or a computer program product with program code for carrying out all method steps is also advantageous, in particular when the control device which carries out the method is also used for other tasks and is therefore always present, since this results in particularly low costs. Data carriers suitable for providing the computer program are, in particular, magnetic, optical and electronic memories, such as hard disks, flash memories, EEPROMs, DVDs and others. It is also feasible to download the program via a computer network (internet, intranet, etc.).
Further advantages and embodiments of the invention emerge from the description and the drawing.
The invention is schematically illustrated in the drawings and will be described below with reference to the drawings according to embodiments.
Drawings
Fig. 1 schematically shows a vehicle in which the method according to the invention can be carried out.
Fig. 2 shows a schematic representation of the process of the method according to the invention in a preferred embodiment.
Fig. 3 shows an image record of an occupant monitoring system.
Detailed Description
Fig. 1 schematically shows a vehicle 100 in which the method according to the invention can be carried out. The vehicle 100 contains an occupant monitoring system 110, which has a detection device, embodied by way of example as a camera 111, and an associated computing unit or control unit 112. In the vehicle 100 there is a driver or person 150, who places his hand 151 on the steering wheel 130 of the vehicle.
A pushbutton 120 for a power window lifter is also shown by way of example as an actuating element; it is connected to an associated control device 121. The control devices 112 and 121 are likewise connected to one another for data transmission or communication, for example via a bus.
Fig. 2 shows a schematic representation of the sequence of the method according to the invention in a preferred embodiment. The temporal sequence is represented by the time t. The vehicle interior, and thus in particular also the postures and/or gestures of persons present in the vehicle, is monitored by means of the occupant monitoring system or its detection device, for example the camera shown in fig. 1. Suitably, this detection 200 is performed continuously or without interruption, as also shown in fig. 2.
For example, the person now actuates an actuating element, such as the pushbutton for the power window lifter shown in fig. 1. This takes place between the points in time t0 and t1. As is typical for the pushbutton of a power window lifter, the button is held actuated or pressed for a certain length of time. Information about this is detected and communicated to the occupant monitoring system (220).
Now, in the occupant monitoring system or its computing unit, this information 220 can be used together with the temporally corresponding information or data 210, that is, for example, the image or video segments between the points in time t0 and t1, to perform training 230 of the occupant monitoring system with regard to recognizing postures or gestures.
That is, in the depicted example, the occupant monitoring system possesses images of a person in the vehicle who places his hand on the button for the power window lifter between the points in time t0 and t1. Based on the information about how long and at which point in time the button was actuated, it can be ensured, or inferred with great certainty, that this is actually the case.
From these images, the occupant monitoring system or its algorithm can therefore be trained, for example, to recognize better or more reliably a similar situation in which a person in the vehicle places his hand in this position.
Additionally, the method can, for example, also use the time ranges before t0 and after t1, which relate to the movement of the person's hand towards or away from the button.
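The temporal assignment described above can be sketched as a simple filter over timestamped frames. The frame representation, the label strings and the margin value are illustrative assumptions; the patent does not prescribe an implementation:

```python
def label_frames(frames, t0, t1, label):
    """frames: chronological (timestamp, frame) pairs from the continuous
    detection 200. Frames recorded while the element was actuated, i.e.
    with timestamps in [t0, t1], are labeled with the posture implied by
    the actuation and returned as training pairs."""
    return [(frame, label) for t, frame in frames if t0 <= t <= t1]

def transition_frames(frames, t0, t1, margin):
    """Frames shortly before t0 or shortly after t1, usable for training
    on the hand movement towards or away from the element. `margin` is an
    illustrative window length in seconds."""
    return [(t, frame) for t, frame in frames
            if t0 - margin <= t < t0 or t1 < t <= t1 + margin]

frames = [(2.9, "f0"), (3.5, "f1"), (4.0, "f2"), (4.3, "f3")]
print(label_frames(frames, 3.2, 4.0, "hand_on_button"))
# → [('f1', 'hand_on_button'), ('f2', 'hand_on_button')]
```

A usage of the second helper, `transition_frames(frames, 3.2, 4.0, 0.5)`, would here pick out the frames at 2.9 s and 4.3 s as approach and withdrawal material.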
Fig. 3 shows, purely by way of example, image recordings of an occupant monitoring system for different postures or gestures of a person in a vehicle, which can be used within the scope of the invention for training or optimizing the occupant monitoring system. In each of the four images, a photograph is shown on the right and, on the left, an abstracted representation derived from the photograph by data processing.
In image (a), the person can be seen placing his left hand on the steering wheel; in image (b), the person places both hands on the steering wheel. These postures can be confirmed by a touch-sensitive element in the steering wheel.
In image (c), the person places his right hand on the gear lever. This posture can be confirmed by touch-sensitive elements in the steering wheel and in the gear lever.
In image (d), the person places his left hand on the button for the power window lifter. This posture can be confirmed by the actuating element "pushbutton" (120).
Claims (11)
1. A method for training and/or optimizing an occupant monitoring system (110) of a vehicle (100) for automated recognition of gestures and/or postures of a person in the vehicle (100),
wherein a posture and/or a gesture of a person (150) in the vehicle (100) is detected by means of the occupant monitoring system (110),
wherein information (220) about the manipulation of a manipulation element (120) or the touching of a touch-sensitive element by a person (150) in the vehicle, detected by a detection device (120) different from the occupant monitoring system (110), is obtained, and
wherein the occupant monitoring system (110) is trained and/or optimized with respect to the automated recognition of postures and/or gestures of a person in the vehicle (100) using the obtained information (220) about the manipulation of the manipulation element (120) or the touching of the touch-sensitive element and the information (210) about the postures and/or gestures of the person (150) determined by means of the occupant monitoring system (110), wherein these pieces of information correspond to each other in time.
2. The method of claim 1, wherein the information (220) about the manipulation of the manipulation element or the touching of the touch-sensitive element comprises one or more points in time (t0, t1) and/or the manner and/or the amount of the manipulation of the manipulation element (120) or the touching of the touch-sensitive element.
3. The method according to claim 1 or 2, wherein the occupant monitoring system (110) is further trained and/or optimized with respect to the automated recognition of postures and/or gestures of a person in the vehicle (100) using information about the position of the manipulation element (120) or the touch-sensitive element in the vehicle (100) and/or its position relative to a detection device (111) of the occupant monitoring system (110).
4. The method according to any one of the preceding claims, wherein the occupant monitoring system (110) has, for detecting the posture and/or gesture of the person (150), at least one of the following detection devices: a camera (111), a TOF sensor, a radar, an infrared sensor, a microphone, and a thermal imager.
5. The method according to one of the preceding claims, wherein the operating element (120) or the touch-sensitive element or one or more components thereof or a control unit assigned to the element is used as a detection device (120) which is distinct from the occupant monitoring system (110).
6. The method according to any of the preceding claims, wherein the manipulation element (120) or the touch-sensitive element is selected from the group of: knob, key (120), gear lever, touch-sensitive, in particular capacitive sensor.
7. The method according to any one of the preceding claims, wherein the occupant monitoring system (110) is used to introduce safety measures in the vehicle (100) or for the vehicle (100) when required.
8. A computing unit (112) which is set up to carry out all method steps of a method according to one of the preceding claims.
9. Occupant monitoring system (110) with a computing unit (112) according to claim 8 and a detection device (111) for detecting gestures and/or postures of a person (150) in a vehicle.
10. A computer program which, when being executed on a computing unit (112), causes an occupant monitoring system (110) according to claim 9 to carry out all the method steps of the method according to any one of claims 1 to 7.
11. A machine readable storage medium having stored thereon a computer program according to claim 10.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102020202284.4 | 2020-02-21 | ||
DE102020202284.4A DE102020202284A1 (en) | 2020-02-21 | 2020-02-21 | Method for training and / or optimizing an occupant monitoring system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113296601A true CN113296601A (en) | 2021-08-24 |
Family
ID=77176000
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110188644.8A Pending CN113296601A (en) | 2020-02-21 | 2021-02-19 | Method for training and/or optimizing an occupant monitoring system |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113296601A (en) |
DE (1) | DE102020202284A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102022107274B4 (en) | 2022-03-28 | 2024-02-15 | BEAMOTIONS Rüddenklau Hänel GbR (vertretungsberechtigte Gesellschafter: Rene Rüddenklau, 80687 München und Christian Hänel, 85586 Poing) | System and method for gesture recognition and/or gesture control |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102008051757A1 (en) | 2007-11-12 | 2009-05-14 | Volkswagen Ag | Multimodal user interface of a driver assistance system for entering and presenting information |
US20150379362A1 (en) | 2013-02-21 | 2015-12-31 | Iee International Electronics & Engineering S.A. | Imaging device based occupant monitoring system supporting multiple functions |
US9747898B2 (en) | 2013-03-15 | 2017-08-29 | Honda Motor Co., Ltd. | Interpretation of ambiguous vehicle instructions |
DE102013012466B4 (en) | 2013-07-26 | 2019-11-07 | Audi Ag | Operating system and method for operating a vehicle-side device |
US20160378112A1 (en) | 2015-06-26 | 2016-12-29 | Intel Corporation | Autonomous vehicle safety systems and methods |
DE102017217664A1 (en) | 2017-10-05 | 2019-04-11 | Bayerische Motoren Werke Aktiengesellschaft | Determining a user's sense of a user in an at least partially autonomously driving vehicle |
- 2020-02-21: DE DE102020202284.4A patent/DE102020202284A1/en active Pending
- 2021-02-19: CN CN202110188644.8A patent/CN113296601A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
DE102020202284A1 (en) | 2021-08-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9104243B2 (en) | Vehicle operation device | |
CN108327722B (en) | System and method for identifying vehicle driver by moving pattern | |
US10417510B2 (en) | System, methods, and apparatus for in-vehicle fiducial mark tracking and interpretation | |
US9994233B2 (en) | Hands accelerating control system | |
JP6211506B2 (en) | Open / close detection device for vehicle opening / closing body | |
CN103998316A (en) | Systems, methods, and apparatus for controlling gesture initiation and termination | |
US10137857B1 (en) | Vehicle unlocking systems, devices, and methods | |
CN111347985B (en) | Automatic driving taxi | |
CN108688593B (en) | System and method for identifying at least one passenger of a vehicle by movement pattern | |
JP2006243849A (en) | Equipment control device and its method | |
CN110968184B (en) | Equipment control device | |
CN107958203B (en) | Human-machine interface with steering wheel and fingerprint sensor attached to steering wheel | |
US11789442B2 (en) | Anomalous input detection | |
US10752256B2 (en) | Method and device for controlling at least one driver interaction system | |
KR101946746B1 (en) | Positioning of non-vehicle objects in the vehicle | |
CN113296601A (en) | Method for training and/or optimizing an occupant monitoring system | |
CN110049892A (en) | The operating device of information entertainment for motor vehicles, operator for knowing this operating element place at method | |
CN108216087B (en) | Method and apparatus for identifying a user using identification of grip style of a door handle | |
KR101976497B1 (en) | System and method for operating unit support of vehicle | |
US10870436B2 (en) | Operation assistance system and operation assistance method | |
KR20150067679A (en) | System and method for gesture recognition of vehicle | |
KR102635284B1 (en) | Driver assistance system and driver assistance method | |
CN110879062A (en) | Vehicle-mounted navigation device | |
DE102017107765A1 (en) | METHOD AND APPARATUS FOR AVOIDING FEELING IN A SENSITIVE INPUT DEVICE | |
CN117222562A (en) | Method for operating a driver assistance system for longitudinal control of a vehicle with active advice function use, driver assistance system and vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||