CN112389325A - Motor vehicle - Google Patents

Motor vehicle

Info

Publication number
CN112389325A
CN112389325A (application CN202010799492.0A)
Authority
CN
China
Prior art keywords
image display
vehicle
control device
partial image
motor vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010799492.0A
Other languages
Chinese (zh)
Other versions
CN112389325B (en)
Inventor
M·吕布克
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Audi AG
Original Assignee
Audi AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi AG filed Critical Audi AG
Publication of CN112389325A publication Critical patent/CN112389325A/en
Application granted granted Critical
Publication of CN112389325B publication Critical patent/CN112389325B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/27 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Arrangement of adaptations of instruments
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/102 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using 360 degree surveillance camera system
    • B60R2300/105 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
    • B60R2300/30 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/301 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with other obstacle sensor information, e.g. using RADAR/LIDAR/SONAR sensors for estimating risk of collision
    • B60R2300/303 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
    • B60R2300/304 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
    • B60R2300/60 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
    • B60R2300/607 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint

Abstract

The invention relates to a motor vehicle, comprising: a plurality of image recording devices (2) arranged in a distributed manner on the vehicle for capturing images of the surroundings of the vehicle; a control device (3) for generating, from these images, an image display (7) showing the 360° surroundings around the motor vehicle; and a display device (4) for displaying the image display (7), wherein the control device (3) is designed to generate an additional partial image display (8), showing the area below the motor vehicle (1), from images captured temporally before the images underlying the displayed image display (7), and to integrate this partial image display into the displayed image display (7). At least one sensor device (5) is provided which detects the area below the motor vehicle (1) and communicates with the control device (3), wherein the overall image display formed by the image display (7) and the partial image display (8) can be changed, at least in the region of the partial image display (8), as a function of the sensor information.

Description

Motor vehicle
Technical Field
The invention relates to a motor vehicle, comprising: a plurality of image capturing devices arranged in a distributed manner on the vehicle for capturing images of the surroundings of the vehicle; a control device for generating, from these images, an image display showing the 360° surroundings around the motor vehicle; and a display device for displaying the image display, wherein the control device is designed to generate an additional partial image display, showing the area under the motor vehicle, from images captured temporally before the images underlying the displayed image display, and to merge this partial image display into the displayed image display in order to output an overall image display.
Background
Modern motor vehicles offer the possibility of generating, on demand, an image display showing the environment surrounding the motor vehicle over 360° and outputting it on a suitable display. Systems providing this capability are often referred to as "top view systems" or "surround view systems" or the like. Typically, such a system uses four cameras: one each at the front and rear of the vehicle and one on each side, for example on the exterior rear-view mirrors. Each camera supplies individual images or image data, which are acquired by the control device and computed against one another in order to generate and output a virtual top view, i.e. a two-dimensional view of the motor vehicle and its surroundings from above, or a virtual three-dimensional view with a three-dimensional vehicle model as self-representation and an optionally freely selectable viewing angle. Such a surround image can therefore present the complete vehicle surroundings to the driver, as far as the respective cameras capture them laterally.
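The four-camera composition described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes each camera image has already been rectified onto the ground plane (a real system would first apply calibrated per-camera homographies), and the canvas size and strip depth are made-up values.

```python
import numpy as np

def compose_top_view(front, rear, left, right, canvas_hw=(40, 40)):
    """Composite four pre-rectified camera strips into one bird's-eye canvas.

    The vehicle footprint in the centre stays empty (zeros), because no
    camera sees the ground directly under the car -- this is the "ground
    plane" gap the patent is concerned with.
    """
    h, w = canvas_hw
    canvas = np.zeros((h, w), dtype=np.uint8)
    s = 10  # strip depth in canvas pixels (assumed calibration value)
    canvas[:s, :] = front[:s, :w]          # ground ahead of the vehicle
    canvas[-s:, :] = rear[:s, :w]          # ground behind the vehicle
    canvas[s:-s, :s] = left[: h - 2 * s, :s]    # left flank
    canvas[s:-s, -s:] = right[: h - 2 * s, :s]  # right flank
    return canvas
```

The untouched zero block in the middle is exactly the region that the buffered "ground plane" partial image display described below is meant to fill.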
The currently captured images enable a display of the current vehicle surroundings to the side of the vehicle that can be viewed approximately "live". This is not the case for the area under the vehicle, however, since no camera is arranged there. The systems known to date mostly display this area either black or shaded, or as a rectangle colored to match the surroundings. This region is also referred to as the "ground plane"; in the longitudinal direction of the vehicle it is sometimes larger than the vertical projection of the vehicle, for example in passenger cars, because the front and rear cameras detect the ground only at a certain distance in front of or behind the vehicle: the respective bumper limits the downward line of sight, leaving a blind spot.
In addition to displaying this region as a colored or shaded rectangle, it is also known to fill the region, i.e. the "ground plane", with sampled image or video data, i.e. to create a partial image display from images captured temporally before the currently captured images underlying the displayed image display, and to merge this partial image display into the image display. That is to say, image data captured in front of or behind the vehicle, depending on the direction of travel, are buffered in order to reconstruct from them, approximately, the current surroundings below the vehicle and to merge the result in as a partial image display. The driver thus sees an overall image display composed of an approximately current image display, created from the most recently captured images, and a calculated, sampled partial image display derived from temporally earlier image data. Filling the "ground plane" with such sampled data is known, for example, from DE 10 2016 208 369 A1.
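The buffering scheme just described can be sketched as an odometry-keyed ring buffer: a frame captured roughly one vehicle length ago shows the ground that is now under the car. The vehicle length, buffer capacity, and matching tolerance below are illustrative assumptions, not values from the patent.

```python
from collections import deque

class GroundPlaneBuffer:
    """Ring buffer of previously captured ground images, keyed by odometry."""

    def __init__(self, vehicle_length_m=4.5, capacity=32):
        self.vehicle_length_m = vehicle_length_m
        self.frames = deque(maxlen=capacity)  # (odometer_m, image) pairs

    def push(self, odometer_m, image):
        """Store a frame together with the odometer reading at capture time."""
        self.frames.append((odometer_m, image))

    def frame_under_vehicle(self, odometer_m):
        """Return the buffered frame whose capture position best matches the
        ground currently under the vehicle, or None if nothing fits."""
        target = odometer_m - self.vehicle_length_m
        best = None
        for pos, img in self.frames:
            if best is None or abs(pos - target) < abs(best[0] - target):
                best = (pos, img)
        # Reject matches too far off to plausibly show the ground plane.
        if best is None or abs(best[0] - target) > self.vehicle_length_m / 2:
            return None
        return best[1]
```

In a real system the images would additionally be warped according to the steering path driven since capture; here the positional lookup alone illustrates the "sampling" idea.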
It is also known in this context to render the three-dimensional vehicle model (semi-)transparently when required, thus allowing a view under and behind the merely schematically illustrated vehicle model. This is expedient when driving on difficult ground, when the position of a charging pad for an electric or hybrid vehicle is to be determined below the vehicle floor, or generally in order to improve the view of objects located behind the vehicle in a 3D scene, for example from the point of view of a virtual camera.
When the "ground plane" is filled with buffered image data in this way, i.e. by inserting such a partial image display and, if appropriate, a transparent vehicle display, the problem arises that the availability of the buffered data is low: the buffered data must be deleted again after a short time, whether during slow driving or at standstill, because it is not known whether previously invisible objects have moved under the vehicle in the meantime. If the vehicle has, for example, been driven into a garage or parked, a ball, a cat or a child may have ended up under the vehicle; this risk arises from an excessive time offset between the images used to create the partial image display and the current display time.
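The time-based invalidation that limits prior-art systems can be sketched as a simple age check; the age limits below are illustrative assumptions chosen to show why availability is low, especially at standstill.

```python
def cache_is_trustworthy(capture_time_s, now_s, vehicle_stopped,
                         max_age_s=5.0, max_age_stopped_s=1.0):
    """Prior-art style check: without an underbody sensor, a buffered
    ground-plane frame may only be shown while it is very fresh, because
    an object could have moved under the vehicle unnoticed. Both age
    limits are made-up example values."""
    limit = max_age_stopped_s if vehicle_stopped else max_age_s
    return (now_s - capture_time_s) <= limit
```

The invention replaces this pessimistic timeout with an actual sensor check of the space under the vehicle, as described in the following section.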
Disclosure of Invention
The problem addressed by the present invention is therefore to provide a motor vehicle which is improved in relation thereto.
In order to solve this problem, according to the invention, in a motor vehicle of the type mentioned at the outset, at least one sensor device is provided which detects the area below the motor vehicle and communicates with the control device, wherein, depending on the sensor information, the overall image display, composed of the image display and the partial image display, can be changed at least in the region of the partial image display.
According to the invention, it is proposed to monitor the area underneath the motor vehicle by means of at least one sensor device, i.e. to determine whether the area is free of obstacles or objects or whether an object is located under the vehicle at the time of monitoring. The buffered image data are then used, depending on the detection result, to generate the partial image display and output the overall image display. If no object is detected below the vehicle, the partial image display showing the "ground plane" can be created even from images or image data recorded significantly earlier in time: the usability of the partial image display is ultimately enabled, or switched on, by the sensor detection, since it is thereby ensured that the situation below the vehicle has not changed between the time at which the "old", sampled image data were recorded and the current time. Such sensor monitoring of the space under the vehicle therefore significantly improves the availability and usability of the earlier sampled image data.
If, on the other hand, the sensor device detects an object below the vehicle, for example a ball or a toy, the previously captured images cannot be used for the "ground plane" display, or only to a limited extent, which results in a changed representation of the overall image display, at least in the region of the partial image display, compared with the case in which no obstacle is detected. This changed image display makes the driver aware that he cannot rely on the inserted partial image display, or that he must in any case check the area under the vehicle. The driver thus obtains information about the obstacle or object situation under the vehicle through the changed overall image display or the changed partial image display.
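The decision logic of the two preceding paragraphs can be summarized in a small state map. The mode names and the `keep_marked` switch (selecting between the patent's two alternatives: marking the stale partial image versus suppressing it in favor of a warning) are naming choices of this sketch, not terms from the patent.

```python
from enum import Enum

class GroundPlaneMode(Enum):
    SHOW_CACHED = "show cached partial image display"
    MARK_CACHED = "show cached partial image display with optical marking"
    SUPPRESS = "suppress partial image display, show warning instead"

def select_ground_plane_mode(object_detected, cached_frame_available,
                             keep_marked=True):
    """Map the underbody sensor result onto the display behaviour:
    sensor clear -> cached frame may be used; object detected -> either
    mark the cached frame optically or suppress it entirely."""
    if not cached_frame_available:
        return GroundPlaneMode.SUPPRESS
    if object_detected:
        return (GroundPlaneMode.MARK_CACHED if keep_marked
                else GroundPlaneMode.SUPPRESS)
    return GroundPlaneMode.SHOW_CACHED
```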
Preferably, the sampling runs continuously, so that the "ground plane" can always be filled, for example in a top-view display; the sampling mode is therefore always active. If no sampling has yet taken place, the driver can, for example, be offered the choice of whether he wishes a 360° surroundings display. If he selects this display option, the system can also automatically switch into the "ground plane" display mode, i.e. the partial image display is switched on automatically, with parallel sensor monitoring where possible. Alternatively, the display option for the "ground plane" can also be selectable separately from the 360° surroundings display.
The central element is the at least one sensor device for detecting the area under the vehicle. Such a sensor device may be installed as an additional system component for the image display system according to the invention. Alternatively, sensor devices already provided for and associated with other assistance systems may be used. Such underbody sensors are already provided, for example, in vehicles designed for (partially) automated driving, where they are used, among other things, to perform a clearance check before automatically pulling away. One or more such sensors can now additionally be integrated into the 360° image display system and thus serve the further purpose of determining the usability of older, buffered image data for deriving a "ground plane" image display. In principle, any suitable sensor device can be used, such as an ultrasonic sensor or an optical sensor, this list being neither exhaustive nor limiting.
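For the ultrasonic variant mentioned above, a clearance check can be sketched as follows: downward-facing sensors expect an echo from roughly the known ground distance, and a markedly shorter echo indicates an object. The floor height and tolerance are illustrative assumptions.

```python
def underbody_clear(echo_distances_m, floor_height_m=0.18, tolerance_m=0.03):
    """Interpret downward-facing ultrasonic echoes under the vehicle.

    If every echo returns from approximately the expected ground distance
    (floor_height_m), the space under the car is considered free; an echo
    significantly shorter than that indicates an object. Values are
    made-up example numbers, not sensor specifications."""
    return all(d >= floor_height_m - tolerance_m for d in echo_distances_m)
```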
If an object is detected under the vehicle, different ways of using this information can be considered. According to a first alternative of the invention, the control device may be designed to delete the images intended for the partial image display and to show the image display without the partial image display when an object is detected under the vehicle. In this alternative, the buffered image data intended for the partial image display are deleted upon object detection, i.e. no partial image display is generated, because these image data do not correspond to the actual situation under the vehicle. The image display is then output without the partial image display. The driver can be informed of the presence of an object under the vehicle in this way; if necessary, instead of the partial image display, text or a warning symbol indicating an obstacle, i.e. warning information, can be inserted into the image area provided for the partial image display, and acoustic warning information can additionally be output on the basis of the sensor detection. Conversely, if no object is detected, an image display with a partial image display created from the sampled image data is of course output. The deletion need not occur immediately upon object detection, however. Rather, it is also conceivable that, after the object detection, it is first checked whether and how the situation below the vehicle has changed spatially or temporally, in order, for example, to recognize that the vehicle is merely driving over a charging pad, which one actually wishes to see in the image in order to position the vehicle accordingly. It is also conceivable to take the route and time of travel into account in deciding whether the object determined by the underbody sensor system was already present before the drive-over and has therefore only migrated into the "ground plane" as a result of the drive-over.
That is, if such a situation is recognized, the buffered images can remain stored despite the object detection and can be deleted at a later time, for example after the vehicle has been parked and switched off.
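The keep-or-delete decision of the first alternative, including the drive-over exception just described, can be sketched as follows; the cache representation (a plain list) and the single boolean input are simplifications of this sketch.

```python
def handle_object_detection(cache, object_seen_before_driveover):
    """On an underbody object detection: if the object was already visible
    in earlier frames before the vehicle drove over it (e.g. a charging
    pad the driver wants to see in the image), keep the buffered frames;
    otherwise discard them, since they no longer show reality."""
    if object_seen_before_driveover:
        return cache  # keep: the car was likely positioned over it on purpose
    cache.clear()     # delete: the object appeared after the frames were taken
    return cache
```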
According to a second alternative of the invention, the control device can be designed to display the image display together with an optically marked partial image display upon detection of an object under the vehicle. In this case, although an object has been detected, a partial image display is still created from the "old", cached image data and integrated into the overall image display; however, it is optically marked in order to signal to the driver that it does not correspond to the actual situation underneath the vehicle.
The optical marking can be realized in different ways, for example by highlighting the partial image display in color. The partial image display may, in particular, be tinted red as a signal color. It is also conceivable to display the partial image display in a blinking manner, for example by fading it in and out, or with gradually increasing and decreasing intensity ("halo" display). These optical marking and display possibilities are likewise listed by way of example and not by way of limitation.
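A red tint with pulsing intensity, combining the color-highlight and "halo" ideas above, can be sketched for a grayscale partial image as follows. The pulse rate, red boost, and channel attenuation are illustrative assumptions.

```python
import numpy as np

def mark_partial_image(gray, t_s, pulse_hz=1.0):
    """Tint a grayscale partial image red and pulse the tint over time.

    gray: (H, W) uint8 array; t_s: current time in seconds.
    Returns an (H, W, 3) uint8 RGB image. alpha oscillates 0..1, so the
    marking periodically swells ("halo") and fades back to the plain image.
    """
    alpha = 0.5 + 0.5 * np.sin(2 * np.pi * pulse_hz * t_s)  # 0..1 pulse
    rgb = np.stack([gray, gray, gray], axis=-1).astype(np.float64)
    rgb[..., 0] = np.minimum(255, rgb[..., 0] + alpha * 120)  # boost red
    rgb[..., 1:] *= (1 - 0.5 * alpha)  # suppress green/blue toward red tint
    return rgb.astype(np.uint8)
```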
In a further development of the invention, the control device can be designed to acquire the sensor information automatically when a speed threshold is undershot, in particular at standstill. Since the "ground plane" display is not required during normal driving, the control device does not need to acquire sensor information then. Rather, such a "ground plane" display occurs primarily during slow driving or at standstill, so that the control device requires the corresponding sensor information only in these cases. According to the invention, the control device therefore automatically acquires the sensor information when the respective speed threshold, for example 10 km/h, is undershot, and in particular when the vehicle is stationary, since a driving situation is then given in which image data buffering is appropriate because a "ground plane" display may be desired at a later time. If the vehicle is parked in a garage or on a parking lot, for example, the sensor information acquisition can begin immediately when the ignition is switched on or the vehicle is started. In the driving situations described, it is also conceivable that the control device automatically switches the sensor device on for detection if it is not currently active. This ensures that the sensor information is always available when it might be required.
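The activation rule above reduces to a simple predicate. The 10 km/h threshold is the example value mentioned in the text; the `ignition_just_on` flag is a naming choice of this sketch.

```python
def sensor_acquisition_active(speed_kmh, ignition_just_on, threshold_kmh=10.0):
    """Underbody sensing is acquired below the speed threshold (including
    standstill) and immediately after ignition-on, so that sensor
    information exists whenever a ground-plane display might be requested."""
    return ignition_just_on or speed_kmh <= threshold_kmh
```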
In addition to a motor vehicle, the invention also relates to a method for operating a motor vehicle comprising: a plurality of image capturing devices arranged in a distributed manner on the vehicle for capturing images of the surroundings of the vehicle; a control device for generating, from these images, an image display showing the 360° surroundings around the motor vehicle; and a display device for displaying the image display, wherein the control device generates an additional partial image display, showing the area under the motor vehicle, from images captured temporally before the images underlying the displayed image display, and merges this partial image display into the displayed image display to output an overall image display. The method is distinguished in that the area underneath the motor vehicle is detected by means of at least one sensor device communicating with the control device, wherein the control device changes the overall image display, composed of the image display and the partial image display, at least in the region of the partial image display as a function of the sensor information.
As already mentioned at the outset, different possibilities exist for this change. On the one hand, when an object is detected under the vehicle, the control device can delete the images for the partial image display and output the image display without the partial image display. Alternatively, it is also conceivable that, when an object is detected below the vehicle, the control device outputs the image display together with an optically marked partial image display, or together with a warning message in the image region provided for the partial image display. The partial image display can be highlighted in color or displayed in a blinking manner.
Finally, provision may be made for the control device to acquire the sensor information automatically when a speed threshold is undershot, in particular at standstill or when the vehicle is started, with the sensor device optionally also being switched on automatically for detection if it is not active.
Drawings
Further advantages and details of the invention emerge from the exemplary embodiments described below and from the figures. In the figures:
FIG. 1 shows a schematic diagram of a motor vehicle according to the invention, and
fig. 2 to 4 show different overall image displays.
Detailed Description
Fig. 1 shows a motor vehicle 1 according to the invention, comprising four image recording devices 2: one each at the front and rear of the vehicle and one on each side, for example on the exterior rear-view mirrors, only three of which are shown in fig. 1. These image recording devices capture image data of the vehicle's surroundings and transmit them to the control device 3, which, when the driver selects the corresponding 360° surroundings image mode, for example on the display device 4 (e.g. a touch-screen display), determines an image display of the 360° surroundings around the motor vehicle and outputs it on the display device 4.
In order to be able to display the region underneath the motor vehicle in the image representation, the control device 3 can calculate this region within the scope of the partial image representation determined from the images or image data recorded earlier in time and integrate this region into the image representation, so that a total image representation results which comprises on the one hand the 360 ° surroundings image representation outside the vehicle and on the other hand the partial image representation of the region underneath the vehicle integrated into the total image representation.
In order to ensure the availability of older image data for as long as possible, at least one sensor device 5 is provided, which communicates with the control device 3 and monitors the area below the motor vehicle 1 in order to detect possible obstacles, such as a ball that has rolled into this area, a toy, or the like. Since such objects were not yet located under the vehicle at the point in time at which the earlier images underlying the partial image display were captured, a partial image display determined from these older images would not show the real situation and would therefore be incorrect, which is problematic for driving.
If the control device 3 is now in the corresponding mode for generating an image display together with a partial image display, the area under the vehicle is monitored by the sensor device 5, and the generation and output mode is configured depending on whether an object is detected.
Fig. 2 shows a schematic representation of an overall image display 6, which includes an image display 7, which shows the region of the exterior of the motor vehicle 1, which is shown here only in dashed lines, i.e., in a manner similar to a partially transparent representation, and which is acquired from the currently captured image.
Approximately in the region delimited by the dashed line showing the silhouette of the motor vehicle, a partial image display 8 is inserted. Since the sensor device does not detect any object below the motor vehicle in this case, this partial image display is determined from previously captured images that covered the region currently located below the vehicle at the time they were captured. On the basis of the sensor information, these image data are verified to the extent that, despite their age, they still show the actual situation under the motor vehicle, i.e. they can still be used even though they are buffered.
The situation is different in the illustration according to fig. 3, which again shows a schematic representation of an overall image display 6 comprising an image display 7 of the 360° surroundings and an inserted partial image display 8. It is assumed here that the sensor device 5 detects an object below the vehicle. The partial image display 8 is still determined from the temporally earlier image data; however, these image data no longer correspond exactly to the actual situation, and the partial image display is therefore shown with an optical marking, indicated in the example by a significantly larger line width. The partial image display can, for example, be output tinted red, blinking, or the like; any number of optical markings can be considered, as long as they are immediately noticeable to the driver. The optical highlighting signals to the driver that the partial image display does not show the current state under the vehicle and that an object has just been detected there, so he must not rely on the partial image display but must instead check the area under the vehicle and remove the object. Additionally, an acoustic warning can be output.
Fig. 4 shows a third variant of the overall image display 6. Here, too, the image display 7 of the 360° surroundings, determined from the current images, is shown. However, since the sensor device 5 has detected an object below the vehicle, the region within the silhouette of the motor vehicle 1 is not filled, i.e. no partial image display is determined. Instead, a warning symbol 9 is shown in this area, indicating a potentially dangerous situation to the driver.
The sensor device 5 can be a separate sensor device associated only with this assistance system. Alternatively, it can also be part of an underbody sensor system, as used for example in partially and fully automated motor vehicles for monitoring the underbody space and for clearance checks before automatically pulling away.
Finally, the control device 3 may be designed to acquire the sensor information automatically below a threshold speed of, for example, 10 km/h or 5 km/h, or at standstill. This ensures that the corresponding sensor information is also captured whenever the option of a partial image display might be offered at a later time. In addition, the control device 3 can of course also be designed to acquire the sensor information automatically as soon as the vehicle is restarted; in this case too, the sensor information acquisition takes place immediately, so that a plausibility check of the older image data for the partial image display is ensured. If the sensor device is not active at all, its detection operation can also, if necessary, be switched on by the control device.

Claims (10)

1. A motor vehicle, comprising: a plurality of image recording devices (2) distributed on the vehicle for recording images of the vehicle surroundings; a control device (3) for generating, from the images, an image display (7) showing the 360° surroundings of the motor vehicle; and a display device (4) for displaying the image display (7), wherein the control device (3) is designed to generate, from images captured earlier in time than those on which the displayed image display (7) is based, an additional partial image display (8) showing the region below the motor vehicle (1), and to merge this partial image display into the displayed image display (7) in order to output an overall image display,
characterized in that
at least one sensor device (5) is provided which detects a region below the motor vehicle (1) and which communicates with the control device (3), wherein the overall image display, which is composed of the image display (7) and the partial image display (8), can be changed at least in the region of the partial image display (8) as a function of the sensor information.
2. The motor vehicle according to claim 1,
characterized in that
the control device (3) is designed, when an object is detected below the vehicle (1), to delete the images for the partial image display (8) and to display the image display (7) without the partial image display (8).
3. The motor vehicle according to claim 1,
characterized in that
the control device (3) is designed, when an object is detected below the vehicle (1), to display the image display (7) together with the optically marked partial image display (8), or to display a warning sign (9) in the image region provided for the partial image display (8).
4. The motor vehicle according to claim 3,
characterized in that
the partial image display (8) can be displayed highlighted in color or in a blinking manner.
5. The motor vehicle according to one of the preceding claims,
characterized in that
the control device (3) is designed to acquire the sensor information automatically when a speed threshold is undershot, in particular at standstill or when the vehicle is started.
6. A method for operating a motor vehicle (1) which comprises: a plurality of image recording devices (2) distributed on the vehicle for recording images of the vehicle surroundings; a control device (3) for generating, from the images, an image display (7) showing the 360° surroundings of the motor vehicle (1); and a display device (4) for displaying the image display (7), wherein the control device (3) generates, from images captured earlier in time than those on which the displayed image display (7) is based, an additional partial image display (8) showing the region below the motor vehicle (1), and merges said partial image display into the displayed image display (7) in order to output an overall image display,
characterized in that
the area below the motor vehicle (1) is detected by means of at least one sensor device (5) which is in communication with the control device (3), wherein the control device (3) changes the overall image display, which is composed of the image display (7) and the partial image display (8), at least in the area of the partial image display (8) as a function of the sensor information.
7. The method according to claim 6,
characterized in that
when an object is detected below the vehicle (1), the control device (3) deletes the images for the partial image display (8) and outputs the image display (7) without the partial image display (8).
8. The method according to claim 6,
characterized in that
when an object is detected below the vehicle (1), the control device (3) outputs the image display (7) together with the optically marked partial image display (8), or together with a warning sign (9) in the image region provided for the partial image display (8).
9. The method according to claim 8,
characterized in that
the partial image display (8) is displayed highlighted in color or in a blinking manner.
10. The method according to any one of claims 6 to 9,
characterized in that
the control device (3) automatically acquires the sensor information when a speed threshold is undershot, in particular at standstill or when the vehicle is started.
CN202010799492.0A 2019-08-13 2020-08-11 motor vehicle Active CN112389325B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019212124.1 2019-08-13
DE102019212124.1A DE102019212124B4 (en) 2019-08-13 2019-08-13 Motor vehicle and method for operating a motor vehicle

Publications (2)

Publication Number Publication Date
CN112389325A true CN112389325A (en) 2021-02-23
CN112389325B CN112389325B (en) 2023-11-07

Family

ID=74239523

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010799492.0A Active CN112389325B (en) 2019-08-13 2020-08-11 motor vehicle

Country Status (3)

Country Link
US (1) US11161454B2 (en)
CN (1) CN112389325B (en)
DE (1) DE102019212124B4 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102021104290B3 (en) * 2021-02-23 2022-06-02 Audi Ag Method for at least partially automated parking of a motor vehicle, driver assistance system and motor vehicle

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102427982A (en) * 2009-05-19 2012-04-25 株式会社伊美吉内柯斯特 Lane departure sensing method and apparatus using images that surround a vehicle
CN104271406A (en) * 2012-05-08 2015-01-07 丰田自动车株式会社 Overhead view image display device
US20160297430A1 (en) * 2015-04-10 2016-10-13 Jaguar Land Rover Limited Collision Avoidance System
CN107027329A (en) * 2014-07-11 2017-08-08 宝马股份公司 The topography of the surrounding environment of traveling instrument is spliced into an image
US20180139384A1 (en) * 2016-11-17 2018-05-17 Bendix Commercial Vehicle Systems Llc Vehicle Display
US20190031101A1 (en) * 2017-07-28 2019-01-31 AISIN Technical Center of America, Inc. Vehicle surroundings monitoring apparatus
US20190100106A1 (en) * 2017-10-02 2019-04-04 Hua-Chuang Automobile Information Technical Center Co., Ltd. Driving around-view auxiliary device
US20190135216A1 (en) * 2017-11-06 2019-05-09 Magna Electronics Inc. Vehicle vision system with undercarriage cameras

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102013207907A1 (en) * 2013-04-30 2014-10-30 Bayerische Motoren Werke Aktiengesellschaft Vehicle positioning for inductive charging with the help of a vehicle camera
US9580012B2 (en) 2015-03-02 2017-02-28 Tk Holdings Inc. Vehicle object detection and notification system
DE102016107421A1 (en) * 2016-04-21 2017-10-26 Valeo Schalter Und Sensoren Gmbh Method for displaying surrounding areas, which can not be detected by a detection device, in the surroundings of a motor vehicle, driver assistance system and motor vehicle
DE102016208369A1 (en) * 2016-05-17 2017-12-07 Bayerische Motoren Werke Aktiengesellschaft Method for determining data representing part of the environment below the vehicle
GB2559760B (en) 2017-02-16 2019-08-28 Jaguar Land Rover Ltd Apparatus and method for displaying information
DE102017216791A1 (en) * 2017-09-22 2019-05-02 Zf Friedrichshafen Ag Sensory detection of open spaces under land vehicles
US20190279512A1 (en) * 2018-03-12 2019-09-12 Ford Global Technologies, Llc. Vehicle cameras for monitoring off-road terrain
US20200298758A1 (en) * 2019-03-18 2020-09-24 GM Global Technology Operations LLC System and method of animal detection and warning during vehicle start up


Also Published As

Publication number Publication date
DE102019212124B4 (en) 2023-09-14
CN112389325B (en) 2023-11-07
DE102019212124A1 (en) 2021-02-18
US11161454B2 (en) 2021-11-02
US20210046871A1 (en) 2021-02-18

Similar Documents

Publication Publication Date Title
KR102042371B1 (en) Parking space detection method and apparatus
US8717196B2 (en) Display apparatus for vehicle
US20150293534A1 (en) Vehicle control system and method
JP4992764B2 (en) Safety confirmation judgment device and driving teaching support system
KR20180069854A (en) Display method of parking support information and parking support device
JP6705368B2 (en) Automatic driving device
JP2019121307A (en) Traffic light recognition device and automatic driving system
CN104163133A (en) Rear view camera system using rear view mirror location
JP6170416B2 (en) Road sign judgment device
KR20090101492A (en) Parking assistance device and parking assistance method
US20170297491A1 (en) Image generation device and image generation method
KR101486670B1 (en) Side-view mirror of digital cameras
CN104798368A (en) Onboard image processing system
US11260794B2 (en) Periphery monitoring device
KR101752675B1 (en) Car navigation equipped with camera
CN112389325B (en) motor vehicle
JP4986070B2 (en) Ambient monitoring device for vehicles
WO2017022262A1 (en) Surrounding monitoring device for operating machines
CN114585540A (en) Display of a vehicle environment for moving a vehicle to a target position
WO2017157865A1 (en) Navigation aid
GB2528098A (en) Vehicle camera system
JP2014016962A (en) Vehicle perimeter monitoring system
KR101896778B1 (en) Apparatus for displaying lane using outside mirror and method thereof
EP2246762A1 (en) System and method for driving assistance at road intersections
CN113291229B (en) Output control system, control method thereof, moving object, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant