CN109314770B - Peripheral monitoring device

Publication number: CN109314770B
Application number: CN201780038757.1A
Authority: CN (China)
Prior art keywords: water level, vehicle, image, vehicle body, display
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN109314770A
Inventor: 渡边一矢 (Kazuya Watanabe)
Current assignee: Aisin Corp
Original assignee: Aisin Seiki Co Ltd
Application filed by Aisin Seiki Co Ltd

Classifications

    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources
    • H04N7/183 Closed-circuit television [CCTV] systems for receiving images from a single remote source
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R2300/30 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the type of image processing
    • B60R2300/304 Image processing using merged images, e.g. merging camera image with stored images
    • B60R2300/305 Image processing using merged images, merging camera image with lines or icons
    • B60R2300/607 Monitoring and displaying vehicle exterior scenes from a transformed perspective, from a bird's eye viewpoint

Abstract

The present invention relates to a periphery monitoring device comprising: a control unit that causes a display device to display an image based on captured image data output from an imaging unit whose imaging range covers a vehicle body surface of a vehicle and the periphery of that surface; and an image processing unit that superimposes on the image a water level limit line, based on water level limit information for wading, at the corresponding position on the vehicle body surface.

Description

Peripheral monitoring device
Technical Field
Embodiments of the present invention relate to a periphery monitoring device.
Background
Conventionally, there are vehicles having a wading (river-crossing) capability. For example, some vehicles known as off-road vehicles are waterproofed to enable wading (underwater travel) through a marsh, a river, or a flooded road surface. Some such vehicles include a system in which, for example, the distance to the water surface is measured using a sensor such as ultrasonic sonar, a graphical image of the water surface and the vehicle is displayed on a display device in the vehicle interior, and the wading state is presented to the driver in simulated form.
Patent document 1: Japanese Laid-Open Patent Publication No. 2015-512825
Disclosure of Invention
However, when the vehicle wades, the water surface often fluctuates. With a technique such as the one above, in which the distance to the water surface is detected with a sensor such as ultrasonic sonar and a graphical image of the water surface is displayed, fluctuation of the water can make it difficult to measure that distance accurately. In that case, the graphical image may fail to represent the fluctuation of the water or the height of the water surface accurately, making it difficult to appropriately provide the user with information on the state of the water surface or on whether wading is advisable.
Accordingly, one of the problems to be solved by the present invention is to provide a periphery monitoring device capable of appropriately presenting to a user the state of the water surface and whether wading (underwater travel) is advisable.
The periphery monitoring device according to the embodiment of the present invention includes, for example: a control unit that causes a display device to display an image based on captured image data output from an imaging unit whose imaging range covers a vehicle body surface of the vehicle and the periphery of that surface; and an image processing unit that superimposes on the image a water level limit line, based on water level limit information for wading, at the corresponding position on the vehicle body surface. With this configuration, an actual image based on the captured image data is displayed on the display device, showing the actual body surface of the host vehicle and the surrounding conditions during wading (underwater travel), that is, the state of the water surface (such as the water level and the fluctuation of the water). Further, the water level limit line is superimposed at the corresponding position on the vehicle body surface in the image. As a result, the actual situation, that is, which part of the vehicle body the water level has reached, can be conveyed to the user more accurately, and the user can easily and intuitively grasp the situation during wading.
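As a rough illustration of this superimposition step (not the patent's implementation), the wading limit height on the body surface can be mapped to a pixel row of a side-view image by a calibration and drawn into the frame buffer. All names, values, and the linear calibration model below are assumptions for illustration:

```python
# Hypothetical sketch: map a wading water level limit (a height on the
# vehicle body, in mm) to a pixel row of a side-view image via a simple
# linear calibration, then draw the limit line into the frame buffer.

def height_to_row(height_mm, ground_row, pixels_per_mm):
    """Convert a height above ground on the body surface to an image row.

    ground_row: pixel row where the ground contact line appears.
    pixels_per_mm: vertical image scale at the body surface.
    """
    return int(round(ground_row - height_mm * pixels_per_mm))

def draw_limit_line(frame, row, color):
    """Overwrite one row of an RGB frame (list of rows of pixels) in place."""
    if 0 <= row < len(frame):
        frame[row] = [color] * len(frame[row])
    return frame

# Example: a 120-row frame whose ground line sits at row 100, with 1 mm of
# body height spanning 0.1 pixel; a 500 mm limit then maps to row 50.
frame = [[(0, 0, 0)] * 160 for _ in range(120)]
row = height_to_row(500, ground_row=100, pixels_per_mm=0.1)
draw_limit_line(frame, row, color=(255, 0, 0))
```

In a real system the mapping from body height to image row would come from the camera calibration and any viewpoint conversion applied to the frame, not from a single linear scale.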
In the periphery monitoring device, for example, the control unit may display an image obtained by correcting captured image data of a side surface of the vehicle. With this configuration, the visibility of the screen content (the side surface of the vehicle and the water level limit line superimposed on it) can be further improved, making it easier for the user to grasp the actual situation, that is, which part of the vehicle body the water level has reached.
In the periphery monitoring apparatus, for example, the control unit may perform viewpoint conversion processing on the captured image data as the correction. With this configuration, an image of the side surface of the vehicle as viewed from the side can be provided, giving an image in which the relationship between the water level limit line and the water surface, and between the water surface and the vehicle body, is easier to understand.
In the periphery monitoring apparatus, for example, the control unit may perform distortion correction processing on the captured image data as the correction. With this configuration, the shape of the vehicle is easier to recognize, giving an image in which the relationship between the water level limit line and the water surface, and between the water surface and the vehicle body, is easier to understand.
In the periphery monitoring apparatus, for example, the control unit may perform, as the correction, a clipping process that cuts out a part of the captured image data. With this configuration, a specific portion of the vehicle whose relationship to the water level deserves attention can be displayed enlarged, providing an image in which the relationship between the vehicle body and the water level limit line, and between the water surface and the vehicle body, is easier to understand.
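Two of the corrections above can be illustrated with a toy sketch. The equidistant fisheye model, the focal length, and the crop window below are assumptions for illustration; the patent does not specify a distortion model:

```python
import math

# Illustrative sketch of two correction steps applied to captured image
# data: undistorting a fisheye radial coordinate, and clipping a region.

def undistort_radius(r_fish, f):
    """Map a fisheye radial distance (equidistant model, r = f * theta)
    to the radius of an ideal pinhole projection, r' = f * tan(theta)."""
    theta = r_fish / f
    return f * math.tan(theta)

def crop(frame, top, left, height, width):
    """Clipping step: cut out a region of interest, e.g. the body side,
    from a frame stored as a list of pixel rows."""
    return [row[left:left + width] for row in frame[top:top + height]]
```

Viewpoint conversion, the third correction, warps the undistorted image through a perspective transform; only the radial undistortion and the crop are shown here to keep the sketch short.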
In the periphery monitoring device, for example, the image processing unit may superimpose, at the corresponding position, a water level limit line that follows the surface shape of the vehicle. With this configuration, the deviation between the vehicle surface and the water level limit line can be reduced, providing an image in which the relationship between the water level limit line and the water surface, and between the water surface and the vehicle body, is easier to understand.
In the periphery monitoring device, for example, the image processing unit may superimpose a straight water level limit line at the corresponding position. With this configuration, the superimposition processing of the water level limit line is simplified and the processing load can be reduced. Moreover, the water level limit line is easy to read against the vehicle, enabling a display that facilitates intuitive understanding of the relationship among the vehicle, the water level limit line, and the water surface.
In the periphery monitoring device, for example, when the water level limit line is displayed, the control unit may make the display area of the vehicle body surface in the image larger than when the water level limit line is not displayed. With this configuration, the water level limit line superimposed on the vehicle body is easier to recognize, and the user can readily see that the screen has switched to the water-level-limit-line display mode.
In the periphery monitoring device, for example, the image processing unit may superimpose a water level reference line at a position lower, in the vehicle height direction, than the water level limit line superimposed on the image. With this configuration, the water level reference line indicates in stages how far the water level has risen relative to the vehicle body, so the user's attention can be drawn to the rise of the water level in stages.
In the periphery monitoring device, for example, the image processing unit may superimpose the water level reference line in a display mode different from that of the water level limit line. With this configuration, attention can be drawn more clearly to a rise of the water level relative to the water level reference line.
Drawings
Fig. 1 is a perspective view showing an example of a state in which a part of a vehicle interior of a vehicle in which a periphery monitoring device according to an embodiment is mounted is seen through.
Fig. 2 is a plan view showing an example of a vehicle in which the periphery monitoring device according to the embodiment is mounted.
Fig. 3 shows an example of an instrument panel of a vehicle on which the periphery monitoring device according to the embodiment is mounted, as viewed from the rear of the vehicle.
Fig. 4 is a block diagram showing an example of an image control system including the periphery monitoring apparatus according to the embodiment.
Fig. 5 is an explanatory diagram showing an example of display of a water level limit line by the periphery monitoring device according to the embodiment.
Fig. 6 is an explanatory diagram showing a display example of the periphery monitoring device according to the embodiment: an image after distortion correction and viewpoint conversion, with a water level limit line and a water level reference line.
Fig. 7 is a diagram showing another display example of the periphery monitoring apparatus according to the embodiment: the state of another image after distortion correction and viewpoint conversion.
Fig. 8 is a block diagram showing an example of a CPU configuration for displaying a water level limit line implemented in an ECU of the periphery monitoring device according to the embodiment.
Fig. 9 shows a screen layout and display example of the display device of the periphery monitoring device according to the embodiment before the water level limit line is displayed, that is, in the standard display mode.
Fig. 10 shows a screen layout and display example of the display device of the periphery monitoring device according to the embodiment after the water level limit line is displayed, that is, in the special display mode.
Fig. 11 is a flowchart illustrating an example of the display processing of the periphery monitoring image, including the display of the water level limit line, by the periphery monitoring apparatus according to the embodiment.
Detailed Description
Exemplary embodiments of the present invention are disclosed below. The structure of the embodiments shown below, and the actions, results, and effects brought about by the structure are merely one example. The present invention can be realized by a configuration other than the configurations disclosed in the following embodiments, and can obtain at least one of various effects and derived effects based on the basic configuration.
In the present embodiment, the vehicle 1 on which the periphery monitoring device (periphery monitoring system) is mounted may be, for example, an internal-combustion-engine vehicle using an internal combustion engine (not shown) as its drive source, an electric vehicle or fuel-cell vehicle using an electric motor (not shown) as its drive source, a hybrid vehicle using both of these drive sources, or a vehicle having another drive source. The vehicle 1 may be equipped with various transmissions, and with the various systems and components required to drive the internal combustion engine or the electric motor. The vehicle 1 is, for example, a vehicle that can travel comfortably both on a so-called "road" (mainly paved roads and their equivalents) and on a "wild road" (mainly unpaved rough roads and the like). It may be a four-wheel-drive vehicle in which driving force is transmitted to all four wheels 3 so that all four wheels serve as drive wheels. The form, number, layout, and the like of the devices related to driving the wheels 3 can be set in various ways. For example, the vehicle may be one intended mainly for travel on a "road", and the drive system is not limited to four-wheel drive; it may be, for example, front-wheel drive or rear-wheel drive.
As illustrated in fig. 1, the vehicle body 2 forms a vehicle interior 2a in which a passenger (not shown) sits. In the vehicle interior 2a, a steering unit 4, an accelerator operation unit 5, a brake operation unit 6, a shift operation unit 7, and the like are provided so as to face the seat 2b of the driver. The steering unit 4 is, for example, a steering wheel protruding from the dashboard 24; the accelerator operation unit 5 is, for example, an accelerator pedal located at the driver's feet; the brake operation unit 6 is, for example, a brake pedal located at the driver's feet; and the shift operation unit 7 is, for example, a shift lever protruding from a center console. The steering unit 4, the accelerator operation unit 5, the brake operation unit 6, the shift operation unit 7, and the like are not limited to these examples.
Further, a display device 8 as a display output unit and/or an audio output device 9 as an audio output unit are provided in the vehicle interior 2a. The display device 8 is, for example, an LCD (liquid crystal display) or an OELD (organic electroluminescent display). The audio output device 9 is, for example, a speaker. The display device 8 is covered with a transparent operation input unit 10 such as a touch panel. The occupant can visually confirm an image displayed on the display screen of the display device 8 through the operation input unit 10, and can perform operation input by touching, pressing, or stroking the operation input unit 10 with a finger or the like at a position corresponding to the displayed image. The display device 8, the audio output device 9, the operation input unit 10, and the like are provided on a monitor device 11 located, for example, at the center of the dashboard 24 in the lateral direction, that is, the vehicle width direction. The monitor device 11 may include an operation input unit (not shown) such as a switch, a knob, a lever, or a button. An audio output device (not shown) may also be provided at a position in the vehicle interior 2a different from the monitor device 11, and sound may be output from both the audio output device 9 of the monitor device 11 and the other audio output device. The monitor device 11 can also serve, for example, a navigation system or an audio system.
Further, a display device 12 different from the display device 8 is provided in the vehicle interior 2a. As illustrated in fig. 3, the display device 12 is provided, for example, on the instrument panel portion 25 of the dashboard 24, located at substantially the center of the instrument panel portion 25 between the speed display portion 25a and the engine speed display portion 25b. The screen 12a of the display device 12 is smaller than the screen 8a (fig. 3) of the display device 8. The display device 12 may display, for example, an image presenting an indicator, a mark, or character information as auxiliary information when the periphery monitoring or another function of the vehicle 1 is operating. The amount of information displayed by the display device 12 may be smaller than that displayed by the display device 8. The display device 12 is, for example, an LCD or an OELD. The information displayed on the display device 12 may also be displayed on the display device 8.
As illustrated in fig. 1 and 2, the vehicle 1 is, for example, a four-wheeled automobile and includes two left and right front wheels 3F and two left and right rear wheels 3R. Each of these four wheels 3 may be configured to be steerable. As illustrated in fig. 4, the vehicle 1 has a steering system 13 that steers at least two of the wheels 3. The steering system 13 has an actuator 13a and a torque sensor 13b. The steering system 13 is electrically controlled by an ECU14 (electronic control unit) or the like to operate the actuator 13a. The steering system 13 is, for example, an electric power steering system or an SBW (steer-by-wire) system. The torque sensor 13b detects, for example, the torque applied to the steering unit 4 by the driver.
Further, as illustrated in fig. 2, four imaging units 15a to 15d, for example, are provided on the vehicle body 2 as the plurality of imaging units 15. Each imaging unit 15 is a digital camera incorporating an imaging element such as a CCD (charge-coupled device) or a CIS (CMOS image sensor), and can output moving image data (captured image data) at a predetermined frame rate. Each imaging unit 15 has a wide-angle or fisheye lens and can image a horizontal range of 140° to 220°. The optical axis of each imaging unit 15 may be oriented obliquely downward. The imaging units 15 therefore sequentially capture the external environment around the vehicle 1, including: the road surface on which the vehicle 1 can move; the water surface and surrounding conditions during wading (the presence or absence of water, the state of the water surface, the height to the water surface, and so on); and objects (obstacles such as rocks, trees, people, bicycles, and other vehicles).
The imaging unit 15a is provided at the end 2e on the rear side of the vehicle body 2, for example on a wall portion below the rear window of the trunk door 2h. The imaging unit 15b is located at, for example, the right end 2f of the vehicle body 2 and is provided in the right door mirror 2g. The imaging unit 15c is provided at the end 2c on the front side of the vehicle body 2, that is, the front side in the vehicle longitudinal direction, for example on the front bumper or the front grille. The imaging unit 15d is located at, for example, the left end 2d of the vehicle body 2, that is, the left end in the vehicle width direction, and is provided in the left door mirror 2g. The ECU14 can perform arithmetic processing and image processing based on the captured image data obtained by the plurality of imaging units 15 to generate an image with a wider viewing angle, or a virtual overhead image of the vehicle 1 viewed from above. The ECU14 can also perform distortion correction processing on the wide-angle (curved) image data obtained by each imaging unit 15, and clipping processing that generates image data representing only a cut-out specific region. Further, the ECU14 can perform viewpoint conversion processing that converts captured image data into virtual image data as if captured from a virtual viewpoint different from that of the imaging unit 15: for example, virtual image data representing a top view of the vehicle 1 from above, or a side view of the vehicle 1 from a position away from the vehicle.
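For a planar region such as the road or water surface, viewpoint conversion of this kind is commonly implemented as a planar homography: a 3x3 matrix, obtained from camera calibration, maps pixel coordinates in the captured image to coordinates in the virtual view. The sketch below shows only the projective arithmetic; the matrix H here is a placeholder (a pure translation), not a calibrated transform:

```python
# Minimal sketch of viewpoint conversion as a planar homography.

def apply_homography(H, x, y):
    """Project pixel (x, y) through H in homogeneous coordinates."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w

# Placeholder H: a pure translation shifting every pixel by (10, 5).
# A real top-view or side-view conversion would use a calibrated matrix
# with a non-trivial bottom row, which introduces perspective division.
H = [[1, 0, 10],
     [0, 1, 5],
     [0, 0, 1]]
```

In practice the warp is applied in the inverse direction (for each output pixel, look up the source pixel) with interpolation, but the coordinate mapping is the arithmetic above.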
The ECU14 provides the periphery monitoring information by displaying the acquired image data on the display device 8, for example, so as to enable safety confirmation of the right side or left side of the vehicle 1, confirmation of the water level during wading travel, which will be described later, and safety confirmation of the periphery of the vehicle 1 when viewed in plan.
The ECU14 may display a part of the vehicle body 2 together with the state of the water surface during wading travel, display the relationship between the vehicle 1 and the water surface during wading travel, and perform wading travel assistance (underwater travel assistance). In this case, as will be described later, the ECU14 can display an image in which a water level limit line, a water level reference line, or the like is superimposed on a part of the vehicle 1, for example, a side surface of the vehicle body 2. The ECU14 may recognize a lane line or the like indicated on the road surface around the vehicle 1 from the captured image data supplied from the imaging unit 15 to perform driving assistance, or may detect (extract) a parking space to perform parking assistance.
As illustrated in fig. 1 and 2, four distance measuring units 16a to 16d and eight distance measuring units 17a to 17h, for example, are provided on the vehicle body 2 as the plurality of distance measuring units 16 and 17. The distance measuring units 16 and 17 are, for example, sonar devices that emit ultrasonic waves and capture the reflected waves; such a device may also be called a sonar sensor, an ultrasonic detector, or ultrasonic sonar. In the present embodiment, the distance measuring units 16 and 17 are provided at positions low in the vehicle height direction, for example on the front and rear bumpers, and can detect objects (e.g., obstacles) around the vehicle 1 and measure the distance to them. The distance measuring units 16 and 17 can also serve as sensors for determining whether the vehicle 1 is wading (traveling in water). Because they are mounted low on the vehicle, they become submerged at an early stage, well before the water rises to the height at which wading becomes impossible (the height at which the water level limit line is superimposed). When a distance measuring unit 16 or 17 is submerged, its reception of reflected waves becomes unstable and it reports an error. Therefore, when the vehicle 1 enters a river or a marsh, the plurality of distance measuring units 16 and 17 output error signals almost simultaneously: for example, when the vehicle 1 enters while traveling forward, the distance measuring units 17e, 17f, 17g, and 17h are submerged almost simultaneously and output error signals, followed by the distance measuring units 16c and 16d.
Similarly, when the vehicle 1 enters a river or a marsh while traveling backward, the distance measuring units 17a, 17b, 17c, and 17d are submerged almost simultaneously and output error signals, followed by the distance measuring units 16a and 16b. That is, from the pattern of error signals output by the distance measuring units 16 and 17, information for determining whether the vehicle 1 is in the wading state can be obtained. Conversely, when the vehicle 1 finishes wading and comes ashore while traveling forward, the distance measuring units 17e, 17f, 17g, and 17h recover almost simultaneously, followed by the distance measuring units 16c and 16d; as the vehicle climbs further, the distance measuring units 16a and 16b recover, and finally the distance measuring units 17a, 17b, 17c, and 17d recover almost simultaneously. When all the distance measuring units 16 and 17 have recovered, information for determining that the vehicle 1 has completely come ashore can be obtained. The distance measuring units 16 and 17 have a waterproof structure so as not to be damaged by submersion.
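The submersion inference described above can be sketched as a small decision function over the sonar error flags. The grouping of units, the state names, and the all-or-any thresholds are illustrative assumptions, not the patent's implementation:

```python
# Hedged sketch: infer a coarse wading state from sonar error flags.
# True means the unit's reflected-wave reception is erroring (likely
# submerged); False means it is operating normally (recovered).

def infer_state(bumper_errors, side_errors):
    """bumper_errors: flags for the low bumper-mounted units (e.g. 17e-17h).
    side_errors: flags for the remaining units (e.g. 16c, 16d)."""
    if all(bumper_errors) and all(side_errors):
        return "wading"              # every low-mounted sonar submerged
    if any(bumper_errors):
        return "entering_or_exiting" # partial submersion front or rear
    return "on_land"                 # all units recovered
```

A production system would also use the order in which units error and recover (front group before rear group, or vice versa) to distinguish forward entry from backward entry and entry from exit.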
Further, as illustrated in fig. 4, in the periphery monitoring system 100 (periphery monitoring device), the brake system 18, the steering angle sensor 19, the accelerator sensor 20, the shift position sensor 21, the wheel speed sensor 22, the acceleration sensors 26, and the like are electrically connected, in addition to the ECU14, the monitor device 11, the steering system 13, and the distance measuring units 16 and 17, through an in-vehicle network 23 serving as an electrical communication line. The in-vehicle network 23 is configured as a CAN (controller area network), for example. The ECU14 can send control signals via the in-vehicle network 23 to control the steering system 13, the brake system 18, and the like. The ECU14 can also receive, via the in-vehicle network 23, the detection results of the torque sensor 13b, the brake sensor 18b, the steering angle sensor 19, the distance measuring units 16 and 17, the accelerator sensor 20, the shift position sensor 21, the wheel speed sensor 22, and the acceleration sensors 26, as well as operation signals of the operation input unit 10 and the like.
The ECU14 includes, for example, a CPU14a (central processing unit), a ROM14b (read-only memory), a RAM14c (random access memory), a display control unit 14d, an audio control unit 14e, and an SSD14f (solid state drive; flash memory). The CPU14a can perform the arithmetic processing and control of the image processing related to the images displayed on the display devices 8 and 12: for example, the processing and calculation needed to display an image in the standard display mode used during travel on land and an image in the special display mode used during wading. In addition, the CPU14a can perform various kinds of arithmetic processing and control such as: determination of a movement target position (parking target position) of the vehicle 1, calculation of a guidance route (parking route, guidance parking route) for the vehicle 1, determination of interference with objects, automatic control of the vehicle 1, and cancellation of the automatic control.
The CPU14a can read a program installed and stored in a nonvolatile storage device such as the ROM14b and perform arithmetic processing based on that program. The RAM14c temporarily stores various data used in the computation of the CPU14a. The display control unit 14d mainly performs, within the arithmetic processing of the ECU14, synthesis of the image data displayed on the display device 8. The audio control unit 14e mainly performs, within the arithmetic processing of the ECU14, processing of the audio data output from the audio output device 9. The SSD14f is a rewritable nonvolatile storage unit and retains data even when the power supply of the ECU14 is turned off. The CPU14a, the ROM14b, the RAM14c, and the like may be integrated in the same package. The ECU14 may also use another logical-operation processor, such as a DSP (digital signal processor), or a logic circuit in place of the CPU14a. A hard disk drive (HDD) may be provided instead of the SSD14f, and the SSD14f or the HDD may be provided separately from the ECU14.
The brake system 18 is, for example, an anti-lock brake system (ABS) that suppresses brake lock-up, an electronic stability control (ESC) system that suppresses sideslip of the vehicle 1 during turning, an electric brake system that enhances braking force (performs brake assist), a brake-by-wire (BBW) system, or the like. The brake system 18 applies braking force to the wheels 3, and thus to the vehicle 1, through the actuator 18a. The brake system 18 can detect signs of brake lock-up, free spin of the wheels 3, sideslip, and the like from, for example, the rotation difference between the left and right wheels 3, and perform various controls accordingly. The brake sensor 18b is, for example, a sensor that detects the position of the movable portion of the brake operation unit 6.
The steering angle sensor 19 is a sensor that detects the steering amount of the steering unit 4, such as a steering wheel. The ECU14 acquires from the steering angle sensor 19 the steering amount applied by the driver, the steering amount of each wheel 3 during automatic steering, and the like, and performs various controls. The accelerator sensor 20 is, for example, a sensor that detects the position of the movable portion of the accelerator operation unit 5. The shift position sensor 21 is, for example, a sensor that detects the position of the movable portion of the shift operation portion 7. The wheel speed sensor 22 is a sensor that detects the rotation amount of the wheel 3 or its number of revolutions per unit time, and outputs a wheel speed pulse count indicating the detected number of revolutions as its sensor value. The ECU14 calculates the movement amount of the vehicle 1 and the like based on the sensor values obtained from the wheel speed sensor 22, and performs various controls. In the present embodiment, the vehicle 1 is assumed to be provided with two acceleration sensors 26 (26a, 26b). When the vehicle 1 is equipped with the ESC, the acceleration sensors 26 (26a, 26b) already mounted for the ESC are used. The present embodiment is not limited to a particular acceleration sensor; any sensor that can detect the acceleration of the vehicle 1 in the left-right direction may be used. In the present embodiment, the acceleration in the front-rear direction and the acceleration in the left-right direction are derived.
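The movement-amount calculation from wheel speed pulses described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the pulses-per-revolution count and the tire rolling circumference are assumed values introduced here for the example.

```python
# Hypothetical sketch of how an ECU could turn wheel speed pulse counts
# (as output by the wheel speed sensor 22) into a movement amount.
# PULSES_PER_REV and TIRE_CIRCUMFERENCE_M are illustrative assumptions.

PULSES_PER_REV = 48          # assumed encoder pulses per wheel revolution
TIRE_CIRCUMFERENCE_M = 2.0   # assumed rolling circumference [m]

def movement_from_pulses(pulse_count: int) -> float:
    """Distance travelled [m] implied by a wheel speed pulse count."""
    revolutions = pulse_count / PULSES_PER_REV
    return revolutions * TIRE_CIRCUMFERENCE_M

def wheel_speed_mps(pulses_in_window: int, window_s: float) -> float:
    """Average wheel speed [m/s] over a sampling window."""
    return movement_from_pulses(pulses_in_window) / window_s
```

In a real system the per-wheel distances would additionally feed odometry (position and heading estimation), which is what the patent's "various controls" relies on.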
The structure, arrangement, electrical connection, and the like of the various sensors and actuators described above are merely examples, and various settings (changes) can be made.
As an example, during wading travel of the vehicle 1, the ECU14 that realizes the surroundings monitoring system 100 displays on the display device 8 the relationship between the water surface around the vehicle 1 and the vehicle body 2, using an actual image based on captured image data from the imaging unit 15 with a "water level limit line" superimposed on it. The captured image data in this case covers, as its imaging range, the surface of the vehicle body 2 of the vehicle 1 and the periphery of that surface. Here, the "water level limit line" is a mark indicating two limits: the water level up to which functions of the vehicle 1, such as the running function, can still operate normally even when the vehicle 1 is partially submerged, and the water level up to which water can be prevented from entering the vehicle interior 2a.
Fig. 5 to 7 are explanatory views showing a state in which the "water level limit line L" is superimposed on an image based on captured image data. The "water level limit line L" is a line drawn at a position, for example, 600 mm in the vehicle height direction from the ground contact position of the wheel 3. Its height is set based on, for example, the height up to which waterproofing or water-stopping treatment was applied at the design stage of the vehicle 1, or a height determined in advance from the driving performance of the vehicle 1 and the like. The "water level limit line L" may be superimposed along, for example, the vehicle body side surface 2m (following the curved surface of the vehicle body 2) on the vehicle body 2 included in the image based on the captured image data from the imaging unit 15. For example, when the vehicle body side surface 2m is a curved surface, the water level limit line L is preferably also a curve following the surface shape (horizontal cross-sectional shape) of the vehicle 1. The water level limit line L is superimposed only within the outline of the vehicle body 2; where a wheel 3 partially falls within that outline, the line is superimposed on the wheel 3 as well. Because the water level limit line L is superimposed on the vehicle body 2 along the vehicle body side surface 2m in this way, the user can easily recognize, without a sense of discomfort, how far the water level has risen on the vehicle body 2.
As described above, the imaging units 15b and 15d, each having a wide-angle lens or a fisheye lens, are fixed to the door mirrors 2g. The imaging units 15b and 15d therefore capture, as their imaging range, a part of the vehicle body side surface 2m of the vehicle 1 (including the front wheel 3F and the rear wheel 3R) together with the scenery beside it. That is, when the vehicle 1 enters water such as a river, a marsh, or a flooded road surface, an image can be obtained in which the vehicle body side surface 2m of the vehicle 1 and the water surface rising against it appear on the same screen. By displaying on the display device 8 an image in which the "water level limit line L" is superimposed on the vehicle body side surface 2m shown together with the water surface, the ECU14 allows a user (such as the driver) to visually confirm the submerged state and its changes during wading travel of the vehicle 1. In addition, since the water level is displayed as an actual image on the screen, the user can easily and intuitively grasp changes in the water level. Because the displayed image is an actual image, even when the water surface fluctuates the user can recognize the change in real time and can more appropriately and easily judge the situation during wading.
As shown in fig. 6 and 7, the ECU14 can superimpose a "water level reference line K", a mark similar to the "water level limit line L", at a position lower than the "water level limit line L" in the vehicle height direction. The "water level reference line K" is a mark line superimposed substantially in parallel with the water level limit line L, which warns the user in advance that the water level is rising before it reaches the water level limit line L. When, for example, two "water level reference lines K" are superimposed, one is a "first water level reference line K1" superimposed, for example, 400 mm below the "water level limit line L", that is, 200 mm from the ground contact surface of the wheel 3. The other is a "second water level reference line K2" superimposed 200 mm below the "water level limit line L", that is, 400 mm from the ground contact surface of the wheel 3. By setting such "water level reference lines K", the degree of rise of the water level with respect to the vehicle body 2 can be indicated in a stepwise manner, so the user's attention can be drawn to the rising water level step by step. Whether the "water level reference lines K" are displayed may be selectable by the user, and their number and superimposed positions (heights from the ground contact position of the wheel 3) can be set as appropriate.
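The stepwise arrangement of lines described above (limit line L at 600 mm, reference lines K1 at 200 mm and K2 at 400 mm from the wheel ground-contact surface) can be sketched as a simple lookup. The function name and return strings are our own illustration, not part of the patent.

```python
# Illustrative sketch: the line heights from the description, with a helper
# that reports the highest line a measured water level has reached.

WATER_LEVEL_LINES_MM = {
    "K1": 200,  # first water level reference line
    "K2": 400,  # second water level reference line
    "L": 600,   # water level limit line
}

def water_level_status(water_level_mm: float) -> str:
    """Return the highest line the water level has reached, if any."""
    reached = [name for name, h in WATER_LEVEL_LINES_MM.items()
               if water_level_mm >= h]
    if not reached:
        return "below K1"
    # dict preserves insertion order, so the last reached entry is highest
    return f"reached {reached[-1]}"
```

A display controller could use such a status to escalate warnings step by step as the water rises toward the limit line.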
As shown in fig. 8, in order to provide an image including the "water level limit line L" during wading travel as described above, the CPU14a included in the ECU14 includes an image acquisition unit 30, a display processing unit 32 (control unit), an image processing unit 34, and an output unit 36. The display processing unit 32 further includes a distortion correcting unit 38, a viewpoint converting unit 40, and a layout adjusting unit 42, and the image processing unit 34 includes a water level limit line superimposing unit 44 and a water level reference line superimposing unit 46. These units can be realized by reading and executing a program installed and stored in a storage device such as the ROM14b.
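The functional units in fig. 8 form an ordered processing chain. The following is a minimal structural sketch of that chain; the class, stage names, and identity stage bodies are our own stand-ins, since the patent describes functional units rather than an API.

```python
# Structural sketch of the fig. 8 pipeline: acquisition -> display processing
# (distortion correction, viewpoint conversion, layout adjustment) ->
# image processing (line superposition) -> output.

class PeripheryMonitoringPipeline:
    """Runs ordered stages, each taking and returning a frame."""
    def __init__(self, steps):
        self.steps = steps  # list of (name, callable) stages

    def run(self, frame):
        trace = []
        for name, step in self.steps:
            frame = step(frame)
            trace.append(name)
        return frame, trace

# Stage bodies are placeholders (identity functions); only the ordering
# reflects the description in the text.
pipeline = PeripheryMonitoringPipeline([
    ("acquire", lambda f: f),
    ("correct_distortion", lambda f: f),
    ("convert_viewpoint", lambda f: f),
    ("adjust_layout", lambda f: f),
    ("superimpose_lines", lambda f: f),
    ("output", lambda f: f),
])
```

The ordering matters: lines are superimposed after distortion correction and viewpoint conversion, so they land on the converted image rather than the raw fisheye frame.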
The image acquisition unit 30 acquires, via the display control unit 14d, captured image data output from the imaging units 15 that are provided on the vehicle 1 and image the periphery of the vehicle 1. The display control unit 14d may output the captured image data to the display device 8 or the display device 12 as-is, without processing by the CPU14a. The CPU14a may also allow the user to select the desired display contents with an input device such as the operation input unit 10 or the operation unit 14g. That is, the display control unit 14d can selectively display the image chosen by operation of the operation input unit 10 or the operation unit 14g on the display device 8: for example, the image behind the vehicle 1 captured by the imaging unit 15a, or the image of the left side captured by the imaging unit 15d.
The display processing unit 32 performs various kinds of image processing on the captured image data output from the imaging unit 15, which images the surface of the vehicle body 2 of the vehicle 1 and the periphery of that surface, converts the image into one in which the "water level limit line L" or the "water level reference line K" is easy to recognize, and displays it on the display device 8.
For example, the distortion correcting unit 38 performs known distortion correction processing on the captured image data or an image based on it. As described above, when the imaging units 15b and 15d include a wide-angle lens or a fisheye lens, the original image based on the captured image data is a curved (distorted) image, as shown in fig. 5. Such a curved image contains the vehicle 1 and a wide range of the situation in front of, behind, and beside it within the imaging range, so it lets the user easily grasp the vehicle 1 and its entire periphery at a glance. On the other hand, a user unaccustomed to the curved image may take a long time to understand the displayed content, and the "water level limit line L", which should be close to a straight line, appears curved and may cause a sense of discomfort. Therefore, the distortion correcting unit 38 (CPU14a) reduces or eliminates the distortion by applying to the captured image data of the side of the vehicle 1 a displacement corresponding to each pixel position, based on correction information stored, for example, in the ROM14b. As a result, the shape of the vehicle 1 displayed on the display device 8 becomes closer to the actual shape, and the user can understand the displayed content more easily.
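The per-pixel displacement correction described above amounts to a remap: each output pixel is filled from an input position given by a precomputed map (the "correction information" stored in the ROM14b). A toy nearest-neighbor version on a tiny image, with an illustrative fallback for out-of-range sources, might look like this; real implementations would interpolate and use calibrated maps.

```python
# Toy sketch of map-based distortion correction. disp_map[y][x] = (dy, dx)
# is the displacement added to (y, x) to find the source pixel; sources
# outside the image fall back to the original pixel (our assumption).

def remap(image, disp_map):
    """Nearest-neighbor remap of a 2D image (nested lists)."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dy, dx = disp_map[y][x]
            sy, sx = y + dy, x + dx
            if 0 <= sy < h and 0 <= sx < w:
                out[y][x] = image[sy][sx]
            else:
                out[y][x] = image[y][x]  # fallback at the border
    return out
```

With an all-zero map the image is unchanged; a calibrated map would pull pixels inward or outward to straighten the fisheye curvature.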
The viewpoint converting unit 40 performs known viewpoint conversion processing on the captured image data or an image based on it. For example, to make the relationship between the "water level limit line L" or "water level reference line K" superimposed on the vehicle body side surface 2m and the water surface W (waterline) easier to understand, it is preferable to provide an image that looks as if the surface of the vehicle body 2 on which the line is superimposed (for example, the vehicle body side surface 2m) were viewed from a convenient line of sight. For example, as shown in fig. 6 and 7, the converted image corresponds to a line of sight from a position away from the vehicle body side surface 2m, looking down toward the vicinity of the waterline where the vehicle body side surface 2m meets the water surface W. From such a converted image, the relationship between the vehicle body side surface 2m and the water surface W is easier to understand. The viewpoint converting unit 40 (CPU14a) applies conversion information, such as a map stored in the ROM14b, to the captured image data of the side surface of the vehicle 1 to generate virtual image data (a line-of-sight converted image) as seen from a viewpoint position away from the vehicle 1 and directed toward it. Fig. 6 and 7 show an example in which both distortion correction and viewpoint conversion are applied, but the distortion correction may be omitted and only the viewpoint conversion performed. In that case, the image including the vehicle body side surface 2m is displayed as shown in fig. 5.
Further, the viewpoint converting unit 40 may perform distortion correction as part of the viewpoint conversion processing. The distortion correcting unit 38 and the viewpoint converting unit 40 may also be configured collectively as a single image converting unit.
The viewpoint converting unit 40 can set the viewpoint position as appropriate. For example, as shown in fig. 6, the image may be converted into a line-of-sight converted image whose viewpoint is located away from the substantially central portion of the vehicle body side surface 2m and directed toward the front of the vehicle 1. As shown in fig. 7, the viewpoint may instead be located away from the substantially central portion of the vehicle body side surface 2m and directed straight at the vehicle body side surface 2m. Similarly, the image may be converted as if looking toward the rear of the vehicle 1 from the substantially central portion of the vehicle body side surface 2m. In these cases, clipping processing that cuts out an appropriate part of the image based on the captured image data can also be performed. By changing the viewpoint position as appropriate, or enlarging the display through clipping, an image showing the relationship between the water level limit line L (water level reference line K) and the water surface W at the portion the user should pay attention to can be provided, making the wading state easier to grasp and recognize. For example, when displaying an image directed at the vehicle body side surface 2m as in fig. 7, the image of the side surface opposite the driver's seat may be shown on the display device 8: the driver can directly see the side surface 2m on the driver's seat side while confirming the opposite side through the display device 8. These viewpoint switches can be selected, for example, by input from the operation input unit 10 or the operation unit 14g.
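A common way to express such a planar view change is a 3x3 homography applied to pixel coordinates, with the matrix playing the role of the "conversion information of a map stored in the ROM14b". The following is a hedged sketch under that assumption; the matrices shown are illustrative, not calibrated ones.

```python
# Sketch of viewpoint conversion as a homography. A real system would
# precompute one calibrated matrix H per selectable viewpoint (toward the
# front, straight at the side surface 2m, toward the rear, etc.).

def apply_homography(H, x, y):
    """Map pixel (x, y) through homography H (3x3 nested list)."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    wh = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / wh, yh / wh

IDENTITY = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]   # leaves points unchanged
SCALE_2X = [[2, 0, 0], [0, 2, 0], [0, 0, 1]]   # toy "enlarge" viewpoint
```

Warping a whole image is then a remap driven by the inverse homography, which is why distortion correction and viewpoint conversion can be folded into one image converting unit as the text notes.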
The layout adjustment unit 42 adjusts (changes) the layout of the screen displayed on the display device 8. As shown in fig. 9, the ECU14 divides the display area of the display device 8 into a plurality of regions and displays, as the driving assistance screen normally shown on the display device 8, images in various directions, the inclinometer 50 indicating the posture of the vehicle 1, and the like. Fig. 9 is an example of the image in the standard display mode displayed when the vehicle 1 travels on land. In this standard display mode, the layout adjustment unit 42 arranges the front display region FV at the upper center of the display area of the display device 8, with the left display region SVL and the right display region SVR on its left and right. Below the front display region FV, a posture display region PV showing the inclinometer 50 is arranged. During travel on land, travel assistance may also be provided by displaying in the front display region FV, as necessary, a route sign R indicating the estimated traveling direction of the vehicle 1, a front reference line Qa giving a rough indication of the distance to the front end 2c of the vehicle body 2, and side reference lines Pa giving a rough indication of the distance to the side ends 2d and 2f of the vehicle body 2. The vehicle body 2 and the road surface are displayed in the left display region SVL and the right display region SVR as distortion-corrected images. In the standard display mode, which is mainly used during travel on land, the layout adjustment unit 42 displays the vehicle body 2 and the road surface at a high display ratio in the left display region SVL and the right display region SVR.
That is, in the standard display mode, images that make it easy to grasp the road surface condition around the front wheels 3F are displayed in the left display region SVL and the right display region SVR. The inclinometer 50 indicates the inclination of the vehicle 1 in the left-right direction (roll angle) and in the front-rear direction (pitch angle) by the posture of a marker 52, based on signals from the acceleration sensors 26 (26a, 26b).
The layout adjustment unit 42 can change the display contents (change the layout) during wading so that the relationship between the water level limit line L or the water level reference line K and the water surface W is easier to understand. Fig. 10 is an example of the image in the special display mode displayed during wading. In fig. 10, the arrangement of the front display region FV, the left display region SVL, the right display region SVR, and the posture display region PV is the same as for land travel in fig. 9, but the content inside the left display region SVL and the right display region SVR differs. As shown in fig. 10, in the special display mode in which the water level limit line L (water level reference line K) is displayed, the display area occupied by the surface of the vehicle body 2 in the images shown in the left display region SVL and the right display region SVR is made larger than when the water level limit line L (water level reference line K) is not displayed. The special display mode mainly shows how far the water surface W has risen with respect to the water level limit line L (water level reference line K), so the display area of the vehicle body side surface 2m in the left display region SVL and the right display region SVR is increased. Changing the in-image layout in this way improves the visibility of the relationship between the water level limit line L (water level reference line K) and the water surface W, and also makes it easy to recognize that the screen of the display device 8 has switched to the display mode with the water level limit line L.
The captured image data from the imaging unit 15 is displayed in the left display region SVL or the right display region SVR after, for example, clipping processing that cuts out a region from an imaging range larger than the one actually displayed. Therefore, to increase the display area of the surface of the vehicle body 2, the clipping range taken from the original image can be adjusted. In another embodiment, the water level limit line L (water level reference line K) may be displayed while keeping the display area of the surface of the vehicle body 2 as in fig. 9.
The layout adjustment unit 42 may also change the layout so that an image showing the relationship between the water level limit line L (water level reference line K) and the water surface W, as in fig. 6 or 7, occupies the entire display area of the display device 8. Such a display further improves the visibility of that relationship and emphasizes it so that the user is sure to notice it.
The image processing unit 34 superimposes the water level limit line L or the water level reference line K, based on water level limit information for wading travel of the vehicle 1, at the corresponding position on the surface of the vehicle body 2 in the image displayed on the display device 8. The water level limit information is information for superimposing the water level limit line L or the water level reference line K on the image, and may be, for example, information specifying the superimposing position on the image, or height information measured from a reference position such as the ground contact position of the wheel 3. When the conditions for displaying the water level limit line L are satisfied, the water level limit line superimposing unit 44 (CPU14a) superimposes the water level limit line L at the height position in the vehicle height direction specified at the design stage of the vehicle 1. As described above, each imaging unit 15 is fixed to the vehicle body 2 and its imaging range is therefore fixed, so where the vehicle body 2 appears within the captured image data, or within an image based on it, can be calculated, and the water level limit line L is superimposed at that position. Similarly, the water level reference line superimposing unit 46 superimposes the water level reference line K at the height position in the vehicle height direction specified at the design stage of the vehicle 1. The water level limit line superimposing unit 44 and the water level reference line superimposing unit 46 may use different display modes, for example different line types or line colors, so that the water level limit line L and the water level reference lines K (the first water level reference line K1 and the second water level reference line K2) are easy to tell apart.
When a line color is selected, it is preferably one that is easily distinguishable from the color of the vehicle body 2. For example, the water level limit line superimposing unit 44 and the water level reference line superimposing unit 46 may acquire color information of the vehicle body 2 and automatically select a distinguishable line color, or the color may be selected by the user. When the water level limit line L is superimposed, the water level limit line superimposing unit 44 may also add information indicating what the line is, for example a "mark" such as an arrow, or "character information" such as "Limit", as shown in fig. 6 and 7. Such a "mark" or "character information" is preferably superimposed above the water level limit line L so that it remains easy to see even when the water surface W rises. A "mark" or "character information" may similarly be added when the water level reference line superimposing unit 46 superimposes the water level reference line K. In fig. 6 and 7 the water level reference lines K are shown superimposed in parallel with the water level limit line L, but a water level reference line K may instead be superimposed in the direction orthogonal to the water level limit line L and displayed together with a scale indicating the water depth.
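Because each imaging unit 15 is fixed to the vehicle body 2, the image row at which a given body height appears can be precomputed, which is what makes the fixed-position superposition above possible. A minimal sketch, assuming a calibration of two known rows (the ground-contact line and the 600 mm limit height) and linear interpolation between them; a real system would derive the mapping from the camera model, and all pixel values here are illustrative.

```python
# Sketch: converting a body height [mm] to an image row, given two
# calibrated rows. GROUND_ROW and ROW_AT_600MM are assumed values.

GROUND_ROW = 400      # image row of the wheel ground-contact line
ROW_AT_600MM = 100    # image row of a point 600 mm up the body side

def row_for_height(height_mm: float) -> int:
    """Image row at which a line height_mm above the ground is drawn."""
    rows_per_mm = (GROUND_ROW - ROW_AT_600MM) / 600.0
    return round(GROUND_ROW - height_mm * rows_per_mm)
```

The limit line L (600 mm) and reference lines K1/K2 (200 mm / 400 mm) would be drawn at the rows this returns, then overlaid on the side-surface image in their chosen colors and line types.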
The CPU14a can switch between the display state of fig. 9 and that of fig. 10, that is, between not displaying and displaying the water level limit line L (water level reference line K), by a manual operation of the user, for example an input operation on the operation input unit 10 or the operation unit 14g. As described above, the switching may also be performed according to the operating states (error signal output states) of the distance measuring units 16 and 17. The switching operation may further be performed by an input method based on voice recognition, gesture recognition, or the like.
A control example of the periphery monitoring system 100 configured as described above will be described with reference to a flowchart of fig. 11. The flow shown in fig. 11 is repeated at a predetermined processing cycle.
First, the CPU14a determines whether periphery monitoring is currently being performed (S100). If it is not (no in S100), for example when the power of the vehicle 1 is turned OFF or when the display device 8 is being used for another purpose (a navigation system, an audio system, or the like), the flow is temporarily terminated. If the CPU14a determines that periphery monitoring is being performed (yes in S100), it determines whether the wading travel mode switching condition is satisfied (S102). As described above, the CPU14a judges that the condition is satisfied when an input operation is performed on the operation input unit 10 or the operation unit 14g. The CPU14a also judges that the condition is satisfied when the distance measuring units 16 and 17 output error signals in a predetermined pattern, for example when the distance measuring units 17e to 17h output error signals at substantially the same time (for example, within 2 seconds of one another).
When the wading travel mode switching condition is satisfied in S102 (yes in S102), the CPU14a causes the image acquisition unit 30 to acquire the captured image data currently being captured by the imaging unit 15, for example a side image of the vehicle 1 (S104). Next, the CPU14a performs image conversion processing on the acquired side image (S106). In this processing, at least one of the distortion correction by the distortion correcting unit 38 and the viewpoint conversion by the viewpoint converting unit 40 is performed according to the user's display settings. When the user has chosen to superimpose the water level limit line L and the like on the original image shown in fig. 5, the processing of S106 may be skipped.
Next, the CPU14a causes the water level limit line superimposing unit 44 to superimpose the "water level limit line L" on the vehicle body side surface 2m of the vehicle body 2, and the water level reference line superimposing unit 46 to superimpose the "water level reference lines K" there as well (S108). Depending on the user's selection or settings, the "water level reference lines K" may be omitted; omitting them simplifies the image on which the water level limit line L is superimposed, which suits a user who prefers a simple display.
Next, when switching from the standard display mode screen of fig. 9, displayed as the periphery monitoring screen, to the wading travel mode screen, the layout adjustment unit 42 checks whether the wading travel mode screen is designated as the main screen (S110). For example, when "main display" has been selected by an input operation on the operation input unit 10 or the operation unit 14g (yes in S110), the layout adjustment unit 42 switches from the standard display mode screen of fig. 9 to a screen emphasizing the relationship between the water level limit line L (water level reference line K) and the water surface W, as shown in fig. 5 to 7, displayed across the entire display area of the display device 8 (main display) (S112).
On the other hand, when main display is not selected (no in S110), the layout adjustment unit 42 switches the display device 8 to the special display mode shown in fig. 10, in which the display area occupied by the surface of the vehicle body 2 in the images of the left display region SVL and the right display region SVR is larger than when the water level limit line L (water level reference line K) is not displayed (S114).
While displaying the wading travel mode screen based on the relationship between the water level limit line L (water level reference line K) and the water surface W, the CPU14a checks whether the mode return condition for returning from the wading travel mode to the land travel mode is satisfied (S116). For manual return, the CPU14a judges the condition satisfied when the user requests the return (requests the land travel mode screen) by an input operation on the operation input unit 10, the operation unit 14g, or the like. For automatic return, the CPU14a judges the condition satisfied when all of the distance measuring units 16 and 17 have recovered (when all error signals have been cleared) and a predetermined period, for example 5 seconds, has then elapsed. When the mode return condition is satisfied (yes in S116), the layout adjustment unit 42 returns the image displayed on the display device 8 to the standard display mode image (the screen of fig. 9) (S118) and ends the series of processing for displaying the water level limit line L (water level reference line K) in the wading travel mode.
If the mode return condition is not satisfied in S116 (no in S116), the CPU14a returns to S104 and repeats the processing from there, sequentially updating the display image during wading travel so that the display device 8 shows the changing relationship between the water surface W (the water level) and the water level limit line L (water level reference line K) in real time. If the wading travel mode switching condition is not satisfied in S102 (no in S102), that is, if display of the water level limit line L (water level reference line K) is unnecessary, the flow proceeds to S118 and continues displaying the standard display mode screen on the display device 8.
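The branching of the fig. 11 flowchart (S100 to S118) can be condensed into a single mode-selection function evaluated each processing cycle. The boolean inputs and mode names below are our stand-ins for the checks and screens the patent describes.

```python
# State-machine sketch of the fig. 11 flow. Returns which screen the
# display device 8 should show on this processing cycle.

def select_display_mode(periphery_monitoring: bool,
                        wading_condition: bool,
                        main_display_selected: bool,
                        mode_return_condition: bool) -> str:
    if not periphery_monitoring:       # S100: no -> terminate this cycle
        return "none"
    if not wading_condition:           # S102: no -> standard screen (S118)
        return "standard"
    if mode_return_condition:          # S116: yes -> back to standard (S118)
        return "standard"
    # S104-S108 (acquire, convert, superimpose lines) happen here; then S110:
    if main_display_selected:
        return "wading_main"           # S112: full-screen emphasized view
    return "wading_special"            # S114: enlarged body-side regions
```

Calling this once per cycle reproduces the repeat behavior the text describes: the wading screens keep updating until the return condition is met, after which the standard screen resumes.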
As described above, according to the periphery monitoring system 100 of the present embodiment, the actual image based on the captured image data from the imaging unit 15 is displayed on the display device 8, showing the actual surface of the vehicle body 2 of the vehicle 1 and its surroundings during wading, that is, the water surface W, with the water level limit line L superimposed at the corresponding position on the surface of the vehicle body 2 in the image. As a result, the actual state, that is, how far the water surface W has reached up the vehicle body 2 and the condition of the water surface, is conveyed more accurately, and the user can grasp it easily and intuitively.
In the above embodiment, an example was shown in which the water level limit line L or the water level reference line K is superimposed on the vehicle body side surface 2m based on captured image data from the imaging unit 15b or the imaging unit 15d provided in the door mirror 2g. In another embodiment, captured image data from another imaging unit 15 may be used. For example, when the front portion of the vehicle body 2 (the vehicle body front surface) is included in the imaging range of the imaging unit 15c provided at the front of the vehicle body 2, the water level limit line L or the water level reference line K may be superimposed on the image of the vehicle body front surface and displayed in the front display area FV of the display device 8. In this case, it becomes easy to grasp the change in the water level relative to the vehicle body 2 when the vehicle 1 moves forward. Similarly, when the rear portion of the vehicle body 2 (the vehicle body rear surface) is included in the imaging range of the imaging unit 15a provided at the rear of the vehicle body 2, the water level limit line L or the water level reference line K may be superimposed on the image of the vehicle body rear surface and displayed on the display device 8. In this case, the front and rear images may be switched based on the shift position of the shift operation unit 7 (shift lever). The display area of the display device 8 may also be divided into four areas, and the water level limit line L (water level reference line K) may be superimposed on each of the front, rear, left, and right images.
In this case, since the relationship between the water surface W and the water level limit line L (water level reference line K) can be presented for the entire periphery of the vehicle 1, the user can easily grasp that relationship even when the vehicle 1 is tilted significantly, for example by large unevenness of the water bottom.
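The view selection described above, forward gears showing the front camera image, reverse showing the rear camera image, and an optional four-way split showing all sides, can be sketched as follows. The view labels and gear labels are assumptions; the actual layout logic of the layout adjustment unit 42 is not given in this detail.

```python
# Sketch of choosing which camera image(s) to display, per the description:
# the front image (imaging unit 15c) for forward travel, the rear image
# (imaging unit 15a) when the shift operation unit 7 is in reverse, and a
# four-way split (front/rear/left/right) when all overlays are shown at
# once.  View and gear labels are illustrative assumptions.

def select_views(shift_position, four_way_split=False):
    """Return the list of views on which to superimpose line L (line K)."""
    if four_way_split:
        return ["front", "rear", "left", "right"]
    if shift_position == "R":
        return ["rear"]
    return ["front"]  # D, L, etc.: forward travel
```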
In the present embodiment, an example was shown in which the water level limit line L or the water level reference line K is superimposed along the vehicle body side surface 2m. That is, when the vehicle body side surface 2m is a curved surface, the water level limit line L or the water level reference line K is superimposed as a curve following that surface, but the present invention is not limited to this. For example, when distortion correction is performed on the captured image data, or on an image based on the captured image data, so as to approximate the actual shape of the vehicle 1, the shape of the vehicle body 2 in the front-rear direction on which the line is superimposed becomes substantially linear, so the water level limit line L or the water level reference line K can simply be represented by a straight line. In this case, the image processing load on the ECU 14 can be reduced. Further, by representing the water level limit line L or the water level reference line K as a simple straight line, its relationship with the water surface can be understood intuitively.
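The load reduction can be made concrete by contrasting the two overlay representations: a curved overlay needs one row sample per image column, while after distortion correction a straight segment is fully described by its two endpoints. The quadratic body profile below is an illustrative assumption, not the actual shape of the vehicle body 2.

```python
# Sketch contrasting a curved overlay (per-column samples following a
# curved vehicle body side surface) with the straight line that suffices
# after distortion correction.  The quadratic profile is illustrative.

def curved_overlay(width, base_row, bulge):
    """One row sample per column: a polyline following a curved surface."""
    mid = (width - 1) / 2.0
    return [base_row + int(round(bulge * ((x - mid) / mid) ** 2))
            for x in range(width)]

def straight_overlay(width, base_row):
    """After distortion correction, two endpoints describe the whole line."""
    return [(0, base_row), (width - 1, base_row)]
```

The curved form stores `width` values and must be recomputed for each camera geometry; the straight form stores two points regardless of image width, which is the kind of simplification the text attributes to the reduced processing load on the ECU 14.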
In the above-described embodiment, a so-called off-road vehicle was described, but the periphery monitoring device of the present embodiment can also be applied to a so-called on-road vehicle (a passenger car or the like), and similar effects can be obtained. Further, the position of the water surface W relative to the water level reference line K or the water level limit line L may be detected by image processing of the image on which the water level reference line K or the water level limit line L is superimposed, and warning information by sound or the like may be output.
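The warning idea can be sketched as a simple comparison of detected row positions: since image rows grow downward, a smaller row number means a higher water level, and the reference line K sits below the limit line L in the vehicle height direction (so K has the larger row number). The two-stage thresholding and the messages are assumptions, not the patented implementation.

```python
# Sketch of the warning output: compare the row where image processing
# located the water surface W against the rows of reference line K and
# limit line L.  Rows grow downward; smaller row = higher water.
# Thresholds and messages are illustrative assumptions.

def water_warning(surface_row, reference_row, limit_row):
    """Return a warning string, or None if the water is below line K."""
    if surface_row <= limit_row:
        return "danger: water surface at or above limit line L"
    if surface_row <= reference_row:
        return "caution: water surface above reference line K"
    return None
```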
In the above-described embodiment, an example was shown in which signals from the distance measuring units 16 and 17 are used for automatic switching to the wading travel mode (underwater travel mode); however, the present invention is not limited to this, and the switching may be performed using signals from other sensors. The water surface W may also be detected from the image captured by the imaging unit 15 and used to trigger the switching. Further, the mode switching may be performed only manually.
The embodiments and modifications of the present invention have been described above; however, these embodiments and modifications are presented as examples and are not intended to limit the scope of the invention. These novel embodiments can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the gist of the invention. These embodiments and modifications are included in the scope and gist of the invention, and are included in the invention described in the claims and their equivalents.
Description of the reference numerals
1 vehicle
2 vehicle body
2m vehicle body side surface
8 display device
14 ECU
14a CPU
14b ROM
14d display control unit
14g operating part
15, 15a, 15b, 15c, 15d imaging unit
30 image acquisition unit
32 display processing unit
34 image processing unit
36 output unit
38 distortion correction unit
40 viewpoint conversion unit
42 layout adjustment unit
44 water level limit line superimposing unit
46 water level reference line superimposing unit
L water level limit line
K water level reference line
W water surface
100 peripheral monitoring system (peripheral monitoring device)

Claims (10)

1. A perimeter monitoring device, comprising:
an imaging unit that images a vehicle body surface of a vehicle and a periphery of the vehicle body surface as an imaging range and outputs an image based on captured image data;
a control unit that causes a display device to display the image including the vehicle body surface and the periphery of the vehicle body surface; and
an image processing unit that superimposes a water level limit line, based on water level limit information for when the vehicle is wading, on a corresponding position on the vehicle body surface in the image.
2. The perimeter monitoring device according to claim 1, characterized in that:
the control unit displays an image obtained by correcting the captured image data obtained by capturing an image of a side surface of the vehicle.
3. The perimeter monitoring device according to claim 2, characterized in that:
as the correction, the control section performs viewpoint conversion processing on the captured image data.
4. The perimeter monitoring device according to claim 2 or 3, characterized in that:
as the correction, the control section performs distortion correction processing on the captured image data.
5. The perimeter monitoring device according to claim 2, characterized in that:
as the correction, the control section performs a clipping process of clipping a part of the captured image data.
6. The perimeter monitoring device according to claim 1, characterized in that:
the image processing unit superimposes the water level limit line on the corresponding position along the shape of the surface of the vehicle.
7. The perimeter monitoring device according to claim 1, characterized in that:
the image processing unit superimposes the water level limit line as a straight line on the corresponding position.
8. The perimeter monitoring device according to claim 1, characterized in that:
when the water level limit line is displayed, the control unit makes a display area of the vehicle body surface in the image larger than when the water level limit line is not displayed.
9. The perimeter monitoring device according to claim 1, characterized in that:
the image processing unit superimposes a water level reference line at a position lower than the water level limit line superimposed on the image in a vehicle height direction.
10. The perimeter monitoring device according to claim 9, wherein:
the image processing unit superimposes the water level reference line in a display mode different from the water level limit line.
CN201780038757.1A 2016-06-30 2017-03-10 Peripheral monitoring device Active CN109314770B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2016-129660 2016-06-30
JP2016129660A JP6642307B2 (en) 2016-06-30 2016-06-30 Perimeter monitoring device
PCT/JP2017/009743 WO2018003188A1 (en) 2016-06-30 2017-03-10 Periphery monitoring device

Publications (2)

Publication Number Publication Date
CN109314770A CN109314770A (en) 2019-02-05
CN109314770B true CN109314770B (en) 2021-03-19

Family

ID=60786804

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780038757.1A Active CN109314770B (en) 2016-06-30 2017-03-10 Peripheral monitoring device

Country Status (4)

Country Link
JP (1) JP6642307B2 (en)
CN (1) CN109314770B (en)
DE (1) DE112017003278T5 (en)
WO (1) WO2018003188A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108284925A (en) * 2018-01-09 2018-07-17 燕山大学 The recognition detection method of waterline
JP7358811B2 (en) * 2019-07-16 2023-10-11 株式会社アイシン Vehicle peripheral display device
JP7209652B2 (en) * 2020-02-05 2023-01-20 三菱電機株式会社 River management device, river management support system, river management support method, and river management support program
CN115171031B (en) * 2022-07-19 2023-01-31 杭州开闳流体科技有限公司 Vehicle reference object-based road surface water detection method and device and application

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006182108A (en) * 2004-12-27 2006-07-13 Nissan Motor Co Ltd Vehicle surroundings monitoring apparatus
JP2009032513A (en) * 2007-07-26 2009-02-12 Toyota Motor Corp Fuel cell automobile
JP5444139B2 (en) * 2010-06-29 2014-03-19 クラリオン株式会社 Image calibration method and apparatus
JP2012188058A (en) * 2011-03-14 2012-10-04 Panasonic Corp Display device
GB201118623D0 (en) * 2011-10-27 2011-12-07 Land Rover Uk Ltd Wading apparatus and method
GB201205653D0 (en) * 2012-03-30 2012-05-16 Jaguar Cars Wade sensing display control system
GB2520250B (en) * 2013-11-12 2016-05-25 Jaguar Land Rover Ltd Vehicle having wade sensing display and system therefor

Also Published As

Publication number Publication date
WO2018003188A1 (en) 2018-01-04
CN109314770A (en) 2019-02-05
JP6642307B2 (en) 2020-02-05
DE112017003278T5 (en) 2019-03-14
JP2018001899A (en) 2018-01-11

Similar Documents

Publication Publication Date Title
CN105539287B (en) Periphery monitoring device
US9481368B2 (en) Park exit assist system and park exit assist method
JP6507626B2 (en) Vehicle perimeter monitoring device
US10464551B2 (en) Traveling support device
EP2902271B1 (en) Parking assistance device, and parking assistance method and program
EP2990265B1 (en) Vehicle control apparatus
US20160075377A1 (en) Parking assist system, parking assist method and parking assist control program
US10150486B2 (en) Driving assistance device and driving assistance system
US10315569B2 (en) Surroundings monitoring apparatus and program thereof
US9902427B2 (en) Parking assistance device, parking assistance method, and non-transitory computer readable medium storing program
US20160075326A1 (en) Parking assist system
CN109314770B (en) Peripheral monitoring device
CN107925746B (en) Peripheral monitoring device
WO2018070298A1 (en) Display control apparatus
WO2018150642A1 (en) Surroundings monitoring device
EP3792868A1 (en) Image processing device
JP2017094922A (en) Periphery monitoring device
JP7283514B2 (en) display controller
CN110959289B (en) Peripheral monitoring device
CN110546047A (en) Parking assist apparatus
US10922977B2 (en) Display control device
JP2019138655A (en) Traveling support device
JP7383973B2 (en) parking assist device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant