WO2018025441A1 - Periphery monitoring device - Google Patents

Periphery monitoring device

Info

Publication number
WO2018025441A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
image
unit
trajectory
display
Prior art date
Application number
PCT/JP2017/010537
Other languages
French (fr)
Japanese (ja)
Inventor
Takuya Hashikawa (橋川 拓也)
Original Assignee
Aisin Seiki Co., Ltd. (アイシン精機株式会社)
Priority date
Filing date
Publication date
Application filed by Aisin Seiki Co., Ltd.
Publication of WO2018025441A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 General purpose image data processing
    • G06T 1/0007 Image acquisition
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • Embodiments of the present invention relate to a periphery monitoring device mounted on a vehicle.
  • an imaging device installed in a vehicle images the surrounding environment of the vehicle and displays an image as an imaging result.
  • One of the problems addressed by the present invention is to provide a periphery monitoring device that is highly convenient.
  • the periphery monitoring device according to the embodiment includes, as an example: a storage unit that stores a travel history of the vehicle; an image acquisition unit that acquires a first image showing the road surface under the floor of the vehicle; a first calculation unit that calculates, based on the travel history, a first trajectory indicating where the front wheels of the vehicle have passed; and an output unit that superimposes the first image, the first trajectory, and a display object indicating the position of the rear wheels of the vehicle on a display screen provided in the vehicle interior. The driver can therefore perform a driving operation such that the rear wheels pass along the trajectory through which the front wheels have passed. Therefore, the periphery monitoring device according to the embodiment of the present invention is highly convenient.
  • the periphery monitoring device further includes, as an example, a second calculation unit that calculates, based on steering information, a second locus indicating the predicted locus of the rear wheels of the vehicle, and the output unit further superimposes the second locus on the display screen.
  • the second calculation unit further calculates, based on the steering information, a third locus indicating the predicted locus of the front wheels of the vehicle, and the output unit further superimposes the third locus on the display screen. Therefore, the driver can check from the display screen the position toward which the front wheels will advance, and can drive the vehicle more safely.
  • the storage unit stores, as an example, a second image captured at a first timing by an imaging device that is provided in the vehicle and images the road surface in the traveling direction of the vehicle. The image acquisition unit then acquires, as the first image, the second image stored in the storage unit at a second timing, after the first timing, at which the vehicle passes over the road surface shown in the second image. Therefore, it is not necessary to install an imaging device that directly images the road surface under the floor of the vehicle.
  • FIG. 1 is a perspective view illustrating an example of a state in which a part of a passenger compartment of a vehicle on which the periphery monitoring device according to the embodiment is mounted is seen through.
  • FIG. 2 is a plan view (bird's eye view) illustrating an example of a vehicle on which the periphery monitoring device according to the embodiment is mounted.
  • FIG. 3 is a block diagram illustrating an example of a configuration of a periphery monitoring system including the periphery monitoring device according to the embodiment.
  • FIG. 4 is a diagram illustrating a display example in the normal display mode output by the periphery monitoring device according to the embodiment.
  • FIG. 5 is a diagram illustrating a display example in the underfloor display mode output from the periphery monitoring device according to the embodiment.
  • FIG. 6 is a diagram for explaining a concept of a method by which the periphery monitoring device according to the embodiment acquires an underfloor image.
  • FIG. 7 is a block diagram illustrating a functional configuration of the ECU as the periphery monitoring device according to the embodiment.
  • FIG. 8 is a diagram illustrating a structure of a ring buffer included in the periphery monitoring device according to the embodiment.
  • FIG. 9 is a diagram for explaining an example of a method for calculating the predicted trajectory of the rear wheels by the periphery monitoring device according to the embodiment.
  • FIG. 10 is a flowchart illustrating a procedure of image storage processing in the periphery monitoring device according to the embodiment.
  • FIG. 11 is a flowchart illustrating a procedure of processing for controlling display in the periphery monitoring device according to the embodiment.
  • FIG. 1 is a perspective view showing an example of a state in which a part of a passenger compartment 2a of a vehicle 1 equipped with a periphery monitoring device according to an embodiment is seen through.
  • FIG. 2 is a plan view (bird's eye view) illustrating an example of a vehicle 1 on which the periphery monitoring device according to the embodiment is mounted.
  • FIG. 3 is a block diagram illustrating an example of a configuration of the periphery monitoring system 100 including the periphery monitoring device according to the embodiment.
  • the vehicle 1 may be, for example, an automobile having an internal combustion engine (not shown) as a drive source, that is, an internal combustion engine automobile, or an automobile having an electric motor (not shown) as a drive source, that is, an electric vehicle or a fuel cell vehicle. Alternatively, an automobile using both of them as a driving source may be used.
  • the vehicle 1 can be mounted with various transmissions, and can be mounted with various devices, systems, components, or the like necessary for driving the internal combustion engine or the electric motor.
  • the system, number, layout, and the like of devices related to driving of the wheels 3 in the vehicle 1 can be variously set.
  • the vehicle body 2 constitutes a passenger compartment 2a in which a passenger (not shown) gets.
  • a steering unit 4, an acceleration operation unit 5, a braking operation unit 6, and a shift operation unit 7 are provided in the passenger compartment 2a, facing the driver's seat 2b.
  • the steering unit 4 is, for example, a steering wheel protruding from the dashboard (instrument panel).
  • the acceleration operation unit 5 is, for example, an accelerator pedal positioned under the driver's feet.
  • the braking operation unit 6 is, for example, a brake pedal positioned under the driver's feet.
  • the shift operation unit 7 is, for example, a shift lever protruding from the center console.
  • the steering unit 4, the acceleration operation unit 5, the braking operation unit 6, and the shift operation unit 7 are not limited to these examples.
  • a monitor device 11 is provided in the passenger compartment 2a.
  • the monitor device 11 has a display screen 8 and an audio output device 9.
  • the display screen 8 is configured by a display device.
  • the display device is, for example, an LCD (liquid crystal display) or an OELD (organic electroluminescent display).
  • the audio output device 9 is a speaker as an example.
  • the display screen 8 is covered with a transparent operation input unit 10.
  • the operation input unit 10 is, for example, a touch panel. An occupant or the like can visually recognize the image displayed on the display screen 8 via the operation input unit 10.
  • the monitor device 11 is provided, as an example, at the center of the dashboard in the vehicle width direction (left-right direction).
  • the monitor device 11 may include an operation input unit other than the touch panel.
  • the monitor device 11 may be provided with a switch, a dial, a joystick, or a push button as another operation input unit.
  • the monitor device 11 can be used as a navigation system or an audio system, for example, or can be provided separately from these systems.
  • the vehicle 1 is a four-wheeled vehicle and includes two left and right front wheels 3F and two left and right rear wheels 3R.
  • the tire angle of the front wheels 3F changes in response to the operation of the steering unit 4.
  • the steering system 13 is, for example, an electric power steering system or an SBW (steer by wire) system.
  • the steering system 13 uses the actuator 13a to add assist torque to the steering unit 4 to supplement the steering force, or to steer the wheels 3.
  • the steering system 13 detects, with the torque sensor 13b, the torque that the driver applies to the steering unit 4.
  • the steering system 13 may be a rear wheel steering device (ARS: Active Rear Steering).
  • the rear wheel steering device steers the rear wheels 3R. Specifically, when the rear wheel steering device is employed, the rear wheels 3R are steered in the same phase as, or in the opposite phase to, the steering angle of the front wheels 3F, depending on the driving state of the vehicle 1 (for example, vehicle speed or turning state).
  • the steering system 13 may be configured to be able to steer the front wheel 3F and the rear wheel 3R independently or in association with each other.
  • the two front wheels 3F are steered substantially in parallel with each other in the same phase (the same phase and the same turning direction), and the two rear wheels 3R are steered in the same phase and substantially in parallel.
  • the drive wheels can be set in various ways.
  • the vehicle 1 is provided with a plurality of imaging units 16.
  • the imaging unit 16 is an imaging device including an imaging device such as a CCD (charge coupled device) or a CIS (CMOS image sensor).
  • the imaging unit 16 can output an image at a predetermined frame rate.
  • Each of the imaging units 16 includes a wide-angle lens or a fish-eye lens, and can capture a range of 140 ° to 190 °, for example, in the horizontal direction.
  • the optical axis of the imaging unit 16 is set downward (for example, vertically or obliquely downward).
  • the imaging unit 16a is provided on the front end 2c (for example, a front grill) of the vehicle body 2.
  • the imaging unit 16a can image the surrounding environment in the forward direction of the vehicle 1.
  • the direction in which the driver seated in the seat 2b faces, that is, the windshield side as viewed from the driver, is the front direction of the vehicle 1 and the front side of the vehicle body 2.
  • the imaging unit 16b is provided on the left end 2d of the vehicle body 2, more specifically, on the left door mirror 2g.
  • the imaging unit 16b can capture the surrounding environment in the left direction of the vehicle 1.
  • the imaging unit 16c is provided at a rear end 2e of the vehicle body 2 (for example, a wall portion below the rear trunk door 2h).
  • the imaging unit 16c can capture the surrounding environment in the rear direction of the vehicle 1.
  • the imaging unit 16d is provided on the right end 2f of the vehicle body 2, more specifically, on the right door mirror 2g.
  • the imaging unit 16d can capture the surrounding environment in the right direction of the vehicle 1.
  • the surrounding environment refers to the situation around the vehicle 1 including the road surface around the vehicle 1.
  • as long as any one of the imaging units 16 can output an image showing the road surface in the traveling direction of the vehicle 1, the configuration, number, location, and orientation of the imaging units 16 are not limited to the content described above.
  • the ECU (Electronic Control Unit) 24 is an example of the periphery monitoring device of the embodiment.
  • the ECU 24 can perform arithmetic processing and image processing based on the images obtained by the plurality of imaging units 16 and can display the image subjected to the image processing on the display screen 8.
  • the periphery monitoring device can control the display content of the display screen 8 at least in the underfloor display mode.
  • the underfloor display mode is a display mode in which an image showing at least the road surface under the current floor of the vehicle 1 is displayed.
  • the display mode of the periphery monitoring device includes an underfloor display mode and a normal display mode.
  • FIG. 4 is a diagram showing a display example in the normal display mode.
  • the display screen 8 includes a display area 81, a display area 82, a display area 83, and a display area 84.
  • in the normal display mode, an image of the current surrounding environment in the front direction of the vehicle 1, captured by the imaging unit 16a, is displayed in the display area 81.
  • in the display area 82, an image captured by the imaging unit 16b and showing the current surrounding environment in the left direction of the vehicle 1 is displayed.
  • in the display area 83, an image captured by the imaging unit 16d and showing the current surrounding environment in the right direction of the vehicle 1 is displayed.
  • in the display area 84, a display object representing the rear shape of the vehicle 1, which indicates the inclination of the vehicle 1 and the like, is displayed.
  • a switching button 85 is displayed on the right side of the display area 84.
  • the predicted trajectory 101 of the front wheel 3F is superimposed on the display area 81 and displayed.
  • FIG. 5 is a diagram showing a display example in the underfloor display mode.
  • the display area 81 displays at least an image showing the current underfloor road surface of the vehicle 1 in the surrounding environment.
  • the road surface around the vehicle 1 including the road surface under the floor of the vehicle 1 is displayed in the display area 81.
  • an image showing at least the current road surface under the floor is referred to as an underfloor image.
  • a contour line 102, which is a display object indicating the outer shape of the vehicle 1, is displayed together with a contour line 103, a display object indicating the outer shape of the front wheels 3F of the vehicle 1, and a contour line 104, a display object indicating the outer shape of the rear wheels 3R of the vehicle 1.
  • the outline 102 of the vehicle 1 is an example of identification information that can identify the current position of the vehicle 1.
  • the contour 103 of the front wheel 3F is an example of identification information that can identify the current position of the front wheel 3F.
  • the contour 104 of the rear wheel 3R is an example of identification information that can identify the current position of the rear wheel 3R.
  • the contour lines 102 to 104 are superimposed on the underfloor image. A region surrounded by the contour line 102 indicates a road surface under the current floor.
  • a passage locus 105 which is a display object indicating a locus through which the front wheel 3F has passed, is further displayed in the display area 81.
  • on off-road terrain such as rocky ground, a muddy road, or a snowy road, a point that the front wheels 3F have already passed over is relatively safe.
  • the periphery monitoring device of the embodiment presents to the driver at least the passing trajectory 105 and the contour line 104 of the rear wheels 3R, together with an image showing the current road surface under the floor, so that the driver can drive the vehicle so that the rear wheels 3R pass along the trajectory through which the front wheels 3F have already passed. Since safer driving is possible, the periphery monitoring device is more convenient than when the passing trajectory 105 or the contour line 104 of the rear wheels 3R is not presented.
  • the predicted trajectory 106 of the rear wheel 3R is further displayed in the display area 81.
  • the predicted trajectory 106 of the rear wheels 3R changes based on the tire angle. Since the tire angle is linked to the steering amount of the steering unit 4, the predicted locus 106 of the rear wheels 3R is linked to the steering amount of the steering unit 4.
  • the periphery monitoring device of the embodiment further presents the predicted trajectory 106 of the rear wheels 3R to the driver, so that the driver can see on the display screen 8, together with the trajectory that the front wheels 3F have passed so far, the trajectory that the rear wheels 3R will travel at the current steering amount. Therefore, the driver can more easily drive so that the rear wheels 3R pass along the trajectory through which the front wheels 3F have passed.
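  • the text does not give the formula behind the predicted trajectories; a kinematic bicycle model is one common choice. The sketch below is illustrative only (the function name, the fixed 0.1 m step, and the vehicle-fixed frame are assumptions, not taken from the source):

```python
import math

def predict_rear_wheel_trajectory(steering_angle_rad, wheelbase_m,
                                  step_m=0.1, n_steps=20):
    """Predict rear-axle positions with a kinematic bicycle model.

    The rear axle follows an arc whose radius is set by the front-wheel
    steering angle and the wheelbase.  Returns (x, y) points in a
    vehicle-fixed frame (x forward, y left).
    """
    points = []
    x = y = heading = 0.0
    for _ in range(n_steps):
        if abs(steering_angle_rad) < 1e-6:
            # straight-line motion
            x += step_m * math.cos(heading)
            y += step_m * math.sin(heading)
        else:
            radius = wheelbase_m / math.tan(steering_angle_rad)
            dtheta = step_m / radius  # heading change along the arc
            x += radius * (math.sin(heading + dtheta) - math.sin(heading))
            y += radius * (math.cos(heading) - math.cos(heading + dtheta))
            heading += dtheta
        points.append((x, y))
    return points
```

The returned points would be projected into the camera image before being overlaid as the predicted trajectory 106.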
  • the predicted locus 101 of the front wheel 3F is further displayed in the display area 81 in the underfloor display mode.
  • the predicted trajectory 101 of the front wheel 3F may not necessarily be displayed.
  • the passing trajectory 105 of the front wheel 3F is displayed as a dotted line, but the passing trajectory 105 of the front wheel 3F can be displayed in any other display manner.
  • the passing trajectory 105 of the front wheel 3F can be displayed by, for example, a solid line or a broken line.
  • the passage trajectory 105 of the front wheel 3F may be displayed with a frame line having the same width as the width of each wheel 3.
  • each of the predicted trajectories 101 and 106 can be displayed in an arbitrary display mode.
  • each of the predicted trajectories 101 and 106 has the same width as the width of each wheel 3 and is displayed with a frame line having a predetermined length.
  • the condition for switching between the normal display mode and the underfloor display mode is not limited to a specific condition.
  • switching between the normal display mode and the underfloor display mode is performed by touching the switching button 85.
  • the normal display mode may not be provided. There may be other display modes different from the normal display mode.
  • in the underfloor display mode, arbitrary information can be displayed in the display areas 82 to 84 of the display screen 8, and images different from those in the normal display mode may be displayed there. In the underfloor display mode, the underfloor image may be displayed on the display screen 8 together with the respective display objects (the contour lines 102 to 104, the passing trajectory 105 of the front wheels 3F, and the predicted trajectories 101 and 106). In the example of FIG. 5, the display areas 82 to 84 are displayed in the same manner as in the normal display mode.
  • the periphery monitoring device updates the display content at a predetermined rate in both the underfloor display mode and the normal display mode. Therefore, the occupant can visually recognize the current surrounding environment of the vehicle 1 as an image on the display screen 8 regardless of the display mode.
  • the update rate of the display content may be the same as or different from the rate at which the imaging unit 16 outputs an image.
  • an image showing the current road surface under the floor is selected from a plurality of images captured in the past, and the selected image is displayed as the underfloor image.
  • FIG. 6 is a diagram for explaining the concept of a method for acquiring an underfloor image according to the present embodiment.
  • the periphery monitoring device stores an image captured at a slightly earlier timing by the imaging unit 16a, and can use that image as an image showing the road surface over which the vehicle 1 is currently passing, that is, as the underfloor image.
  • at time t1, the road surface 501 over which the vehicle 1 is passing is not included in the imaging region 502 of the imaging unit 16a, but it was included in the imaging region of the imaging unit 16a at time t0.
  • the time t0 is a time before the time t1, at which the vehicle 1 was located behind, in the traveling direction, the position of the vehicle 1 at time t1.
  • the periphery monitoring device uses the image captured by the imaging unit 16a at time t0 as the underfloor image at time t1.
  • the method for acquiring the underfloor image is not limited to the above method.
  • an imaging unit that images the current road surface under the floor may be provided under the floor of the vehicle 1, and the periphery monitoring device may use an image output by the imaging unit as the underfloor image.
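  • as a sketch of the first acquisition method, the device can tag each stored frame with the odometer position at capture time and, at display time, pick the frame taken roughly one camera look-ahead distance behind the current position. Everything here (names, the 1.0 m look-ahead) is an illustrative assumption, not from the source:

```python
def select_underfloor_frame(buffer, current_pos_m, lookahead_m=1.0):
    """Pick the buffered frame captured when the vehicle was about
    `lookahead_m` behind its current position, i.e. the frame whose
    imaged road surface is now under the floor.

    `buffer` is a list of (odometer_position_m, frame) tuples in
    capture order; returns the best-matching (position, frame) tuple,
    or None if the buffer is empty.
    """
    target = current_pos_m - lookahead_m
    best = None
    best_err = float('inf')
    for pos, frame in buffer:
        err = abs(pos - target)
        if err < best_err:
            best_err, best = err, (pos, frame)
    return best
```

A real implementation would also check the vehicle's heading and discard frames captured too long ago.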
  • the vehicle 1 includes a steering system 13, a brake system 18, a steering angle sensor 19, an accelerator sensor 20, a shift sensor 21, a wheel speed sensor 22, an in-vehicle network 23, an ECU 24, and two acceleration sensors 26. (26a, 26b).
  • the ECU 24 is an example of a periphery monitoring device.
  • the monitor device 11, the steering system 13, the imaging units 16, the brake system 18, the steering angle sensor 19, the accelerator sensor 20, the shift sensor 21, the wheel speed sensor 22, the ECU 24, and the acceleration sensors 26 (26a, 26b) constitute the periphery monitoring system 100.
  • the monitor device 11, the steering system 13, the brake system 18, the steering angle sensor 19, the accelerator sensor 20, the shift sensor 21, the wheel speed sensor 22, the ECU 24, and the acceleration sensor 26 are electrically connected via an in-vehicle network 23 that is an electric communication line. Connected.
  • the in-vehicle network 23 is configured as, for example, a CAN (Controller Area Network).
  • the acceleration sensor 26a is a sensor that detects acceleration in the front-rear direction
  • the acceleration sensor 26b is a sensor that detects acceleration in the left-right direction.
  • the brake system 18 includes an actuator 18a and a brake sensor 18b.
  • the brake system 18 applies a braking force to the wheel 3 via the actuator 18a.
  • the brake sensor 18b transmits, for example, the position of a brake pedal as a movable part of the braking operation unit 6 to the ECU 24.
  • the steering angle sensor 19 is, for example, a sensor that detects the steering amount of the steering unit 4.
  • the steering angle sensor 19 transmits to the ECU 24 steering angle information such as the steering angle of the steering unit 4 operated by the driver and the steering angle during automatic steering.
  • the accelerator sensor 20 is a displacement sensor that detects the position of an accelerator pedal as a movable part of the acceleration operation unit 5, for example.
  • the accelerator sensor 20 transmits the position of the acceleration operation unit 5 to the ECU 24.
  • the shift sensor 21 is, for example, a sensor that detects the position of the movable part of the shift operation unit 7 and transmits the position of the movable part of the shift operation unit 7 to the ECU 24 as shift information.
  • the wheel speed sensor 22 is a sensor that detects the amount of rotation of the wheel 3 or the number of rotations per unit time.
  • the wheel speed sensor 22 transmits the wheel speed pulse number indicating the detected rotation speed to the ECU 24 as wheel speed information.
  • the ECU 24 can receive, via the in-vehicle network 23, the detection results of sensors such as the torque sensor 13b, the brake sensor 18b, the steering angle sensor 19, the accelerator sensor 20, the shift sensor 21, the wheel speed sensor 22, and the acceleration sensors 26, as well as instruction signals (control signals, operation signals, input signals, data) from the operation input unit 10 and the like.
  • the ECU 24 receives an image from the imaging unit 16.
  • the ECU 24 may receive an image from the imaging unit 16 via the in-vehicle network 23.
  • the ECU 24 is, for example, a computer.
  • the ECU 24 includes a CPU (Central Processing Unit) 24a, a ROM (Read Only Memory) 24b, a RAM (Random Access Memory) 24c, a display control unit 24d, an audio control unit 24e, and an SSD (Solid State Drive) 24f.
  • the CPU 24a, ROM 24b, and RAM 24c may be integrated in the same package.
  • the CPU 24a reads a program stored in a nonvolatile storage device such as the ROM 24b, and executes various arithmetic processes and controls according to the program.
  • the CPU 24a executes, for example, image processing related to an image to be displayed on the display screen 8.
  • the ROM 24b stores a program and parameters necessary for executing the program.
  • the RAM 24c temporarily stores various types of data used in computations by the CPU 24a.
  • the display control unit 24d mainly executes, among the arithmetic processing in the ECU 24, processing of captured images acquired from the imaging units 16 and output to the CPU 24a, data conversion of display images acquired from the CPU 24a and displayed on the display screen 8, and the like.
  • the audio control unit 24e mainly executes, among the arithmetic processing in the ECU 24, processing of audio data acquired from the CPU 24a and output to the audio output device 9.
  • the SSD 24f is a rewritable nonvolatile storage unit, and maintains data acquired from the CPU 24a even when the power of the ECU 24 is turned off.
  • FIG. 7 is a block diagram showing a functional configuration of the ECU 24 as the periphery monitoring device according to the present embodiment.
  • the CPU 24a executes a program stored in advance in the ROM 24b, thereby realizing an acquisition unit 401, an angle calculation unit 402, a storage processing unit 403, a reception unit 405, and a display processing unit 406.
  • the program for realizing these functional configurations may be provided via any computer-readable recording medium other than the ROM 24b.
  • the ECU 24 implements a ring buffer 404 on the RAM 24c.
  • part or all of the acquisition unit 401, the angle calculation unit 402, the storage processing unit 403, the reception unit 405, and the display processing unit 406 may be realized by a hardware circuit, or by a combination of a hardware circuit and software (a program).
  • the acquisition unit 401 acquires various information from various sensors provided in the vehicle 1.
  • the acquisition unit 401 according to the present embodiment acquires the images output from the imaging units 16a to 16d, which are provided in the vehicle 1 and image the periphery of the vehicle 1, and the acceleration data output from the acceleration sensors 26a and 26b provided in the vehicle 1. Further, the acquisition unit 401 acquires the steering angle information from the steering angle sensor 19 and the wheel speed information from the wheel speed sensor 22.
  • the acquisition unit 401 associates with each other the images, acceleration data, steering angle information, and wheel speed information whose acquisition times are substantially the same.
  • the angle calculation unit 402 calculates the tilt angle (pitch angle and roll angle) of the vehicle 1 based on the acceleration data acquired from the acceleration sensors 26a and 26b.
  • the pitch angle is an angle indicating the inclination around the left and right axes of the vehicle 1
  • the roll angle is an angle indicating the inclination around the longitudinal axis of the vehicle 1.
  • the angle calculation unit 402 associates the roll angle and pitch angle calculated from the acceleration data with an image associated with the acceleration data. This makes it possible to recognize the roll angle and pitch angle of the vehicle 1 when the image is captured.
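  • the text does not state how pitch and roll are derived from the two acceleration sensors; a common quasi-static approach reads the gravity component seen on each axis. A minimal sketch, with illustrative sign conventions (nose-up pitch and right-side-down roll taken as positive):

```python
import math

G = 9.80665  # standard gravity, m/s^2

def tilt_from_acceleration(accel_longitudinal, accel_lateral):
    """Estimate pitch and roll (radians) from the gravity component
    measured by the front-rear and left-right acceleration sensors
    while the vehicle is quasi-static.

    The arguments are clamped to [-G, G] so asin() stays defined even
    with small sensor noise.
    """
    pitch = math.asin(max(-1.0, min(1.0, accel_longitudinal / G)))
    roll = math.asin(max(-1.0, min(1.0, accel_lateral / G)))
    return pitch, roll
```

During acceleration or braking the longitudinal axis also sees motion-induced acceleration, so a production system would filter or fuse this estimate rather than use it raw.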
  • the storage processing unit 403 includes a correction unit 411, an estimation unit 412, and a storage unit 413, and generates and stores an image to be displayed on the display screen 8.
  • the correction unit 411 performs rotation correction on the image captured by the imaging unit 16a and showing the surrounding environment in front of the vehicle 1.
  • the correction unit 411 performs rotation correction according to the roll angle associated with the image, using as the origin the position coordinate in the image that corresponds to the center of the lens used for imaging by the imaging unit 16a.
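  • the rotation correction can be sketched as rotating pixel coordinates by the negative roll angle about the lens-center origin; a real implementation would resample the whole image, but one coordinate shows the geometry (the function name is an illustrative assumption):

```python
import math

def roll_correct_point(x, y, cx, cy, roll_rad):
    """Map a pixel (x, y) to its roll-corrected position by rotating
    it about the lens center (cx, cy) by -roll, so that the corrected
    image is level.
    """
    c, s = math.cos(-roll_rad), math.sin(-roll_rad)
    dx, dy = x - cx, y - cy
    # standard 2-D rotation about (cx, cy)
    return cx + c * dx - s * dy, cy + s * dx + c * dy
```

Applying this mapping to every destination pixel (with interpolation) would produce the corrected image stored in the ring buffer.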
  • the estimation unit 412 estimates the current position of the vehicle 1.
  • the position of the vehicle 1 is a predetermined position of the vehicle 1 (for example, the center of the vehicle 1).
  • the method for estimating the position of the vehicle 1 is not limited to a specific method.
  • the estimation unit 412 calculates the amount of movement of the vehicle 1 from the previously imaged position based on the acceleration data, the steering angle information, and the wheel speed information acquired by the acquisition unit 401. Then, the estimation unit 412 estimates the position of the vehicle 1 from the movement amount.
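  • this dead-reckoning step (distance from wheel-speed pulses, heading change from the steering angle) can be sketched as follows; the bicycle-model update and all parameter names are illustrative assumptions, not from the source:

```python
import math

def update_pose(x, y, heading, wheel_pulses, pulses_per_metre,
                steering_angle_rad, wheelbase_m):
    """Advance the estimated vehicle pose by one sensor interval.

    Distance travelled comes from the wheel-speed pulse count; the
    heading change comes from the steering angle via a kinematic
    bicycle model.  Returns the updated (x, y, heading).
    """
    distance = wheel_pulses / pulses_per_metre
    heading += distance * math.tan(steering_angle_rad) / wheelbase_m
    x += distance * math.cos(heading)
    y += distance * math.sin(heading)
    return x, y, heading
```

Repeating this update for every sensor interval yields the movement amount from the previously imaged position.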
  • alternatively, the estimation unit 412 may calculate an optical flow between an image captured in the past and a more recently captured image, calculate the movement amount of the vehicle 1 based on the calculated optical flow, and estimate the position of the vehicle 1 from that movement amount.
  • the vehicle 1 may include a GPS (not shown), and the estimation unit 412 may estimate the position of the vehicle 1 by using a signal from the GPS.
  • the estimation unit 412 estimates the current direction of the vehicle 1.
  • the method of estimating the orientation of the vehicle 1 is not limited to a specific method; for example, the orientation can be estimated by a method similar to that used for the position of the vehicle 1.
  • the storage unit 413 stores the image corrected by the correction unit 411 in the ring buffer 404 together with the position information and the direction information of the vehicle 1.
  • the present embodiment does not limit the interval at which images are stored in the ring buffer 404, as long as an image showing the road surface being passed over at an arbitrary timing can be acquired from the images stored in the past.
  • the position information and the orientation information stored in the ring buffer 404 are used as a travel history for calculating the passing trajectory 105 of the front wheel 3F. That is, the ring buffer 404 corresponds to a storage unit that stores a travel history of the vehicle 1.
  • the periphery monitoring device obtains discrete data indicating the passing trajectory 105 of the front wheel 3F by calculating the position of the front wheel 3F for each piece of position information.
  • the passing trajectory 105 of the front wheel 3F is preferably accurate to some extent, and for this purpose, the granularity of the stored position information and orientation information is desirably fine to some extent. Therefore, the storage interval is set to a value that is not too long.
  • an image, position information, and orientation information may be stored every time a distance of 0.1 m is traveled.
  • the travel history may be stored separately from the position information and the orientation information associated with the image, and in this case, the image storage interval is not limited to these.
  • FIG. 8 is a diagram showing the structure of the ring buffer 404.
  • the ring buffer 404 stores an image and the position information and direction information of the vehicle 1 at the time of capturing the image in association with each other.
  • the ring buffer 404 is a storage area logically arranged in a ring shape.
  • an image requested to be stored overwrites the least recently updated area.
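The overwrite behavior of the ring buffer 404 described above can be sketched as follows; the class and field names are illustrative, not from the source.

```python
class TravelRingBuffer:
    """Fixed-capacity buffer; once full, a new entry overwrites the
    least recently updated slot (illustrative sketch of the ring buffer 404)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = []   # each entry: (image, position, orientation)
        self.next_slot = 0  # index of the slot to write next

    def save(self, image, position, orientation):
        entry = (image, position, orientation)
        if len(self.entries) < self.capacity:
            self.entries.append(entry)          # still filling up
        else:
            self.entries[self.next_slot] = entry  # overwrite the oldest slot
        self.next_slot = (self.next_slot + 1) % self.capacity

# Saving four entries into a three-slot buffer overwrites the first one
buf = TravelRingBuffer(3)
for i in range(1, 5):
    buf.save(f"img{i}", (0.1 * i, 0.0), 0.0)
```

After the loop the buffer holds img4, img2, img3: img4 replaced img1 in the oldest slot.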
  • the accepting unit 405 accepts an instruction signal (control signal) from the operation input unit 10 or the like.
  • the receiving unit 405 according to the present embodiment receives a display mode switching operation from these instruction signals.
  • the display processing unit 406 performs display processing on the display screen 8 in a display mode according to the operation received by the receiving unit 405.
  • the display processing unit 406 includes an image acquisition unit 421, a passage locus calculation unit 422, a prediction locus calculation unit 423, and an output unit 424.
  • the image acquisition unit 421 acquires an image indicating a road surface under the floor of the vehicle 1, that is, an underfloor image.
  • the image acquisition unit 421 reads from the ring buffer 404 an image showing a road surface on which the vehicle 1 is currently passing.
  • the image acquisition unit 421 acquires the image captured at the first timing and stored in the ring buffer 404 as an underfloor image at the second timing, when the vehicle 1 passes over the road surface shown in that image.
  • the second timing is a timing after the first timing.
  • the image acquisition unit 421 selects, from a plurality of images stored in the ring buffer 404, an image showing a road surface on which the vehicle 1 is currently passing. Which image shows the road surface on which the vehicle 1 is currently passing can be determined based on position information associated with each image, for example.
  • an image is selected as follows.
  • the direction of the optical axis and the angle of view of the imaging unit 16 are known or can be acquired.
  • the image acquisition unit 421 arranges an object indicating the vehicle 1 on a virtual plan view, and further maps the imaging range of each stored image on the plan view.
  • the image acquisition unit 421 can specify the position where the imaging range of each image is mapped, based on the position information and orientation information associated with each image and on the direction of the optical axis and the angle of view of the imaging unit 16.
  • the image acquisition unit 421 selects an image including an object indicating the vehicle 1 in the imaging range as an image showing a road surface on which the vehicle 1 is currently passing.
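The selection step above (mapping each stored image's imaging range onto a virtual plan view and choosing an image whose range contains the vehicle) can be sketched as follows, simplifying the mapped imaging ranges to axis-aligned rectangles; the names and the rectangle simplification are assumptions for illustration.

```python
def select_underfloor_image(stored, vehicle_pos):
    """Return the stored image whose mapped imaging range contains the
    current vehicle position; ranges are simplified to axis-aligned
    rectangles ((x0, y0), (x1, y1)) on the virtual plan view."""
    x, y = vehicle_pos
    for entry in reversed(stored):  # prefer the most recently stored match
        (x0, y0), (x1, y1) = entry["range"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return entry["image"]
    return None  # no stored image covers the current position

stored = [
    {"image": "A", "range": ((0.0, 0.0), (2.0, 2.0))},
    {"image": "B", "range": ((1.0, 1.0), (3.0, 3.0))},
]
```

A position inside both ranges resolves to the newer image B; one covered only by A resolves to A.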
  • the image acquisition unit 421 may select two or more images from a plurality of images stored in the ring buffer 404 and combine the selected two or more images. Combining is to connect a plurality of images seamlessly.
  • the timing at which two or more images before composition are taken may be different.
  • the image acquisition unit 421 may combine part or all of the image captured by the imaging unit 16a at time t2 with part or all of the image captured by the imaging unit 16a at time t3 after time t2, to generate the underfloor image at the current time t4 after time t3.
  • the two or more images before synthesis may have been captured by different imaging units 16.
  • the image acquisition unit 421 may combine part or all of the image captured by the imaging unit 16a at time t5 with part or all of the image captured by the imaging unit 16b at the same time t5, to generate the underfloor image at the current time t6 after time t5.
  • the image acquisition unit 421 may perform processing other than synthesis on the image before synthesis or the image after synthesis.
  • the processing includes, for example, clipping, masking, filtering on a part or the whole of the image, correction, or viewpoint conversion.
  • the correction is, for example, distortion correction or gamma correction.
  • the viewpoint conversion is, for example, generating a bird's-eye view image from an image captured by the imaging unit 16.
  • the image acquisition unit 421 may process the selected image even when only one image is selected as the underfloor image. The processing can be executed at an arbitrary timing from when the image is acquired until it is output to the display screen 8.
  • the passing locus calculation unit 422 calculates the passing locus 105 of the front wheel 3F based on the traveling history accumulated in the ring buffer 404.
  • the passage locus calculating unit 422 calculates the passage locus 105 for each of the two front wheels 3F.
  • the calculation method of the passage locus 105 of the front wheel 3F is not limited to a specific method.
  • the passage trajectory calculation unit 422 reads a plurality of position information and direction information stored in a nearby predetermined period from the ring buffer 404. Then, the passage trajectory calculation unit 422 calculates the position of the front wheel 3F corresponding to each of the plurality of read position information for each read position information.
  • the position where the front wheel 3F is attached in the vehicle 1 is known or can be acquired.
  • the position of the front wheel 3F is uniquely determined by the combination of the position information, the direction information of the vehicle 1, and the position where the front wheel 3F is attached in the vehicle 1.
  • the passage trajectory calculation unit 422 can acquire the position of the front wheel 3F in chronological order by calculating the position of the front wheel 3F in order from the oldest timing when the corresponding position information is stored. Note that the passing locus calculation unit 422 may calculate the position of the front wheel 3F in the reverse order of the time series. A column in which the positions of the front wheels 3F are arranged in time series order or in the reverse order of the time series corresponds to discrete data indicating the passage trajectory 105 of the front wheels 3F.
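Computing a wheel position from a stored vehicle pose and the known mounting position of the wheel, as described above, amounts to a rigid-body transform: rotate the mounting offset by the vehicle orientation, then translate by the vehicle position. A minimal sketch with illustrative offsets (1.4 m forward, 0.8 m left of the vehicle reference point):

```python
import math

def wheel_world_position(veh_x, veh_y, veh_heading, offset_forward, offset_left):
    """Rotate the wheel's mounting offset (vehicle frame) by the vehicle
    heading and translate by the vehicle position (world frame)."""
    wx = veh_x + offset_forward * math.cos(veh_heading) - offset_left * math.sin(veh_heading)
    wy = veh_y + offset_forward * math.sin(veh_heading) + offset_left * math.cos(veh_heading)
    return wx, wy

# Discrete passing-trajectory data: one wheel position per stored pose
history = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.2, 0.0, 0.0)]  # (x, y, heading)
trajectory = [wheel_world_position(x, y, h, 1.4, 0.8) for x, y, h in history]
```

Applying this per stored pose, in time-series order, yields the discrete data for the passing trajectory 105.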
  • the predicted trajectory calculation unit 423 calculates the predicted trajectories 101 and 106 of the wheels 3 based on the steering information acquired by the acquisition unit 401. In particular, in the present embodiment, the predicted trajectory calculation unit 423 calculates the predicted trajectory 106 of the rear wheel 3R based on the steering information.
  • the calculation method of the predicted trajectory 106 of the rear wheel 3R is not limited to a specific method.
  • FIG. 9 is a diagram for explaining an example of a calculation method of the predicted trajectory 106 of the rear wheel 3R.
  • when the driver changes the tire angle of the front wheel 3F by operating the steering unit 4, the vehicle turns around the turning center G1 located at the intersection of the line orthogonal to the direction 3Fa of the front wheel 3F and the extension of the rear wheel axle 38 that supports the rear wheel 3R.
  • the rear wheel 3R also moves along an arc centered on the turning center G1.
  • the predicted trajectory calculation unit 423 calculates, as the predicted trajectory 106 of the rear wheel 3R, the portion of the arc drawn by the rear wheel 3R around the turning center G1 that the rear wheel 3R will follow from the present until a predetermined short time has elapsed.
  • the predicted trajectory calculation unit 423 calculates the tire angle of the front wheel 3F from the steering information, and calculates the predicted trajectory 106 of the rear wheel 3R based on the calculated tire angle and the relationship shown in FIG.
  • the acquisition unit 401 may acquire the tire angle of the front wheel 3F as steering information, and the predicted trajectory calculation unit 423 may use the acquired tire angle of the front wheel 3F for the calculation of the predicted trajectory 106 of the rear wheel 3R.
  • the predicted trajectory calculation unit 423 may calculate the predicted trajectory 106 of the rear wheel 3R in consideration of the tire angle of the rear wheel 3R.
  • the predicted trajectory 106 of the rear wheel 3R can be obtained by calculating an arc in which the rear wheel 3R moves when turning is performed during normal traveling.
  • the predicted trajectory calculation unit 423 calculates the predicted trajectory 101 of the front wheel 3F in the same procedure as the predicted trajectory 106 of the rear wheel 3R.
  • the predicted trajectory calculation unit 423 calculates, as the predicted trajectory 101 of the front wheel 3F, the portion of the arc drawn by the front wheel 3F around the turning center G1 that the front wheel 3F will follow from the present until a predetermined short time has elapsed.
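The arc construction described above (a turning radius derived from the tire angle, then a short arc around G1) can be sketched with a bicycle-model approximation in which the rear wheel's turning radius is wheelbase / tan(tire angle). The function and its parameters are an illustrative sketch, not the patent's formula.

```python
import math

def predicted_arc(tire_angle, wheelbase, arc_length, n_points=10):
    """Sample the short arc a wheel follows around the turning center G1.

    The wheel starts at the origin heading along +x, so G1 lies at
    (0, radius). Bicycle-model approximation; values are illustrative.
    """
    if abs(tire_angle) < 1e-6:  # driving straight: the arc degenerates to a line
        return [(arc_length * i / n_points, 0.0) for i in range(1, n_points + 1)]
    radius = wheelbase / math.tan(tire_angle)  # distance from wheel to G1
    points = []
    for i in range(1, n_points + 1):
        theta = (arc_length * i / n_points) / radius  # swept angle so far
        points.append((radius * math.sin(theta), radius * (1.0 - math.cos(theta))))
    return points
```

Every sampled point lies on the circle of that radius centered on G1, which is the geometric fact behind FIG. 9.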
  • the output unit 424 displays the underfloor image, the contour lines 102 to 104, the passing trajectory 105 of the front wheel 3F, and the predicted trajectories 101 and 106 on the display screen 8.
  • FIG. 10 is a flowchart illustrating a procedure of image storage processing in the periphery monitoring device according to the embodiment.
  • the process of FIG. 10 is executed for each control cycle.
  • the control cycle is sufficiently shorter than the cycle determined as Yes in the determination process of S106.
  • the imaging unit 16 images the surrounding environment of the vehicle 1 (S101).
  • the imaging unit 16 a images an area including the road surface in the traveling direction of the vehicle 1.
  • the acquisition unit 401 acquires an image from the imaging unit 16 and acceleration data from the acceleration sensor 26 (S102). Then, the angle calculation unit 402 calculates the roll angle and pitch angle of the vehicle 1 from the acceleration data (S103). Then, the correction unit 411 performs rotation correction corresponding to the roll angle on the captured image (S104).
  • the estimation unit 412 calculates the movement amount of the vehicle 1 based on the acceleration data, the steering angle information, and the wheel speed information, and estimates the current position and orientation of the vehicle 1 (S105).
  • the storage unit 413 determines whether or not the position of the vehicle 1 has changed by a predetermined distance (for example, 0.1 m) from the position when the image was stored last time (S106).
  • if the storage unit 413 determines that the position of the vehicle 1 has not changed by the predetermined distance or more (S106: No), the image is not saved in this control cycle.
  • note that the present invention is not limited to such a storage method; for example, the image may be saved when the steering angle has changed by a predetermined angle or more.
  • when the storage unit 413 determines that the position of the vehicle 1 has changed by the predetermined distance (for example, 0.1 m) or more from the position when the image was stored last time (S106: Yes), the storage unit 413 overwrites the least recently updated area of the ring buffer 404 with the corrected current image (S107). At that time, the storage unit 413 stores the position information and orientation information at the time of imaging in association with the image.
  • the storage unit 413 updates the position information and orientation information at the time of imaging of each image stored in the ring buffer 404 to position information and orientation information based on the current position and orientation (S108).
  • FIG. 11 is a flowchart illustrating a procedure of processing for controlling display in the periphery monitoring device according to the embodiment.
  • the receiving unit 405 determines whether or not an operation for starting the underfloor display mode has been received (S201). In one example, when the touch of the switching button 85 by the occupant is detected, the operation input unit 10 detects that and notifies the ECU 24 of the fact. When the notification of the touch of the switching button 85 is received when the display mode is the normal display mode, the reception unit 405 determines that an operation for starting the underfloor display mode has been received.
  • the condition for starting the underfloor display mode is not limited to this.
  • if the reception unit 405 does not determine that an operation for starting the underfloor display mode has been received (S201: No), the process proceeds to S212 described below.
  • when the operation for starting the underfloor display mode has been received (S201: Yes), the image acquisition unit 421 acquires from the ring buffer 404 an image of the road surface the vehicle 1 is currently passing over (S202).
  • the passage locus calculation unit 422 calculates the passage locus 105 of the front wheel 3F based on the past position information and orientation information stored in the ring buffer 404 (S203). Further, the predicted trajectory calculation unit 423 acquires the steering angle information via the acquisition unit 401 (S204). Then, the predicted trajectory calculation unit 423 calculates the predicted trajectory 101 of the front wheel 3F and the predicted trajectory 106 of the rear wheel 3R based on the acquired steering angle information (S205).
  • the passage trajectory 105 and the predicted trajectories 101 and 106 can be expressed using an arbitrary coordinate system.
  • the passage trajectory 105 and the predicted trajectories 101 and 106 are expressed in a coordinate system based on the current position and orientation of the vehicle 1.
  • the output unit 424 superimposes the contour line 102 of the vehicle 1 on the image acquired by the image acquisition unit 421 (S206). At this time, the output unit 424 aligns the contour line 102 of the vehicle 1 with the image so that the contour line 102 corresponds to the position in the image where the vehicle 1 is currently located. Alignment means adjusting the position and orientation of the contour line 102 of the vehicle 1 so that the correspondence between the image and the contour line 102 matches the correspondence between the current vehicle 1 and the road surface currently under its floor.
  • the alignment method is not limited to a specific method.
  • in one example, the output unit 424 specifies, within the region shown in the acquired image, the position where the vehicle 1 currently exists and the current orientation of the vehicle 1, based on the current position and orientation of the vehicle 1, the position information and orientation information associated with the image, and the direction of the optical axis and the angle of view of the imaging unit 16.
  • the direction of the optical axis and the angle of view of the imaging unit 16 are known or can be acquired.
  • the output unit 424 may adjust the position and orientation of the contour line 102 of the vehicle 1 with reference to the image, or may adjust the position and orientation of the image with reference to the contour line 102 of the vehicle 1.
  • the output unit 424 may adjust the image so that the contour line 102 of the vehicle 1 is located at the center of the display area 81 and the orientation of the contour line 102 is aligned with the display area 81.
  • the output unit 424 may perform processing such as viewpoint conversion on the image or the outline 102 of the vehicle 1 as necessary.
  • the output unit 424 superimposes the contour 103 of the front wheel 3F and the contour 104 of the rear wheel 3R on the image (S207).
  • the output unit 424 performs alignment in S207 as in S206. Since the positions at which the front wheel 3F and the rear wheel 3R are attached in the vehicle 1 are known or can be acquired, the output unit 424 determines the positions at which the contour 103 of the front wheel 3F and the contour 104 of the rear wheel 3R are superimposed based on, for example, the position of the contour 102 of the vehicle 1 and the attachment positions of the front wheel 3F and the rear wheel 3R in the vehicle 1.
  • the output unit 424 may rotate and superimpose the contour lines 103 and 104 of each wheel 3 by an angle corresponding to the steering angle information.
  • the contour 103 of the front wheel 3F is rotated and displayed by an angle corresponding to the steering angle information.
  • the output unit 424 may rotate and superimpose the contour line 104 of the rear wheel 3R.
  • the contour 102 of the vehicle 1, the contour 103 of the front wheel 3F, and the contour 104 of the rear wheel 3R may be grouped in advance, and the grouped contours 102 to 104 may be superimposed together.
  • the output unit 424 superimposes the passing trajectory 105 of the front wheel 3F on the image (S208), and superimposes the predicted trajectory 101 of the front wheel 3F and the predicted trajectory 106 of the rear wheel 3R on the image (S209).
  • the output unit 424 aligns the image and each locus using the same method as in S207.
  • the output part 424 outputs the image on which each display object was superimposed on the display screen 8 (S210).
  • the receiving unit 405 determines whether or not an operation for ending the underfloor display mode has been received (S211).
  • in one example, when the touch of the switching button 85 by the occupant is detected, the operation input unit 10 notifies the ECU 24 of the fact.
  • when the notification of the touch of the switching button 85 is received while the display mode is the underfloor display mode, the accepting unit 405 determines that an operation for ending the underfloor display mode has been accepted.
  • the processing from S202 to S211 is looped at a predetermined short cycle until the underfloor display mode ends. Since the underfloor image is sequentially updated by the processing of S210 in each loop, the occupant can visually recognize the current road surface under the floor of the vehicle 1 as an image on the display screen 8 while the vehicle 1 is traveling. Further, the contour line 104 of the rear wheel 3R is updated in each loop. In addition, position information as a travel history is accumulated during travel, and in each loop the passing trajectory 105 of the front wheel 3F is updated according to the accumulated travel history. Further, the steering angle information is acquired in each loop, and the predicted trajectory 106 of the rear wheel 3R is updated according to the acquired steering angle information.
  • in other words, the occupant can confirm, in substantially real time through the display screen 8, the trajectory of the front wheel 3F, the position of the rear wheel 3R, and the predicted trajectory of the rear wheel 3R, all of which change as the vehicle 1 travels up to the present.
  • the display processing unit 406 displays the display content of the normal display mode on the display screen 8 (S212). Then, the process returns to S201.
  • the traveling history for calculating the passing trajectory 105 of the front wheel 3F has been described as including position information and direction information.
  • the travel history is not limited to this.
  • the travel history includes only position information.
  • the passage trajectory calculation unit 422 may calculate a vector connecting two pieces of position information with consecutive stored timings as the direction of the vehicle 1 at each imaging timing.
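The heading-from-consecutive-positions idea in this modification can be sketched in one line; the function name is illustrative.

```python
import math

def heading_from_positions(p_prev, p_next):
    """Approximate the vehicle orientation at a sample as the direction of
    the vector between two consecutively stored positions."""
    return math.atan2(p_next[1] - p_prev[1], p_next[0] - p_prev[0])
```

For example, a move from (0, 0) to (1, 1) yields a heading of 45 degrees (pi/4 radians).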
  • the traveling history may include the position of the front wheel 3F.
  • in this case, the estimation unit 412 estimates the position of the front wheel 3F based on the position of the vehicle 1 and the position where the front wheel 3F is attached in the vehicle 1, and the storage unit 413 saves the estimated position of the front wheel 3F in the ring buffer 404.
  • the passage trajectory calculation unit 422 then obtains discrete data indicating the passage trajectory 105 of the front wheel 3F by reading out from the ring buffer 404 the positions of the front wheel 3F stored during a recent predetermined period.
  • the above technique can also be applied when the vehicle 1 moves backward.
  • the traveling direction of the vehicle 1 is the backward direction of the vehicle 1. Therefore, the image acquisition unit 421 can acquire an underfloor image from the image captured by the imaging unit 16c.
  • the periphery monitoring device may display the passing trajectory of the rear wheel 3R and the display object indicating the position of the front wheel 3F on the display screen 8 so as to be superimposed on the image. Further, when the vehicle 1 moves backward, the periphery monitoring device may superimpose the predicted trajectory of the front wheel 3F on the image and display it on the display screen 8.
  • the ring buffer 404 stores the travel history of the vehicle 1.
  • the image acquisition unit 421 acquires an image indicating a road surface under the floor of the vehicle 1.
  • the passage locus calculation unit 422 calculates the passage locus 105 of the front wheel 3F based on the travel history.
  • the output unit 424 superimposes the passing trajectory 105 of the front wheel 3F and the contour 104 of the rear wheel 3R, which indicates the position of the rear wheel 3R, on an image showing the road surface under the floor of the vehicle 1, and displays them on the display screen 8.
  • the driver can perform a driving operation such that the rear wheel 3R passes on the trajectory through which the front wheel 3F has passed. Therefore, the convenience of the periphery monitoring device is high.
  • the predicted trajectory calculation unit 423 calculates the predicted trajectory 106 of the rear wheel 3R based on the steering information. Then, the output unit 424 further superimposes and displays the predicted trajectory 106 of the rear wheel 3R on the display screen 8. As a result, the driver can more easily perform driving such that the rear wheel 3R is allowed to pass on the locus through which the front wheel 3F has passed.
  • the predicted trajectory calculation unit 423 calculates the predicted trajectory 101 of the front wheel 3F based on the steering information. Then, the output unit 424 further superimposes and displays the predicted trajectory 101 of the front wheel 3F on the display screen 8. Since the driver can confirm on the display screen 8 the position to which the front wheel 3F will advance, the driver can drive the vehicle 1 more safely.
  • the ring buffer 404 stores an image captured at the first timing by the imaging unit 16 that images the road surface in the traveling direction of the vehicle 1.
  • the image acquisition unit 421 moves the image captured at the first timing stored in the ring buffer 404 under the floor of the vehicle 1 at the second timing when the vehicle 1 passes on the road surface reflected in the image. Get as an image to show.
  • the second timing is a timing after the first timing.
  • DESCRIPTION OF SYMBOLS: 1 ... Vehicle, 2 ... Vehicle body, 2a ... Cabin, 2b ... Seat, 2c, 2d, 2e, 2f ... End part, 2g ... Door mirror, 2h ... Door, 3 ... Wheel, 3F ... Front wheel, 3R ... Rear wheel, 4 ... Steering part, 5 ... Acceleration operation part, 6 ... Braking operation part, 7 ... Shift operation part, 8 ... Display screen, 9 ... Audio ... Torque sensor, 16, 16a, 16b, 16c, 16d ... Imaging unit, 18 ... Brake system, 18a ... Actuator, 18b ... Brake sensor, 19 ... Steering angle sensor, 20 ... Accelerator sensor, 21 ... Shift sensor, 22 ... Wheel speed sensor, 23 ... In-vehicle network, 24 ... ECU, 24a ... CPU, 24b ... ROM, 24c ... RAM, 24d ... Display control unit, 24e ... Voice control unit, 24f ... SD, 26, 26a, 26b ... Acceleration sensor, 38 ... Rear wheel axle, 38a ... Center point, 38b ... Arc, 81, 82, 83, 84 ... Display area, 85 ... Switching button, 100 ... Periphery monitoring system, 101, 106 ... Predicted trajectory, 102, 103, 104 ... Contour line, 105 ... Passing trajectory, 401 ... Acquisition unit, 402 ... Angle calculation unit, 403 ... Storage processing unit, 404 ... Ring buffer, 405 ... Reception unit, 406 ... Display processing unit, 411 ... Correction unit, 412 ... Estimation unit, 413 ... Storage unit, 421 ... Image acquisition unit, 422 ... Passing trajectory calculation unit, 423 ... Predicted trajectory calculation unit, 424 ... Output unit, 501 ... Road surface, 502, 503 ... Imaging area.

Abstract

A periphery monitoring device according to an embodiment is provided with: a storage unit for storing a travel history of a vehicle; an image acquisition unit for acquiring a first image showing the road surface underneath the floor of the vehicle; a first calculation unit for calculating, on the basis of the travel history, a first trajectory through which the front wheels of the vehicle have passed; and an output unit that produces a superimposed display of the first image, the first trajectory, and display objects indicating the positions of the rear wheels of the vehicle on a display screen provided inside the passenger compartment of the vehicle.

Description

Periphery monitoring device
Embodiments of the present invention relate to a periphery monitoring device mounted on a vehicle.
Conventionally, a technique has been proposed in which an imaging device installed in a vehicle images the surrounding environment of the vehicle and displays an image as the imaging result.
JP 2016-021653 A
For example, when the vehicle travels on off-road terrain, the selection of the route through which the rear wheels pass may be important. In this respect, there is room for improvement in the conventional technology.
One of the objects of the present invention is to provide a periphery monitoring device that is highly convenient.
As an example, a periphery monitoring device according to an embodiment of the present invention includes: a storage unit that stores a travel history of a vehicle; an image acquisition unit that acquires a first image showing a road surface under the floor of the vehicle; a first calculation unit that calculates, based on the travel history, a first trajectory through which a front wheel of the vehicle has passed; and an output unit that superimposes and displays the first image, the first trajectory, and a display object indicating the position of a rear wheel of the vehicle on a display screen provided in the vehicle interior. Therefore, the driver can perform a driving operation such that the rear wheel passes on the trajectory through which the front wheel has passed, so the periphery monitoring device according to the embodiment is highly convenient.
As an example, the periphery monitoring device according to the embodiment further includes a second calculation unit that calculates, based on steering information, a second trajectory indicating the predicted trajectory of the rear wheel of the vehicle, and the output unit further superimposes and displays the second trajectory on the display screen. As a result, the driver can more easily perform a driving operation in which the rear wheel is passed on the trajectory through which the front wheel has passed.
As an example, in the periphery monitoring device according to the embodiment, the second calculation unit further calculates, based on the steering information, a third trajectory indicating the predicted trajectory of the front wheel of the vehicle, and the output unit further superimposes and displays the third trajectory on the display screen. Therefore, the driver can check on the display screen the position to which the front wheel will advance, and can drive the vehicle more safely.
As an example, in the periphery monitoring device according to the embodiment, the storage unit stores a second image captured at a first timing by an imaging device that is provided in the vehicle and images the road surface in the traveling direction of the vehicle, and the image acquisition unit acquires, as the first image, the second image stored in the storage unit at a second timing after the first timing, when the vehicle passes over the road surface shown in the second image. Therefore, it is not necessary to install an imaging device that directly images the road surface under the floor of the vehicle.
FIG. 1 is a perspective view illustrating an example of a state in which a part of the passenger compartment of a vehicle on which the periphery monitoring device according to the embodiment is mounted is seen through.
FIG. 2 is a plan view (bird's-eye view) illustrating an example of a vehicle on which the periphery monitoring device according to the embodiment is mounted.
FIG. 3 is a block diagram illustrating an example of the configuration of a periphery monitoring system including the periphery monitoring device according to the embodiment.
FIG. 4 is a diagram illustrating a display example in the normal display mode output by the periphery monitoring device according to the embodiment.
FIG. 5 is a diagram illustrating a display example in the underfloor display mode output by the periphery monitoring device according to the embodiment.
FIG. 6 is a diagram for explaining the concept of a method by which the periphery monitoring device according to the embodiment acquires an underfloor image.
FIG. 7 is a block diagram illustrating a functional configuration of the ECU as the periphery monitoring device according to the embodiment.
FIG. 8 is a diagram illustrating the structure of a ring buffer included in the periphery monitoring device according to the embodiment.
FIG. 9 is a diagram for explaining an example of a method for calculating the predicted trajectory of the rear wheel by the periphery monitoring device according to the embodiment.
FIG. 10 is a flowchart illustrating a procedure of image storage processing in the periphery monitoring device according to the embodiment.
FIG. 11 is a flowchart illustrating a procedure of processing for controlling display in the periphery monitoring device according to the embodiment.
 Hereinafter, an example in which the periphery monitoring device of the present embodiment is mounted on the vehicle 1 will be described.
 FIG. 1 is a perspective view showing an example of a state in which part of the passenger compartment 2a of a vehicle 1 equipped with the periphery monitoring device according to the embodiment is seen through. FIG. 2 is a plan view (bird's-eye view) illustrating an example of the vehicle 1 on which the periphery monitoring device according to the embodiment is mounted. FIG. 3 is a block diagram illustrating an example of the configuration of the periphery monitoring system 100 including the periphery monitoring device according to the embodiment.
 The vehicle 1 may be, for example, an automobile whose drive source is an internal combustion engine (not shown), that is, an internal combustion engine automobile; an automobile whose drive source is an electric motor (not shown), that is, an electric vehicle, a fuel cell vehicle, or the like; or an automobile that uses both of these as drive sources. The vehicle 1 can be equipped with any of various transmissions, as well as the various devices, systems, and components necessary for driving the internal combustion engine or the electric motor. The type, number, layout, and the like of the devices involved in driving the wheels 3 of the vehicle 1 can be set in various ways.
 As shown in FIG. 1, the vehicle body 2 according to the first embodiment forms a passenger compartment 2a in which occupants (not shown) ride. In the passenger compartment 2a, a steering unit 4, an acceleration operation unit 5, a braking operation unit 6, and a shift operation unit 7 are provided so as to face the seat 2b of the driver as an occupant. In the present embodiment, as an example, the steering unit 4 is a steering wheel protruding from the dashboard (instrument panel), the acceleration operation unit 5 is an accelerator pedal located under the driver's feet, the braking operation unit 6 is a brake pedal located under the driver's feet, and the shift operation unit 7 is a shift lever protruding from the center console. The steering unit 4, the acceleration operation unit 5, the braking operation unit 6, and the shift operation unit 7 are, however, not limited to these.
 A monitor device 11 is provided in the passenger compartment 2a. The monitor device 11 has a display screen 8 and an audio output device 9. The display screen 8 is constituted by a display device such as an LCD (liquid crystal display) or an OELD (organic electroluminescent display). The audio output device 9 is, for example, a speaker. In the present embodiment, as an example, the display screen 8 is covered with a transparent operation input unit 10, for example a touch panel. An occupant or the like can view the image displayed on the display screen 8 through the operation input unit 10, and can perform operation inputs by touching, pressing, or moving the operation input unit 10 with a finger or the like at positions corresponding to the displayed image. In the present embodiment, as an example, the monitor device 11 is provided at the center of the dashboard in the vehicle width direction (left-right direction). The monitor device 11 may also include operation input units other than the touch panel; for example, it may be provided with switches, dials, joysticks, or push buttons. The monitor device 11 may double as, for example, a navigation system or an audio system, or may be provided separately from these systems.
 As shown in FIGS. 1, 2, and 3, in the present embodiment, as an example, the vehicle 1 is a four-wheeled vehicle having two left and right front wheels 3F and two left and right rear wheels 3R. The tire angle of the front wheels 3F, for example, changes in response to operation of the steering unit 4. The steering system 13 is, for example, an electric power steering system or an SBW (steer by wire) system. The steering system 13 supplements the steering force by applying assist torque to the steering unit 4 with an actuator 13a, and steers the wheels 3. The steering system 13 detects the torque that the driver applies to the steering unit 4 with a torque sensor 13b. The steering system 13 may also be a rear wheel steering device (ARS: Active Rear Steering), which steers the rear wheels 3R. Specifically, when a rear wheel steering device is employed, the rear wheels 3R are steered in the same phase as, or in the opposite phase to, the steering angle of the front wheels 3F depending on the driving state of the vehicle 1 (for example, the vehicle speed or turning state).
 In another example, the steering system 13 may be configured so that the front wheels 3F and the rear wheels 3R can be steered independently or in association with each other. For example, the two front wheels 3F are steered substantially in parallel with each other in the same phase (the same phase, the same steering direction, and the same turning direction), and the two rear wheels 3R are likewise steered substantially in parallel in the same phase. The drive wheels can be set in various ways.
 In the present embodiment, as an example, as shown in FIG. 2, the vehicle 1 is provided with a plurality of imaging units 16; here, four imaging units 16a to 16d are provided. Each imaging unit 16 is an imaging device incorporating an image sensor such as a CCD (charge coupled device) or a CIS (CMOS image sensor), and can output images at a predetermined frame rate. Each imaging unit 16 has a wide-angle lens or a fish-eye lens and can capture a range of, for example, 140° to 190° in the horizontal direction. The optical axis of each imaging unit 16 is directed downward (for example, vertically or obliquely downward).
 In the present embodiment, the imaging unit 16a is provided at the front end 2c of the vehicle body 2 (for example, on the front grille) and can image the surrounding environment ahead of the vehicle 1. Here, the direction that a driver seated in the seat 2b faces, that is, the windshield side as seen from the driver, is taken as the front direction of the vehicle 1 and the front side of the vehicle body 2. The imaging unit 16b is provided at the left end 2d of the vehicle body 2, more specifically on the left door mirror 2g, and can image the surrounding environment to the left of the vehicle 1. The imaging unit 16c is provided at the rear end 2e of the vehicle body 2 (for example, on the wall portion below the rear trunk door 2h) and can image the surrounding environment behind the vehicle 1. The imaging unit 16d is provided at the right end 2f of the vehicle body 2, more specifically on the right door mirror 2g, and can image the surrounding environment to the right of the vehicle 1.
 The surrounding environment refers to the situation around the vehicle 1, including the road surface around the vehicle 1. As long as at least one imaging unit 16 can output an image showing the road surface in the traveling direction of the vehicle 1, the configuration of each imaging unit 16, the number of imaging units 16, the installation location of each imaging unit 16, and the orientation of each imaging unit 16 are not limited to those described above.
 The ECU (Electronic Control Unit) 24 is an example of the periphery monitoring device of the embodiment. The ECU 24 performs arithmetic processing and image processing based on the images obtained by the plurality of imaging units 16, and can display the processed images on the display screen 8.
 The periphery monitoring device can control the display content of the display screen 8 in at least an underfloor display mode, which is a display mode that shows an image of at least the road surface currently under the floor of the vehicle 1. Here, as an example, the display modes of the periphery monitoring device include the underfloor display mode and a normal display mode.
 First, an example of the normal display mode will be described. FIG. 4 shows a display example in the normal display mode. The display screen 8 includes display areas 81, 82, 83, and 84. In the normal display mode, display area 81 shows an image, captured by the imaging unit 16a, of the current surrounding environment ahead of the vehicle 1. Display area 82 shows an image, captured by the imaging unit 16b, of the current surrounding environment to the left of the vehicle 1, and display area 83 shows an image, captured by the imaging unit 16d, of the current surrounding environment to the right of the vehicle 1. Display area 84 shows a display object representing the rear shape of the vehicle 1, which indicates the inclination of the vehicle 1 and the like. A switching button 85 is displayed to the right of display area 84. In addition, the predicted trajectory 101 of the front wheels 3F is displayed superimposed in display area 81.
 Next, the underfloor display mode will be described. FIG. 5 shows a display example in the underfloor display mode. In the underfloor display mode, display area 81 displays at least an image showing the road surface currently under the floor of the vehicle 1. In the example shown in FIG. 5, the road surface around the vehicle 1, including the road surface under its floor, is displayed in display area 81. Hereinafter, an image showing at least the road surface currently under the floor is referred to as an underfloor image.
 Furthermore, display area 81 shows a contour line 102, a display object indicating the outline of the vehicle 1, together with a contour line 103, a display object indicating the outline of the front wheels 3F of the vehicle 1, and a contour line 104, a display object indicating the outline of the rear wheels 3R of the vehicle 1. The contour line 102 of the vehicle 1 is an example of identification information from which the current position of the vehicle 1 can be identified; likewise, the contour line 103 of the front wheels 3F and the contour line 104 of the rear wheels 3R are examples of identification information from which the current positions of the front wheels 3F and the rear wheels 3R, respectively, can be identified. The contour lines 102 to 104 are superimposed on the underfloor image. The region enclosed by the contour line 102 indicates the road surface currently under the floor.
 In the present embodiment, in the underfloor display mode, display area 81 further shows a passage trajectory 105, a display object indicating the trajectory along which the front wheels 3F have passed up to the present. On off-road terrain such as rocky ground, mud, or snow, there may be points that, from a safety standpoint, are unsuitable for the wheels 3 to pass over. When traveling where such points may exist, a point that the front wheels 3F have already passed over can be considered relatively safe. By presenting the driver with at least the passage trajectory 105 and the contour line 104 of the rear wheels 3R, together with the image showing the road surface currently under the floor, the periphery monitoring device of the embodiment enables the driver to steer so that the rear wheels 3R pass along the trajectory that the front wheels 3F have already passed over. Because this enables safer driving, the periphery monitoring device is more convenient than one that does not present the passage trajectory 105 or the contour line 104 of the rear wheels 3R.
 In the underfloor display mode, display area 81 further shows the predicted trajectory 106 of the rear wheels 3R. The predicted trajectory 106 of the rear wheels 3R changes according to the tire angle, and since the tire angle is linked to the steering amount of the steering unit 4, the predicted trajectory 106 of the rear wheels 3R is linked to the steering amount of the steering unit 4. By further presenting the predicted trajectory 106 of the rear wheels 3R to the driver, the periphery monitoring device of the embodiment lets the driver see on the display screen 8, together with the trajectory that the front wheels 3F have passed up to the present, the trajectory along which the rear wheels 3R will travel at the current steering amount. The driver can therefore more easily drive so that the rear wheels 3R pass along the trajectory that the front wheels 3F have passed.
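 A steering-linked predicted trajectory of this kind can be approximated with a kinematic bicycle model. The following is a minimal illustrative sketch, not taken from the patent: the function name, the wheelbase value, and the step sizes are assumptions, and FIG. 9 may describe a different calculation.

```python
import math

def predict_rear_wheel_track(steering_angle_deg, wheelbase=2.7,
                             steps=20, step_len=0.1):
    """Predict rear-axle path points (x, y) in vehicle coordinates for a
    fixed steering angle, using a simple kinematic bicycle model.
    x is forward travel, y is lateral offset (metres)."""
    delta = math.radians(steering_angle_deg)
    x = y = heading = 0.0
    points = []
    for _ in range(steps):
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
        # yaw change per travelled step for a bicycle model
        heading += step_len * math.tan(delta) / wheelbase
        points.append((x, y))
    return points
```

With a steering angle of zero the predicted track is a straight line ahead; a positive angle bends the track toward positive y, which is how the on-screen trajectory 106 would follow the steering amount.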
 Here, as an example, the predicted trajectory 101 of the front wheels 3F is also displayed in display area 81 in the underfloor display mode; however, the predicted trajectory 101 of the front wheels 3F need not necessarily be displayed in the underfloor display mode.
 Here, the passage trajectory 105 of the front wheels 3F is displayed as a dotted line, but it may be displayed in any other manner, for example as a solid line or a broken line. In another example, the passage trajectory 105 of the front wheels 3F may be displayed as a frame line having the same width as each wheel 3.
 Similarly, the predicted trajectories 101 and 106 may be displayed in any manner. Here, as an example, each of the predicted trajectories 101 and 106 is displayed as a frame line having the same width as each wheel 3 and a predetermined length.
 The condition for switching between the normal display mode and the underfloor display mode is not limited to any specific condition. Here, as an example, switching between the normal display mode and the underfloor display mode is performed by touching the switching button 85. The normal display mode may also be omitted, and there may be other display modes different from the normal display mode.
 In the underfloor display mode, arbitrary information can be displayed in display areas 82 to 84 of the display screen 8; images different from those of the normal display mode may be displayed there. In the underfloor display mode, the underfloor image may also be displayed full-screen on the display screen 8 together with the display objects (the contour lines 102 to 104, the passage trajectory 105 of the front wheels 3F, and the predicted trajectories 101 and 106). In the example of FIG. 5, display areas 82 to 84 show the same content in the underfloor display mode as in the normal display mode.
 The periphery monitoring device updates the display content at a predetermined rate in both the underfloor display mode and the normal display mode. Therefore, in either display mode, the occupant can view the current surrounding environment of the vehicle 1 on the display screen 8 as a moving image. The update rate of the display content may be the same as or different from the rate at which the imaging units 16 output images.
 In the present embodiment, as an example, an image showing the road surface currently under the floor is selected from a plurality of images captured in the past, and the selected image is displayed as the underfloor image.
 FIG. 6 is a diagram for explaining the concept of the method of acquiring an underfloor image according to the present embodiment. When the vehicle 1 travels straight, for example, it later passes over the road surface shown in an image captured by the imaging unit 16a at an earlier timing. Therefore, by storing images captured by the imaging unit 16a slightly earlier, the periphery monitoring device can use such an image as the image showing the road surface the vehicle 1 is currently passing over, that is, as the underfloor image. In the example of FIG. 6, at time t1 the road surface 501 that the vehicle 1 is passing over, that is, the road surface under the floor of the vehicle 1, is not included in the imaging region 502 of the imaging unit 16a, but was included in the imaging region 503 of the imaging unit 16a at time t0. Time t0 is a time before time t1, when the vehicle 1 was located on the side opposite to the traveling direction relative to its position at time t1. The periphery monitoring device uses the image captured by the imaging unit 16a at time t0 as the underfloor image at time t1.
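 The selection described above can be sketched in terms of odometry: pick the most recent stored frame whose imaged road patch the vehicle has since driven onto. This is an illustrative Python sketch, not the embodiment's actual processing; the bookkeeping by odometer reading, the function name, and the 2.0 m camera-to-floor offset are assumptions.

```python
def select_underfloor_image(frames, traveled, cam_to_floor=2.0):
    """frames: list of (odometer_reading_at_capture, image), oldest first.
    traveled: current odometer reading (metres).
    cam_to_floor: assumed distance from the front camera's imaged road
    patch back to the region now under the floor (illustrative).
    Returns the newest frame whose imaged patch is now under the floor,
    or None if the vehicle has not yet travelled far enough."""
    best = None
    for captured_at, image in frames:
        if traveled - captured_at >= cam_to_floor:
            best = image  # keep the most recent qualifying frame
        else:
            break  # frames are ordered, so later ones qualify even less
    return best
```

For example, with frames captured at odometer readings 0 m, 1 m, and 2 m and the vehicle now at 3 m, the 1 m frame is the newest one whose imaged patch has moved under the floor.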
 The method of acquiring the underfloor image is not limited to the above. For example, an imaging unit that images the road surface currently under the floor may be arranged under the floor of the vehicle 1, and the periphery monitoring device may use the image output by that imaging unit as the underfloor image.
 Returning to FIG. 3, the vehicle 1 includes the steering system 13, a brake system 18, a steering angle sensor 19, an accelerator sensor 20, a shift sensor 21, a wheel speed sensor 22, an in-vehicle network 23, the ECU 24, and two acceleration sensors 26 (26a, 26b). The ECU 24 is an example of the periphery monitoring device. The monitor device 11, the steering system 13, the imaging units 16, the brake system 18, the steering angle sensor 19, the accelerator sensor 20, the shift sensor 21, the wheel speed sensor 22, the ECU 24, and the acceleration sensors 26 (26a, 26b) constitute the periphery monitoring system 100.
 The monitor device 11, the steering system 13, the brake system 18, the steering angle sensor 19, the accelerator sensor 20, the shift sensor 21, the wheel speed sensor 22, the ECU 24, and the acceleration sensors 26 are electrically connected via the in-vehicle network 23, an electric communication line. The in-vehicle network 23 is configured as, for example, a CAN (Controller Area Network).
 The acceleration sensor 26a is a sensor that detects acceleration in the front-rear direction, and the acceleration sensor 26b is a sensor that detects acceleration in the left-right direction.
 The brake system 18 includes an actuator 18a and a brake sensor 18b, and applies a braking force to the wheels 3 via the actuator 18a. The brake sensor 18b transmits, for example, the position of the brake pedal, the movable part of the braking operation unit 6, to the ECU 24.
 The steering angle sensor 19 is, for example, a sensor that detects the steering amount of the steering unit 4. The steering angle sensor 19 transmits steering angle information, such as the steering angle of the steering unit 4 set by the driver and the steering angle during automatic steering, to the ECU 24.
 The accelerator sensor 20 is, for example, a displacement sensor that detects the position of the accelerator pedal, the movable part of the acceleration operation unit 5, and transmits the position of the acceleration operation unit 5 to the ECU 24.
 The shift sensor 21 is, for example, a sensor that detects the position of the movable part of the shift operation unit 7, and transmits that position to the ECU 24 as shift information.
 The wheel speed sensor 22 is a sensor that detects the amount of rotation of the wheels 3 or their number of rotations per unit time, and transmits a wheel speed pulse count indicating the detected rotation to the ECU 24 as wheel speed information.
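 Pulse counts of this kind are what odometry-based processing (such as deciding when a stored frame lies under the floor) would be derived from. A minimal illustrative conversion, under assumed values; the pulses-per-revolution and tire circumference below are not from the patent.

```python
def distance_from_pulses(pulse_count, pulses_per_rev=48,
                         tire_circumference=1.9):
    """Convert a wheel-speed pulse count to travelled distance in metres.
    pulses_per_rev and tire_circumference (m) are illustrative values
    that depend on the actual sensor and tire."""
    return pulse_count / pulses_per_rev * tire_circumference
```

One full wheel revolution (48 pulses here) then corresponds to one tire circumference of travel.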
 Via the in-vehicle network 23, the ECU 24 can receive the detection results of sensors such as the torque sensor 13b, the brake sensor 18b, the steering angle sensor 19, the accelerator sensor 20, the shift sensor 21, the wheel speed sensor 22, and the acceleration sensors 26, as well as instruction signals (control signals, operation signals, input signals, data) from the operation input unit 10 and the like.
 The ECU 24 also receives images from the imaging units 16, and may receive them via the in-vehicle network 23. The ECU 24 is, for example, a computer, and includes a CPU (Central Processing Unit) 24a, a ROM (Read Only Memory) 24b, a RAM (Random Access Memory) 24c, a display control unit 24d, an audio control unit 24e, and an SSD (Solid State Drive) 24f. The CPU 24a, the ROM 24b, and the RAM 24c may be integrated in the same package.
 The CPU 24a reads a program stored in a nonvolatile storage device such as the ROM 24b and executes various kinds of arithmetic processing and control according to the program; for example, the CPU 24a executes image processing related to the images displayed on the display screen 8.
 The ROM 24b stores programs and the parameters necessary for executing them. The RAM 24c temporarily stores various kinds of data used in computations by the CPU 24a. Of the arithmetic processing in the ECU 24, the display control unit 24d mainly handles processing of the captured images acquired from the imaging units 16 and output to the CPU 24a, and data conversion of the display images acquired from the CPU 24a and shown on the display screen 8. The audio control unit 24e mainly handles processing of the audio acquired from the CPU 24a and output to the audio output device 9. The SSD 24f is a rewritable nonvolatile storage unit that retains data acquired from the CPU 24a even when the ECU 24 is powered off.
 FIG. 7 is a block diagram showing the functional configuration of the ECU 24 as the periphery monitoring device according to the present embodiment. By executing a program stored in advance in the ROM 24b, the CPU 24a realizes an acquisition unit 401, an angle calculation unit 402, a storage processing unit 403, a reception unit 405, and a display processing unit 406. The program for realizing these functional components may be provided via any computer-readable recording medium other than the ROM 24b. The ECU 24 also implements a ring buffer 404 on the RAM 24c. Some or all of the acquisition unit 401, the angle calculation unit 402, the storage processing unit 403, the reception unit 405, and the display processing unit 406 may be realized by hardware circuits, or by a combination of hardware circuits and software (programs).
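 A ring buffer of the kind held on the RAM 24c keeps a fixed number of recent frames, overwriting the oldest entry when full. The following sketch illustrates that behavior only; the class name, capacity, and methods are assumptions, and the actual structure of the ring buffer 404 is the subject of FIG. 8.

```python
from collections import deque

class ImageRingBuffer:
    """Fixed-capacity frame store: pushing onto a full buffer
    silently drops the oldest frame, like a ring buffer in RAM."""

    def __init__(self, capacity=16):
        self._buf = deque(maxlen=capacity)  # deque enforces the ring limit

    def push(self, frame):
        self._buf.append(frame)

    def newest(self):
        return self._buf[-1]

    def oldest(self):
        return self._buf[0]

    def __len__(self):
        return len(self._buf)
```

Pushing 20 frames into a 16-slot buffer leaves frames 4 through 19; the overwritten early frames are exactly the ones whose imaged road patches the vehicle has long since passed.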
 The acquisition unit 401 acquires various kinds of information from the sensors and other components of the vehicle 1. The acquisition unit 401 according to the present embodiment acquires the images output from the imaging units 16a to 16d, which are provided on the vehicle 1 and image its surroundings, and the acceleration data output from the acceleration sensors 26a and 26b provided on the vehicle 1. The acquisition unit 401 further acquires steering angle information from the steering angle sensor 19 and wheel speed information from the wheel speed sensor 22.
 The acquisition unit 401 also associates with one another the images, acceleration data, steering angle information, and wheel speed information whose acquisition times substantially coincide.
 The angle calculation unit 402 calculates the tilt angles (pitch angle and roll angle) of the vehicle 1 based on the acceleration data acquired from the acceleration sensors 26a and 26b. Here, the pitch angle is the angle of inclination about the lateral (left-right) axis of the vehicle 1, and the roll angle is the angle of inclination about the longitudinal (front-rear) axis of the vehicle 1.
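As a rough illustration of how such tilt angles can be derived from accelerometer readings, the sketch below estimates roll and pitch from the direction of gravity. The axis convention (x forward, y left, z up) and the use of a single quasi-static reading are assumptions made for this example, not details taken from the embodiment.

```python
import math

def tilt_from_acceleration(ax, ay, az):
    """Estimate roll and pitch (radians) from a 3-axis accelerometer
    reading taken while the vehicle is stationary or moving smoothly,
    using the gravity vector as the vertical reference.

    Assumed axes for this sketch: x = forward, y = left, z = up;
    at rest on level ground the sensor reads (0, 0, g)."""
    # Roll: inclination about the longitudinal (front-rear) axis.
    roll = math.atan2(ay, az)
    # Pitch: inclination about the lateral (left-right) axis.
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch
```

On level ground the function returns (0, 0); a vehicle lying fully on its side (gravity along y) would report a roll of 90 degrees.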
 The angle calculation unit 402 also associates the roll angle and pitch angle calculated from the acceleration data with the image associated with that acceleration data. This makes it possible to recognize the roll angle and pitch angle of the vehicle 1 at the time the image was captured.
 The storage processing unit 403 includes a correction unit 411, an estimation unit 412, and a storage unit 413, and generates and stores images to be displayed on the display screen 8.
 The correction unit 411 performs rotation correction on the image captured by the imaging unit 16a, which shows the surrounding environment ahead of the vehicle 1.
 The correction unit 411 according to the present embodiment performs rotation correction according to the roll angle associated with the image, using as the origin the position coordinate in the image that corresponds to the center of the lens used by the imaging unit 16a for imaging.
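The rotation correction can be pictured as rotating image coordinates about the lens-center point by the negative of the roll angle, so that the horizon in the corrected image becomes level. The following is a coordinate-level sketch only; the patent does not specify an implementation, and a real system would resample whole images rather than individual points.

```python
import math

def rotate_pixel(x, y, cx, cy, roll_rad):
    """Map a source pixel (x, y) through a rotation of -roll about the
    image point (cx, cy) corresponding to the lens center.  A per-point
    sketch of the geometry behind the roll-angle rotation correction."""
    c, s = math.cos(-roll_rad), math.sin(-roll_rad)
    dx, dy = x - cx, y - cy
    # Standard 2-D rotation about (cx, cy).
    return cx + c * dx - s * dy, cy + s * dx + c * dy
```

With a roll of zero the mapping is the identity; a nonzero roll rotates every pixel coordinate about the lens center.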
 The estimation unit 412 estimates the current position of the vehicle 1. The position of the vehicle 1 is the position of a predetermined part of the vehicle 1 (for example, the center of the vehicle 1).
 The method of estimating the position of the vehicle 1 is not limited to a specific method. As one example, the estimation unit 412 calculates the amount the vehicle 1 has moved from the position where the previous image was captured, based on the acceleration data, steering angle information, and wheel speed information acquired by the acquisition unit 401. The estimation unit 412 then estimates the position of the vehicle 1 from that movement amount.
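One common way to turn wheel speed and steering angle into such a movement estimate is a kinematic bicycle model. The sketch below is a hypothetical illustration of that kind of dead reckoning; the wheelbase value and the constant-steering-angle assumption are placeholders, not details from the embodiment.

```python
import math

def dead_reckon(x, y, heading, distance, steer_rad, wheelbase=2.7):
    """Advance a pose (x, y, heading) by `distance` metres travelled
    with a constant front-tire angle `steer_rad`, using a kinematic
    bicycle model.  Wheel speed integrated over time gives `distance`;
    the steering angle sensor gives `steer_rad`.  The 2.7 m wheelbase
    is an illustrative value."""
    if abs(steer_rad) < 1e-9:                    # straight-line motion
        return (x + distance * math.cos(heading),
                y + distance * math.sin(heading),
                heading)
    radius = wheelbase / math.tan(steer_rad)     # turning radius
    dtheta = distance / radius                   # heading change
    # Motion along a circular arc about the turning center.
    nx = x + radius * (math.sin(heading + dtheta) - math.sin(heading))
    ny = y - radius * (math.cos(heading + dtheta) - math.cos(heading))
    return nx, ny, heading + dtheta
```

Calling this once per control cycle with the distance travelled during that cycle accumulates the vehicle's estimated position and orientation.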
 In another example, the estimation unit 412 computes the optical flow between an image captured in the past and the currently captured image, calculates the amount of movement of the vehicle 1 based on the computed optical flow, and estimates the position of the vehicle 1. When traveling off-road, the wheels 3 may spin without traction due to a rough road surface or the like. In such a case, estimating the amount of movement of the vehicle 1 from the rotation speed of the wheels 3 is likely to introduce error. Estimating the movement amount and position of the vehicle 1 from the optical flow in such a situation can improve the estimation accuracy.
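To illustrate the idea, suppose ground-plane feature displacements between two frames have already been obtained (for example, from optical flow followed by a pixel-to-metre projection). The vehicle's translation can then be approximated as the negated mean displacement, since the ground appears to move opposite to the vehicle. This toy sketch omits feature tracking, projection, and outlier rejection entirely.

```python
def motion_from_ground_flow(flow_vectors):
    """Estimate the vehicle's translation between two frames from the
    displacements of tracked ground-plane points, already converted
    from pixels to metres.  The ground appears to move opposite to the
    vehicle, so the estimate is the negated mean displacement."""
    n = len(flow_vectors)
    mean_dx = sum(v[0] for v in flow_vectors) / n
    mean_dy = sum(v[1] for v in flow_vectors) / n
    return -mean_dx, -mean_dy
```

Because this estimate depends only on what the camera sees, it is unaffected by wheel spin, which is why it helps off-road.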
 In yet another example, the vehicle 1 may include a GPS receiver (not shown), and the estimation unit 412 may estimate the position of the vehicle 1 using signals from the GPS.
 The estimation unit 412 also estimates the current orientation of the vehicle 1. The method of estimating the orientation of the vehicle 1 is not limited to a specific method; the orientation can be estimated by methods similar to those used for the position of the vehicle 1.
 The storage unit 413 stores the image corrected by the correction unit 411 in the ring buffer 404 together with the position information and orientation information of the vehicle 1.
 The present embodiment does not restrict the interval at which images are stored in the ring buffer 404. However, the storage interval is set to a value that is not too long so that, at any timing, an image showing the road surface currently being passed over can be obtained from the images stored in the past.
 In the embodiment, as one example, the position information and orientation information stored in the ring buffer 404 are used as the travel history for calculating the passing trajectory 105 of the front wheels 3F. That is, the ring buffer 404 corresponds to a storage unit that stores the travel history of the vehicle 1. The periphery monitoring device obtains discrete data representing the passing trajectory 105 of the front wheels 3F by calculating the position of the front wheels 3F for each piece of position information. The passing trajectory 105 of the front wheels 3F should be reasonably accurate, and for that the granularity of the stored position information and orientation information should be reasonably fine. The storage interval is therefore set to a value that is not too long; for example, an image, position information, and orientation information may be stored every time the vehicle travels a distance of 0.1 m. Note that the travel history may be stored separately from the position information and orientation information associated with the images, in which case the image storage interval is not limited to the above.
 FIG. 8 is a diagram showing the structure of the ring buffer 404. As shown in FIG. 8, the ring buffer 404 stores each image in association with the position information and orientation information of the vehicle 1 at the time the image was captured. As shown in FIG. 8, the ring buffer 404 is a storage area logically arranged in a ring. In response to a storage request from the storage unit 413, the ring buffer 404 overwrites the least recently updated area with the image and other data requested to be stored.
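A minimal sketch of such an overwrite-oldest ring buffer might look as follows; the record contents and the capacity are placeholders, not values from the embodiment.

```python
class RingBuffer:
    """Fixed-capacity buffer that overwrites the oldest entry once
    full, mirroring how the ring buffer 404 stores (image, position,
    orientation) records."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.slots = [None] * capacity
        self.next = 0      # index of the slot to overwrite next
        self.count = 0     # number of valid records stored so far

    def save(self, record):
        """Store a record, overwriting the least recently updated slot."""
        self.slots[self.next] = record
        self.next = (self.next + 1) % self.capacity
        self.count = min(self.count + 1, self.capacity)

    def newest_first(self):
        """Return the stored records, most recently saved first."""
        return [self.slots[(self.next - i) % self.capacity]
                for i in range(1, self.count + 1)]
```

Once the buffer is full, each new save silently discards the oldest record, which matches the overwrite behaviour described for the ring buffer 404.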
 The reception unit 405 receives instruction signals (control signals) from the operation input unit 10 and the like. From these instruction signals, the reception unit 405 according to the present embodiment receives operations for switching the display mode.
 The display processing unit 406 performs display processing on the display screen 8 in the display mode corresponding to the operation received by the reception unit 405. The display processing unit 406 includes an image acquisition unit 421, a passing trajectory calculation unit 422, a predicted trajectory calculation unit 423, and an output unit 424.
 The image acquisition unit 421 acquires an image showing the road surface under the floor of the vehicle 1, that is, an underfloor image. Here, the image acquisition unit 421 reads from the ring buffer 404 an image showing the road surface over which the vehicle 1 is currently passing. In other words, the image acquisition unit 421 takes an image stored in the ring buffer 404 that was captured at a first timing and acquires it as the underfloor image at a second timing, at which the vehicle 1 passes over the road surface shown in that image. The second timing is later than the first timing.
 The image acquisition unit 421 selects, from the plurality of images stored in the ring buffer 404, an image showing the road surface over which the vehicle 1 is currently passing. Which image shows that road surface can be determined, for example, based on the position information associated with each image.
 In one example, an image is selected as follows. The direction of the optical axis and the angle of view of each imaging unit 16 are known or can be acquired. The image acquisition unit 421 places an object representing the vehicle 1 on a virtual plan view and further maps the imaging range of each stored image onto that plan view. The image acquisition unit 421 can identify where to map the imaging range of each image based on the position information and orientation information associated with the image and on the direction of the optical axis and the angle of view of the imaging unit 16. The image acquisition unit 421 then selects an image whose imaging range contains the object representing the vehicle 1 as an image showing the road surface over which the vehicle 1 is currently passing.
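The containment test described above (is the vehicle's current position inside an image's mapped imaging range?) can be sketched as a point-in-convex-polygon check, assuming for illustration that each mapped imaging range is represented as a convex polygon on the ground plane.

```python
def point_in_convex_polygon(point, polygon):
    """Return True if `point` lies inside the convex `polygon`
    (a list of (x, y) vertices given in consistent winding order).
    Stand-in for the test 'is the vehicle's position inside an
    image's mapped imaging range?'."""
    px, py = point
    sign = None
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Cross product tells which side of edge i the point is on.
        cross = (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1)
        if cross != 0:
            if sign is None:
                sign = cross > 0
            elif (cross > 0) != sign:
                return False   # point is on the outside of this edge
    return True
```

Running this test against every stored imaging range yields the candidate images that show the road surface currently under the vehicle.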
 Note that the image acquisition unit 421 may select two or more of the images stored in the ring buffer 404 and combine the selected images. Combining means seamlessly joining a plurality of images.
 The two or more images to be combined may have been captured at different timings. For example, the image acquisition unit 421 may generate the underfloor image at the current time t4 by combining part or all of an image captured by the imaging unit 16a at time t2 with part or all of an image captured by the imaging unit 16a at time t3, where t3 is later than t2 and t4 is later than t3.
 The two or more images to be combined may also have been captured by different imaging units 16. For example, the image acquisition unit 421 may generate the underfloor image at the current time t6, which is later than time t5, by combining part or all of an image captured by the imaging unit 16a at time t5 with part or all of an image captured by the imaging unit 16b at time t5.
 The image acquisition unit 421 may also apply processing other than combining to the images before or after combination. Such processing includes, for example, cropping, masking, filtering part or all of an image, correction, or viewpoint conversion. The correction is, for example, distortion correction or gamma correction. Viewpoint conversion is, for example, generating a bird's-eye view image from an image captured by the imaging unit 16. Even when only one image is selected as the underfloor image, the image acquisition unit 421 may process the selected image. The processing can be executed at any timing between when the image is acquired and when it is output to the display screen 8.
 The passing trajectory calculation unit 422 calculates the passing trajectory 105 of the front wheels 3F based on the travel history accumulated in the ring buffer 404. The passing trajectory calculation unit 422 calculates a passing trajectory 105 for each of the two front wheels 3F.
 The method of calculating the passing trajectory 105 of the front wheels 3F is not limited to a specific method. In one example, the passing trajectory calculation unit 422 reads from the ring buffer 404 the pieces of position information and orientation information stored during the most recent predetermined period. The passing trajectory calculation unit 422 then calculates, for each piece of position information read out, the position of the front wheel 3F corresponding to that piece of position information. The positions at which the front wheels 3F are mounted on the vehicle 1 are known or can be acquired. The position of a front wheel 3F is uniquely determined by the combination of the position information and orientation information of the vehicle 1 and the mounting position of the front wheel 3F on the vehicle 1. By calculating the positions of the front wheel 3F in order from the piece of position information stored earliest, the passing trajectory calculation unit 422 can obtain the positions of the front wheel 3F in chronological order. The passing trajectory calculation unit 422 may instead calculate the positions of the front wheel 3F in reverse chronological order. A sequence of front-wheel positions arranged in chronological or reverse chronological order corresponds to discrete data representing the passing trajectory 105 of the front wheel 3F.
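The step "vehicle position + vehicle orientation + wheel mounting position → wheel position" is a rigid-body transform. A sketch with illustrative mounting offsets (the actual offsets are vehicle-specific and not given in the patent):

```python
import math

def wheel_world_position(vx, vy, heading, offset_fwd, offset_left):
    """World coordinates of a wheel given the vehicle's reference
    position (vx, vy), its heading (radians, 0 = +x direction), and
    the wheel's mounting offset in the vehicle frame (metres forward
    of / to the left of the reference point)."""
    c, s = math.cos(heading), math.sin(heading)
    # Rotate the body-frame offset into the world frame and translate.
    return (vx + c * offset_fwd - s * offset_left,
            vy + s * offset_fwd + c * offset_left)
```

Applying this to every stored (position, orientation) record, oldest first, yields the discrete front-wheel trajectory described above.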
 The predicted trajectory calculation unit 423 calculates the predicted trajectories 101 and 106 of the wheels 3 based on the steering information acquired by the acquisition unit 401. In particular, in the present embodiment, the predicted trajectory calculation unit 423 calculates the predicted trajectory 106 of the rear wheels 3R based on the steering information. The method of calculating the predicted trajectory 106 of the rear wheels 3R is not limited to a specific method.
 FIG. 9 is a diagram for explaining an example of a method of calculating the predicted trajectory 106 of the rear wheels 3R. When the driver changes the tire angle of the front wheels 3F by operating the steering unit 4, the turning center G1 of the vehicle 1 lies at the intersection of the direction orthogonal to the direction 3Fa of the front wheels 3F and the extension of the rear wheel axle 38 that supports the rear wheels 3R. Accordingly, when the vehicle 1 turns according to the tire angle of the front wheels 3F, the center point 38a of the rear wheel axle 38 moves along the arc 38b centered on the turning center G1, and each rear wheel 3R likewise moves along an arc centered on the turning center G1. Thus, when the vehicle turns under normal travel, for example when no slip occurs, the trajectory followed by the rear wheels 3R is uniquely determined by the tire angle of the front wheels 3F. The predicted trajectory calculation unit 423 calculates, as the predicted trajectory 106 of the rear wheel 3R, the segment of the arc drawn by the rear wheel 3R about the turning center G1 that the rear wheel 3R will follow from the present until a predetermined short time later.
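Under this no-slip kinematic model, the turning radius of the rear-axle center point 38a follows from the wheelbase L and the front tire angle δ as R = L / tan δ. The sketch below computes that radius and samples points along the short predicted arc; the coordinate-frame convention and the sampling scheme are assumptions made for illustration.

```python
import math

def rear_axle_turn_radius(wheelbase, tire_angle_rad):
    """Radius of the arc 38b traced by the rear-axle center 38a in a
    kinematic (no-slip) model: the turning center G1 sits on the
    rear-axle line at distance L / tan(delta) from the axle center."""
    return wheelbase / math.tan(tire_angle_rad)

def predicted_arc(radius, arc_length, steps=5):
    """Sample points along the predicted arc, in a frame whose origin
    is the rear-axle center, x pointing forward, and the turning
    center at (0, radius).  A sketch of the short line segment the
    patent calls the predicted trajectory 106."""
    pts = []
    for i in range(1, steps + 1):
        theta = (arc_length / radius) * i / steps
        pts.append((radius * math.sin(theta),
                    radius * (1.0 - math.cos(theta))))
    return pts
```

A larger tire angle gives a smaller radius, i.e. a tighter predicted arc, which matches the geometry of FIG. 9.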
 The predicted trajectory calculation unit 423 calculates the tire angle of the front wheels 3F from the steering information and calculates the predicted trajectory 106 of the rear wheels 3R based on the calculated tire angle and the relationship shown in FIG. 9. Alternatively, the acquisition unit 401 may acquire the tire angle of the front wheels 3F as the steering information, and the predicted trajectory calculation unit 423 may use the acquired tire angle of the front wheels 3F to calculate the predicted trajectory 106 of the rear wheels 3R.
 When the rear wheels 3R are steerable, for example because active rear steering (ARS) is employed, the predicted trajectory calculation unit 423 may calculate the predicted trajectory 106 of the rear wheels 3R taking the tire angle of the rear wheels 3R into account. For example, the predicted trajectory 106 of the rear wheels 3R can be obtained by calculating the arc along which the rear wheels 3R move when the vehicle turns under normal travel.
 The predicted trajectory calculation unit 423 calculates the predicted trajectory 101 of the front wheels 3F by the same procedure as for the predicted trajectory 106 of the rear wheels 3R. For example, the predicted trajectory calculation unit 423 calculates, as the predicted trajectory 101 of the front wheel 3F, the segment of the arc drawn by the front wheel 3F about the turning center G1 that the front wheel 3F will follow from the present until a predetermined short time later.
 The output unit 424 displays the underfloor image, the contour lines 102 to 104, the passing trajectory 105 of the front wheels 3F, and the predicted trajectories 101 and 106 on the display screen 8.
 Next, the operation of the periphery monitoring device of the embodiment configured as described above will be described. FIG. 10 is a flowchart showing the procedure of the image storage processing in the periphery monitoring device of the embodiment. The process of FIG. 10 is executed every control cycle. In one example, the control cycle is sufficiently shorter than the cycle at which the determination in S106 becomes Yes.
 First, the imaging unit 16 images the surrounding environment of the vehicle 1 (S101). In particular, the imaging unit 16a images an area including the road surface in the traveling direction of the vehicle 1.
 Next, the acquisition unit 401 acquires an image from the imaging unit 16 and acceleration data from the acceleration sensor 26 (S102). The angle calculation unit 402 then calculates the roll angle and pitch angle of the vehicle 1 from the acceleration data (S103). The correction unit 411 then performs rotation correction corresponding to the roll angle on the captured image (S104).
 Next, the estimation unit 412 calculates the amount of movement of the vehicle 1 based on the acceleration data, steering angle information, and wheel speed information, and estimates the current position and orientation of the vehicle 1 (S105).
 Next, the storage unit 413 determines whether the position of the vehicle 1 has changed by a predetermined distance (for example, 0.1 m) or more from the position at which an image was last stored (S106). When the storage unit 413 determines that the position of the vehicle 1 has not changed by the predetermined distance or more (S106: No), the process ends. Although the present embodiment describes an example in which storage is performed when the position of the vehicle 1 changes by a predetermined distance or more, the storage method is not limited to this; storage may instead be performed when the steering angle changes by a predetermined angle or more, or at predetermined time intervals.
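The storage triggers mentioned here (distance travelled, steering-angle change, elapsed time) can be combined into a single predicate. Only the 0.1 m distance is given as an example in the text; the other thresholds below are placeholders.

```python
def should_save(moved_m, steer_change_rad, elapsed_s,
                dist_thresh=0.1, steer_thresh=0.05, time_thresh=1.0):
    """Decide whether to store a new (image, position, orientation)
    record.  Any one of the three triggers suffices: distance moved
    since the last save, steering-angle change since the last save,
    or time elapsed since the last save."""
    return (moved_m >= dist_thresh
            or abs(steer_change_rad) >= steer_thresh
            or elapsed_s >= time_thresh)
```

In the flowchart of FIG. 10 only the distance trigger is used; the other two correspond to the alternatives the paragraph mentions.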
 On the other hand, when the storage unit 413 determines that the position of the vehicle 1 has changed by the predetermined distance (for example, 0.1 m) or more from the position at which an image was last stored (S106: Yes), the storage unit 413 overwrites the least recently updated area of the ring buffer 404 with the current image after correction (S107). At that time, the storage unit 413 stores the position information and orientation information of the captured image in association with it.
 Next, the storage unit 413 updates the position information and orientation information recorded for each image stored in the ring buffer 404 to position information and orientation information referenced to the current position and orientation (S108).
 FIG. 11 is a flowchart showing the procedure of the display control processing in the periphery monitoring device according to the embodiment.
 First, the reception unit 405 determines whether an operation for starting the underfloor display mode has been received (S201). In one example, when a touch of the switching button 85 by an occupant is detected, the operation input unit 10 detects this and notifies the ECU 24. When the reception unit 405 receives notification of a touch of the switching button 85 while the display mode is the normal display mode, it determines that an operation for starting the underfloor display mode has been received.
 Note that the condition for starting the underfloor display mode is not limited to this.
 If the reception unit 405 does not determine that an operation for starting the underfloor display mode has been received (S201: No), the process proceeds to S212, described later. If the reception unit 405 determines that an operation for starting the underfloor display mode has been received (S201: Yes), the image acquisition unit 421 acquires from the ring buffer 404 an image showing the road surface currently being passed over (S202).
 The passing trajectory calculation unit 422 also calculates the passing trajectory 105 of the front wheels 3F based on the past position information and orientation information stored in the ring buffer 404 (S203). The predicted trajectory calculation unit 423 acquires the steering angle information via the acquisition unit 401 (S204). Based on the acquired steering angle information, the predicted trajectory calculation unit 423 then calculates the predicted trajectory 101 of the front wheels 3F and the predicted trajectory 106 of the rear wheels 3R (S205).
 Note that the passing trajectory 105 and the predicted trajectories 101 and 106 can be expressed in any coordinate system. In one example, they are expressed in a coordinate system referenced to the current position and orientation of the vehicle 1.
 Next, the output unit 424 superimposes the contour line 102 of the vehicle 1 on the image acquired by the image acquisition unit 421 (S206). At this time, the output unit 424 aligns the contour line 102 of the vehicle 1 with the image so that the contour line 102 corresponds to the position in the image over which the vehicle 1 is currently passing. Alignment means adjusting the position at which the contour line 102 of the vehicle 1 is superimposed and the orientation of the contour line 102 so that the correspondence between the image and the contour line 102 matches the correspondence between the current vehicle 1 and the road surface currently under its floor.
 The alignment method is not limited to a specific method. In one example, the output unit 424 identifies the position at which the vehicle 1 currently exists and the current orientation of the vehicle 1 within the region shown in the acquired image, based on the position information and orientation information associated with the image and on the direction of the optical axis and the angle of view of the imaging unit 16. The direction of the optical axis and the angle of view of the imaging unit 16 are known or can be acquired.
 In the alignment, the output unit 424 may adjust the position and orientation of the contour line 102 of the vehicle 1 with the image as the reference, or may adjust the position and orientation of the image with the contour line 102 of the vehicle 1 as the reference. The output unit 424 may adjust the image so that the contour line 102 of the vehicle 1 is located at the center of the display area 81 and the contour line 102 points toward the top of the display area 81. The output unit 424 may apply processing such as viewpoint conversion to the image or to the contour line 102 of the vehicle 1 as necessary.
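Geometrically, the alignment amounts to expressing the vehicle's current pose in the frame of the pose at which the stored image was captured. A metres-level sketch of that change of frame (mapping metres to image pixels is omitted, and the frame convention is an assumption for illustration):

```python
import math

def pose_in_image_frame(cur_x, cur_y, cur_heading,
                        img_x, img_y, img_heading):
    """Express the vehicle's current pose in the frame of the stored
    image's capture pose, i.e. where (and at what angle) the vehicle
    outline should be drawn relative to that image before any
    metres-to-pixels scaling."""
    dx, dy = cur_x - img_x, cur_y - img_y
    # Rotate the world-frame displacement into the capture-pose frame.
    c, s = math.cos(-img_heading), math.sin(-img_heading)
    return (c * dx - s * dy,
            s * dx + c * dy,
            cur_heading - img_heading)
```

If the vehicle has not moved since the image was captured, the result is (0, 0, 0): the outline is drawn at the frame origin with no rotation.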
 Next, the output unit 424 superimposes the contour line 103 of the front wheels 3F and the contour line 104 of the rear wheels 3R on the image (S207). In S207 as well, the output unit 424 performs alignment in the same manner as in S206. Since the positions at which the front wheels 3F and the rear wheels 3R are mounted on the vehicle 1 are known or can be acquired, in one example the output unit 424 determines the positions at which to superimpose the contour line 103 of the front wheels 3F and the contour line 104 of the rear wheels 3R based on the position of the contour line 102 of the vehicle 1 and on the mounting positions of the front wheels 3F and the rear wheels 3R on the vehicle 1. Note that the output unit 424 may superimpose the contour lines 103 and 104 of the wheels 3 rotated by an angle corresponding to the steering angle information. In the example of FIG. 5, the contour line 103 of the front wheels 3F is displayed rotated by an angle corresponding to the steering angle information. When the rear wheels 3R are steered, the output unit 424 may likewise superimpose the contour line 104 of the rear wheels 3R rotated accordingly.
 Note that the contour line 102 of the vehicle 1, the contour line 103 of the front wheels 3F, and the contour line 104 of the rear wheels 3R may be grouped in advance, and the grouped contour lines 102 to 104 may be superimposed on the image.
 The output unit 424 also superimposes the passing trajectory 105 of the front wheels 3F on the image (S208), and superimposes the predicted trajectory 101 of the front wheels 3F and the predicted trajectory 106 of the rear wheels 3R on the image (S209). In S208 and S209 as well, the output unit 424 aligns the image with each trajectory in the same manner as in S207.
 The output unit 424 then outputs the image on which the display objects have been superimposed to the display screen 8 (S210).
 Next, the reception unit 405 determines whether an operation for ending the underfloor display mode has been received (S211). In one example, when a touch of the switching button 85 by an occupant is detected, the operation input unit 10 detects this and notifies the ECU 24. When the reception unit 405 receives notification of a touch of the switching button 85 while the display mode is the underfloor display mode, it determines that an operation for ending the underfloor display mode has been received.
 If the reception unit 405 determines that an operation to end the underfloor display mode has not been received (S211: No), the process proceeds to S202.
 The processing from S202 to S211 is looped at a predetermined short cycle until the underfloor display mode ends. Since the underfloor image is sequentially updated by the processing of S210 in each loop, the occupant can view the road surface currently under the floor of the vehicle 1 as a moving image on the display screen 8 while the vehicle 1 is traveling. The contour line 104 of the rear wheel 3R is also updated in each loop. In addition, position information is accumulated as a travel history while the vehicle travels, and in each loop the passing trajectory 105 of the front wheel 3F is updated according to the accumulated travel history. Steering angle information is likewise acquired in each loop, and the predicted trajectory 106 of the rear wheel 3R is updated according to the acquired steering angle information. Thus, via the display screen 8, the occupant can confirm, substantially in real time, the trajectory the front wheels 3F have traced so far, the positions of the rear wheels 3R, and the expected trajectory of the rear wheels 3R, all of which change as the vehicle 1 travels.
 If the reception unit 405 determines that an operation to end the underfloor display mode has been received (S211: Yes), the display processing unit 406 displays the content of the normal display mode on the display screen 8 (S212). The process then returns to S201.
 In the above description, the travel history used to calculate the passing trajectory 105 of the front wheel 3F was described as including position information and orientation information. The travel history is not limited to this.
 In one example, the travel history includes only position information. In that case, the passing trajectory calculation unit 422 may calculate the vector connecting two pieces of position information saved at consecutive timings as the orientation of the vehicle 1 at each imaging timing.
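The orientation estimate described above — deriving the vehicle's heading from two consecutively saved positions — can be sketched as follows. This is a minimal illustration (the helper name is an assumption, not code from the embodiment):

```python
import math

def heading_from_positions(p_prev, p_curr):
    """Estimate the vehicle orientation (radians) as the direction of
    the vector connecting two consecutively saved positions."""
    dx = p_curr[0] - p_prev[0]
    dy = p_curr[1] - p_prev[1]
    return math.atan2(dy, dx)
```

With such an estimate, orientation information need not be stored in the travel history at all; it is reconstructed from the position samples themselves.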
 In another example, the travel history may include the positions of the front wheels 3F. The estimation unit 412 estimates the position of the front wheel 3F based on the position of the vehicle 1 and the position at which the front wheel 3F is attached to the vehicle 1, and the saving unit 413 saves the estimated front wheel 3F position in the ring buffer 404. The passing trajectory calculation unit 422 obtains discrete data representing the passing trajectory 105 of the front wheel 3F by reading out from the ring buffer 404 the front wheel 3F positions saved during the most recent predetermined period.
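For illustration, estimating a front-wheel position from the vehicle position and the known mounting position amounts to transforming a vehicle-frame offset into world coordinates using the vehicle pose. The following sketch assumes a 2D pose (position plus heading) and a fixed mounting offset; these representations are assumptions of this example:

```python
import math

def front_wheel_position(vehicle_pos, heading, mount_offset):
    """Transform the front-wheel mounting offset (given in the vehicle
    frame) into world coordinates using the vehicle pose."""
    ox, oy = mount_offset
    c, s = math.cos(heading), math.sin(heading)
    # rotate the offset by the heading, then translate by the position
    return (vehicle_pos[0] + c * ox - s * oy,
            vehicle_pos[1] + s * ox + c * oy)
```

Saving the result of this transform at each cycle yields exactly the discrete front-wheel trajectory data described above.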
 Although the case in which the vehicle 1 moves forward has been described, the above technique is also applicable when the vehicle 1 moves backward. When the vehicle 1 moves backward, its traveling direction is the rearward direction, so the image acquisition unit 421 can acquire the underfloor image from an image captured by the imaging unit 16c. When the vehicle 1 moves backward, the periphery monitoring device may superimpose the passing trajectory of the rear wheels 3R and a display object indicating the positions of the front wheels 3F on the image and display them on the display screen 8. Further, when the vehicle 1 moves backward, the periphery monitoring device may superimpose the predicted trajectory of the front wheels 3F on the image and display it on the display screen 8.
 As described above, according to the embodiment, the ring buffer 404 accumulates the travel history of the vehicle 1. The image acquisition unit 421 acquires an image showing the road surface under the floor of the vehicle 1. The passing trajectory calculation unit 422 calculates the passing trajectory 105 of the front wheel 3F based on the travel history. The output unit 424 superimposes the passing trajectory 105 of the front wheel 3F and the contour line 104 of the rear wheel 3R, which indicates the position of the rear wheel 3R, on the image showing the road surface under the floor of the vehicle 1, and displays them on the display screen 8. This enables the driver to steer so that the rear wheels 3R pass along the trajectory the front wheels 3F have already traced, making the periphery monitoring device highly convenient.
 The predicted trajectory calculation unit 423 also calculates the predicted trajectory 106 of the rear wheel 3R based on steering information, and the output unit 424 further superimposes and displays the predicted trajectory 106 of the rear wheel 3R on the display screen 8. This makes it even easier for the driver to guide the rear wheels 3R along the trajectory the front wheels 3F have passed.
 The predicted trajectory calculation unit 423 likewise calculates the predicted trajectory 101 of the front wheel 3F based on the steering information, and the output unit 424 further superimposes and displays the predicted trajectory 101 of the front wheel 3F on the display screen 8. Since the driver can confirm on the display screen 8 the positions the front wheels 3F will pass through next, the driver can operate the vehicle 1 more safely.
 The ring buffer 404 also stores an image captured at a first timing by the imaging unit 16, which images the road surface in the traveling direction of the vehicle 1. The image acquisition unit 421 acquires the image captured at the first timing and stored in the ring buffer 404 as the image showing the area under the floor of the vehicle 1 at a second timing, at which the vehicle 1 passes over the road surface shown in that image. The second timing is later than the first timing. Because an image captured in the past is used as the image showing the road surface currently under the floor of the vehicle 1, no imaging unit dedicated to imaging the road surface under the floor of the vehicle 1 is required.
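The reuse of a past forward-camera image as the current underfloor image can be sketched as follows. This is a simplified one-dimensional model (distances measured as odometer positions along the path); the class name, the scalar odometer representation, and the camera-to-floor distance parameter are assumptions of this sketch, not details taken from the embodiment:

```python
from collections import deque

class UnderfloorImageBuffer:
    """Ring buffer holding (capture_position, image) pairs; a past
    image is reused once the vehicle has advanced far enough that the
    imaged road surface lies under the floor (illustrative sketch)."""

    def __init__(self, capacity, camera_to_floor_dist):
        self.buf = deque(maxlen=capacity)  # oldest entries drop automatically
        self.offset = camera_to_floor_dist  # how far ahead the camera images

    def save(self, odometer_pos, image):
        self.buf.append((odometer_pos, image))

    def underfloor_image(self, odometer_pos):
        # pick the newest saved image whose imaged area is now under the floor
        best = None
        for pos, img in self.buf:
            if pos + self.offset <= odometer_pos:
                best = img
        return best
```

In this model the "first timing" is when `save` is called and the "second timing" is when `underfloor_image` returns that same frame, after the vehicle has covered the camera-to-floor distance.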
 Although embodiments of the present invention have been exemplified above, the embodiments and modifications described are merely examples and are not intended to limit the scope of the invention. They can be implemented in various other forms, and various omissions, substitutions, combinations, and changes can be made without departing from the gist of the invention. The configurations and shapes of the embodiments and modifications may also be partially interchanged.
 DESCRIPTION OF REFERENCE NUMERALS: 1 ... vehicle, 2 ... vehicle body, 2a ... cabin, 2b ... seat, 2c, 2d, 2e, 2f ... end portions, 2g ... door mirror, 2h ... door, 3 ... wheel, 3F ... front wheel, 3R ... rear wheel, 4 ... steering unit, 5 ... acceleration operation unit, 6 ... braking operation unit, 7 ... shift operation unit, 8 ... display screen, 9 ... audio output device, 10 ... operation input unit, 11 ... monitor device, 13 ... steering system, 13a ... actuator, 13b ... torque sensor, 16, 16a, 16b, 16c, 16d ... imaging units, 18 ... brake system, 18a ... actuator, 18b ... brake sensor, 19 ... steering angle sensor, 20 ... accelerator sensor, 21 ... shift sensor, 22 ... wheel speed sensor, 23 ... in-vehicle network, 24 ... ECU, 24a ... CPU, 24b ... ROM, 24c ... RAM, 24d ... display control unit, 24e ... audio control unit, 24f ... SSD, 26, 26a, 26b ... acceleration sensors, 38 ... rear wheel axle, 38a ... center point, 38b ... arc, 81, 82, 83, 84 ... display areas, 85 ... switching button, 100 ... periphery monitoring system, 101, 106 ... predicted trajectories, 102, 103, 104 ... contour lines, 105 ... passing trajectory, 401 ... acquisition unit, 402 ... angle calculation unit, 403 ... saving processing unit, 404 ... ring buffer, 405 ... reception unit, 406 ... display processing unit, 411 ... correction unit, 412 ... estimation unit, 413 ... saving unit, 421 ... image acquisition unit, 422 ... passing trajectory calculation unit, 423 ... predicted trajectory calculation unit, 424 ... output unit, 501 ... road surface, 502, 503 ... imaging areas.

Claims (4)

  1.  A periphery monitoring device comprising:
     a storage unit that stores a travel history of a vehicle;
     an image acquisition unit that acquires a first image showing a road surface under a floor of the vehicle;
     a first calculation unit that calculates, based on the travel history, a first trajectory through which a front wheel of the vehicle has passed; and
     an output unit that superimposes and displays the first image, the first trajectory, and a display object indicating a position of a rear wheel of the vehicle on a display screen provided in a cabin of the vehicle.
  2.  The periphery monitoring device according to claim 1, further comprising a second calculation unit that calculates, based on steering information, a second trajectory indicating a predicted trajectory of the rear wheel of the vehicle,
     wherein the output unit further superimposes and displays the second trajectory on the display screen.
  3.  The periphery monitoring device according to claim 2, wherein the second calculation unit further calculates, based on the steering information, a third trajectory indicating a predicted trajectory of the front wheel of the vehicle, and
     the output unit further superimposes and displays the third trajectory on the display screen.
  4.  The periphery monitoring device according to claim 1, wherein the storage unit stores a second image captured at a first timing by an imaging device that is provided on the vehicle and images a road surface in a traveling direction of the vehicle, and
     the image acquisition unit acquires the second image stored in the storage unit as the first image at a second timing, later than the first timing, at which the vehicle passes over the road surface shown in the second image.
PCT/JP2017/010537 2016-08-05 2017-03-15 Periphery monitoring device WO2018025441A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016154549A JP2018020724A (en) 2016-08-05 2016-08-05 Periphery monitoring device
JP2016-154549 2016-08-05

Publications (1)

Publication Number Publication Date
WO2018025441A1 true WO2018025441A1 (en) 2018-02-08

Family

ID=61073175

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/010537 WO2018025441A1 (en) 2016-08-05 2017-03-15 Periphery monitoring device

Country Status (2)

Country Link
JP (1) JP2018020724A (en)
WO (1) WO2018025441A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112309157A (en) * 2019-07-23 2021-02-02 丰田自动车株式会社 Image display device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7319593B2 (en) * 2020-02-13 2023-08-02 トヨタ自動車株式会社 Vehicle perimeter monitoring device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014067299A (en) * 2012-09-26 2014-04-17 Aisin Seiki Co Ltd Vehicle driving assist device
JP2016021653A (en) * 2014-07-14 2016-02-04 アイシン精機株式会社 Periphery monitoring device and program
JP2016101872A (en) * 2014-11-28 2016-06-02 アイシン精機株式会社 Vehicle periphery monitoring device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4696691B2 (en) * 2005-05-27 2011-06-08 アイシン・エィ・ダブリュ株式会社 Parking support method and parking support device

Also Published As

Publication number Publication date
JP2018020724A (en) 2018-02-08

Legal Events

Date  Code  Title/Description
121   EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 17836548; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase (Ref country code: DE)
122   EP: PCT application non-entry in European phase (Ref document number: 17836548; Country of ref document: EP; Kind code of ref document: A1)