WO2018025441A1 - Dispositif de surveillance de périphérie - Google Patents

Dispositif de surveillance de périphérie

Info

Publication number
WO2018025441A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
image
unit
trajectory
display
Prior art date
Application number
PCT/JP2017/010537
Other languages
English (en)
Japanese (ja)
Inventor
拓也 橋川
Original Assignee
アイシン精機株式会社
Priority date
Filing date
Publication date
Application filed by アイシン精機株式会社
Publication of WO2018025441A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00: General purpose image data processing
    • G06T 1/0007: Image acquisition
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • Embodiments of the present invention relate to a periphery monitoring device mounted on a vehicle.
  • Conventionally, an imaging device installed in a vehicle images the surrounding environment of the vehicle, and the image obtained as the imaging result is displayed.
  • One object of the present invention is to provide a periphery monitoring device that offers high convenience.
  • A periphery monitoring device according to an embodiment includes: a storage unit that stores a travel history of a vehicle; an image acquisition unit that acquires a first image showing the road surface under the floor of the vehicle; a first calculation unit that calculates, based on the travel history, a first trajectory through which the front wheels of the vehicle have passed; and an output unit that superimposes and displays the first image, the first trajectory, and a display object indicating the position of the rear wheels of the vehicle on a display screen provided in the passenger compartment of the vehicle. The driver can therefore perform a driving operation such that the rear wheels pass over the trajectory through which the front wheels have passed. The periphery monitoring device according to the embodiment is thus highly convenient.
  • As an example, the periphery monitoring device further includes a second calculation unit that calculates a second trajectory indicating the predicted trajectory of the rear wheels of the vehicle based on steering information, and the output unit further superimposes the second trajectory on the display screen.
  • As an example, the second calculation unit further calculates a third trajectory indicating the predicted trajectory of the front wheels of the vehicle based on the steering information, and the output unit further superimposes the third trajectory on the display screen. Therefore, the driver can check on the display screen the position to which the front wheels will advance, and can drive the vehicle more safely.
  • As an example, the storage unit stores a second image captured at a first timing by an imaging device that is provided in the vehicle and images the road surface in the traveling direction of the vehicle. The image acquisition unit then acquires, as the first image, the second image stored in the storage unit at a second timing, after the first timing, at which the vehicle is passing over the road surface shown in the second image. Therefore, it is not necessary to install an imaging device that directly images the road surface under the floor of the vehicle.
  • FIG. 1 is a perspective view illustrating an example of a state in which a part of a passenger compartment of a vehicle on which the periphery monitoring device according to the embodiment is mounted is seen through.
  • FIG. 2 is a plan view (bird's eye view) illustrating an example of a vehicle on which the periphery monitoring device according to the embodiment is mounted.
  • FIG. 3 is a block diagram illustrating an example of a configuration of a periphery monitoring system including the periphery monitoring device according to the embodiment.
  • FIG. 4 is a diagram illustrating a display example in the normal display mode output by the periphery monitoring device according to the embodiment.
  • FIG. 5 is a diagram illustrating a display example in the underfloor display mode output from the periphery monitoring device according to the embodiment.
  • FIG. 6 is a diagram for explaining a concept of a method by which the periphery monitoring device according to the embodiment acquires an underfloor image.
  • FIG. 7 is a block diagram illustrating a functional configuration of the ECU as the periphery monitoring device according to the embodiment.
  • FIG. 8 is a diagram illustrating a structure of a ring buffer included in the periphery monitoring device according to the embodiment.
  • FIG. 9 is a diagram for explaining an example of a method for calculating the predicted trajectory of the rear wheels by the periphery monitoring device according to the embodiment.
  • FIG. 10 is a flowchart illustrating a procedure of image storage processing in the periphery monitoring device according to the embodiment.
  • FIG. 11 is a flowchart illustrating a procedure of processing for controlling display in the periphery monitoring device according to the embodiment.
  • FIG. 1 is a perspective view showing an example of a state in which a part of a passenger compartment 2a of a vehicle 1 equipped with a periphery monitoring device according to an embodiment is seen through.
  • FIG. 2 is a plan view (bird's eye view) illustrating an example of a vehicle 1 on which the periphery monitoring device according to the embodiment is mounted.
  • FIG. 3 is a block diagram illustrating an example of a configuration of the periphery monitoring system 100 including the periphery monitoring device according to the embodiment.
  • the vehicle 1 may be, for example, an automobile having an internal combustion engine (not shown) as a drive source, that is, an internal combustion engine automobile, or an automobile having an electric motor (not shown) as a drive source, that is, an electric vehicle or a fuel cell vehicle. Alternatively, it may be a hybrid automobile using both as drive sources.
  • the vehicle 1 can be mounted with various transmissions, and can be mounted with various devices, systems, components, or the like necessary for driving the internal combustion engine or the electric motor.
  • the system, number, layout, and the like of devices related to driving of the wheels 3 in the vehicle 1 can be variously set.
  • the vehicle body 2 constitutes a passenger compartment 2a in which a passenger (not shown) gets.
  • a steering unit 4, an acceleration operation unit 5, a braking operation unit 6, and a shift operation unit 7 are provided in the passenger compartment 2a so as to face the driver seated in the seat 2b as an occupant.
  • the steering unit 4 is a steering wheel protruding from a dashboard (instrument panel)
  • the acceleration operation unit 5 is an accelerator pedal positioned under the driver's feet
  • the braking operation unit 6 is a brake pedal positioned under the driver's feet
  • the shift operation unit 7 is a shift lever protruding from the center console.
  • However, the steering unit 4, the acceleration operation unit 5, the braking operation unit 6, and the shift operation unit 7 are not limited to these.
  • a monitor device 11 is provided in the passenger compartment 2a.
  • the monitor device 11 has a display screen 8 and an audio output device 9.
  • the display screen 8 is configured by a display device.
  • the display device is, for example, an LCD (liquid crystal display) or an OELD (organic electroluminescent display).
  • the audio output device 9 is a speaker as an example.
  • the display screen 8 is covered with a transparent operation input unit 10.
  • the operation input unit 10 is, for example, a touch panel. An occupant or the like can visually recognize the image displayed on the display screen 8 via the operation input unit 10.
  • As an example, the monitor device 11 is provided at the center of the dashboard in the vehicle width direction (left-right direction).
  • the monitor device 11 may include an operation input unit other than the touch panel.
  • the monitor device 11 may be provided with a switch, a dial, a joystick, or a push button as another operation input unit.
  • the monitor device 11 can be used as a navigation system or an audio system, for example, or can be provided separately from these systems.
  • the vehicle 1 is a four-wheeled vehicle and includes two left and right front wheels 3F and two left and right rear wheels 3R.
  • the tire angle of the front wheels 3F changes corresponding to the operation of the steering unit 4.
  • the steering system 13 is, for example, an electric power steering system or an SBW (steer by wire) system.
  • the steering system 13 supplements the steering force by adding an assist torque to the steering unit 4 by the actuator 13a or steers the wheel 3.
  • the steering system 13 detects the torque that the driver gives to the steering unit 4 by the torque sensor 13b.
  • the steering system 13 may be a rear wheel steering device (ARS: Active Rear Steering).
  • the rear wheel steering device steers the rear wheel 3R. Specifically, when the rear wheel steering device is employed, the rear wheel 3R is steered in the same phase as, or in the opposite phase to, the steering angle of the front wheel 3F, depending on the driving state of the vehicle 1 (for example, vehicle speed or turning state).
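  • This phase selection can be illustrated with a minimal sketch; the speed threshold and gain below are illustrative assumptions, not values from this disclosure:

        def rear_steer_angle(front_angle_deg, vehicle_speed_kmh,
                             low_speed_threshold_kmh=40.0, gain=0.3):
            # Illustrative ARS rule (assumed values): opposite phase at low
            # speed for tighter turning, same phase at high speed for stability.
            phase = -1.0 if vehicle_speed_kmh < low_speed_threshold_kmh else 1.0
            return phase * gain * front_angle_deg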
  • the steering system 13 may be configured to be able to steer the front wheel 3F and the rear wheel 3R independently or in association with each other.
  • As an example, the two front wheels 3F are steered substantially in parallel with each other in the same phase (that is, in the same turning direction), and the two rear wheels 3R are likewise steered in the same phase and substantially in parallel.
  • the drive wheels can be set in various ways.
  • the vehicle 1 is provided with a plurality of imaging units 16.
  • the imaging unit 16 is a digital imaging device incorporating an imaging element such as a CCD (charge coupled device) or a CIS (CMOS image sensor).
  • the imaging unit 16 can output an image at a predetermined frame rate.
  • Each of the imaging units 16 includes a wide-angle lens or a fish-eye lens, and can capture a range of 140 ° to 190 °, for example, in the horizontal direction.
  • the optical axis of the imaging unit 16 is set downward (for example, vertically or obliquely downward).
  • the imaging unit 16a is provided on the front end 2c (for example, a front grill) of the vehicle body 2.
  • the imaging unit 16a can image the surrounding environment in the forward direction of the vehicle 1.
  • Here, the direction in which the driver seated in the seat 2b faces, that is, the windshield side as viewed from the driver, is the front direction of the vehicle 1 and the front side of the vehicle body 2.
  • the imaging unit 16b is provided on the left end 2d of the vehicle body 2, more specifically, on the left door mirror 2g.
  • the imaging unit 16b can capture the surrounding environment in the left direction of the vehicle 1.
  • the imaging unit 16c is provided at a rear end 2e of the vehicle body 2 (for example, a wall portion below the rear trunk door 2h).
  • the imaging unit 16c can capture the surrounding environment in the rear direction of the vehicle 1.
  • the imaging unit 16d is provided on the right end 2f of the vehicle body 2, more specifically, on the right door mirror 2g.
  • the imaging unit 16d can capture the surrounding environment in the right direction of the vehicle 1.
  • the surrounding environment refers to the situation around the vehicle 1 including the road surface around the vehicle 1.
  • As long as at least one of the imaging units 16 can output an image showing the road surface in the traveling direction of the vehicle 1, the configuration, number, locations, and orientations of the imaging units 16 are not limited to the above.
  • the ECU (Electronic Control Unit) 24 is an example of the periphery monitoring device of the embodiment.
  • the ECU 24 can perform arithmetic processing and image processing based on the images obtained by the plurality of imaging units 16 and can display the image subjected to the image processing on the display screen 8.
  • the periphery monitoring device can control the display content of the display screen 8 at least in an underfloor display mode.
  • the underfloor display mode is a display mode in which an image showing at least the road surface currently under the floor of the vehicle 1 is displayed.
  • the display mode of the periphery monitoring device includes an underfloor display mode and a normal display mode.
  • FIG. 4 is a diagram showing a display example in the normal display mode.
  • the display screen 8 includes a display area 81, a display area 82, a display area 83, and a display area 84.
  • In the normal display mode, an image captured by the imaging unit 16a and showing the current surrounding environment in the front direction of the vehicle 1 is displayed in the display area 81.
  • In the display area 82, an image captured by the imaging unit 16b and showing the current surrounding environment in the left direction of the vehicle 1 is displayed.
  • In the display area 83, an image captured by the imaging unit 16d and showing the current surrounding environment in the right direction of the vehicle 1 is displayed.
  • In the display area 84, a display object representing the rear shape of the vehicle 1, which indicates the inclination of the vehicle 1 and the like, is displayed.
  • a switching button 85 is displayed on the right side of the display area 84.
  • the predicted trajectory 101 of the front wheel 3F is superimposed on the display area 81 and displayed.
  • FIG. 5 is a diagram showing a display example in the underfloor display mode.
  • In the underfloor display mode, the display area 81 displays an image showing at least the road surface currently under the floor of the vehicle 1 out of the surrounding environment.
  • the road surface around the vehicle 1 including the road surface under the floor of the vehicle 1 is displayed in the display area 81.
  • an image showing at least the current road surface under the floor is referred to as an underfloor image.
  • In the display area 81, a contour line 102, a display object indicating the outer shape of the vehicle 1, is displayed together with a contour line 103, a display object indicating the outer shape of the front wheels 3F of the vehicle 1, and a contour line 104, a display object indicating the outer shape of the rear wheels 3R of the vehicle 1.
  • the outline 102 of the vehicle 1 is an example of identification information that can identify the current position of the vehicle 1.
  • the contour 103 of the front wheel 3F is an example of identification information that can identify the current position of the front wheel 3F.
  • the contour 104 of the rear wheel 3R is an example of identification information that can identify the current position of the rear wheel 3R.
  • the contour lines 102 to 104 are superimposed on the underfloor image. A region surrounded by the contour line 102 indicates a road surface under the current floor.
  • a passage locus 105 which is a display object indicating a locus through which the front wheel 3F has passed, is further displayed in the display area 81.
  • On off-road terrain such as a rocky area, a muddy road, or a snowy road, a point through which the front wheel 3F has already passed can be regarded as relatively safe.
  • the periphery monitoring device of the embodiment presents the driver with at least the passing trajectory 105 and the contour line 104 of the rear wheel 3R, together with an image showing the current road surface under the floor, so the driver can drive the vehicle such that the rear wheel 3R passes over the trajectory through which the front wheel 3F has already passed. Since safer driving is possible, the periphery monitoring device is more convenient than when the passing trajectory 105 or the contour line 104 of the rear wheel 3R is not presented.
  • the predicted trajectory 106 of the rear wheel 3R is further displayed in the display area 81.
  • the predicted trajectory 106 of the rear wheel 3R changes based on the tire angle. Since the tire angle is linked to the steering amount of the steering unit 4, the predicted trajectory 106 of the rear wheel 3R is linked to the steering amount of the steering unit 4.
  • the periphery monitoring device of the embodiment further presents the predicted trajectory 106 of the rear wheel 3R to the driver, so the driver can visually check on the display screen 8, together with the trajectory through which the front wheel 3F has passed so far, the trajectory the rear wheel 3R will follow at the current steering amount. Therefore, the driver can more easily drive such that the rear wheel 3R passes over the trajectory through which the front wheel 3F has passed.
  • the predicted locus 101 of the front wheel 3F is further displayed in the display area 81 in the underfloor display mode.
  • the predicted trajectory 101 of the front wheel 3F may not necessarily be displayed.
  • the passing trajectory 105 of the front wheel 3F is displayed as a dotted line, but the passing trajectory 105 of the front wheel 3F can be displayed in any other display manner.
  • the passing trajectory 105 of the front wheel 3F can be displayed by, for example, a solid line or a broken line.
  • the passage trajectory 105 of the front wheel 3F may be displayed with a frame line having the same width as the width of each wheel 3.
  • each of the predicted trajectories 101 and 106 can be displayed in an arbitrary display mode.
  • each of the predicted trajectories 101 and 106 has the same width as the width of each wheel 3 and is displayed with a frame line having a predetermined length.
  • The condition for switching between the normal display mode and the underfloor display mode is not limited to a specific condition.
  • switching between the normal display mode and the underfloor display mode is performed by touching the switching button 85.
  • the normal display mode may not be provided. There may be other display modes different from the normal display mode.
  • In the underfloor display mode, arbitrary information can be displayed in the display areas 82 to 84 of the display screen 8. In the underfloor display mode, images different from those in the normal display mode may be displayed in the display areas 82 to 84. It suffices that, in the underfloor display mode, the underfloor image is displayed on the display screen 8 together with the respective display objects (the contour lines 102 to 104, the passing trajectory 105 of the front wheel 3F, and the predicted trajectories 101 and 106). In the example of FIG. 5, in the underfloor display mode, the display areas 82 to 84 display the same content as in the normal display mode.
  • the periphery monitoring device updates the display content at a predetermined rate in both the underfloor display mode and the normal display mode. Therefore, the occupant can visually recognize the current surrounding environment of the vehicle 1 as an image on the display screen 8 regardless of whether the display mode is the underfloor display mode or the normal display mode.
  • the update rate of the display content may be the same as or different from the rate at which the imaging unit 16 outputs an image.
  • In the present embodiment, an image showing the current road surface under the floor is selected from a plurality of images captured in the past, and the selected image is displayed as the underfloor image.
  • FIG. 6 is a diagram for explaining the concept of a method for acquiring an underfloor image according to the present embodiment.
  • the periphery monitoring device stores an image captured at a slightly earlier timing by the imaging unit 16a, and can use that image as an image showing the road surface over which the vehicle 1 is currently passing, that is, as the underfloor image.
  • At time t1, the road surface 501 over which the vehicle 1 is passing is not included in the imaging region 502 of the imaging unit 16a, but it was included in the imaging region 503 of the imaging unit 16a at time t0.
  • the time t0 is a time before the time t1, at which the vehicle 1 was located behind its position at time t1 with respect to the traveling direction.
  • the periphery monitoring device uses the image captured by the imaging unit 16a at time t0 as the underfloor image at time t1.
  • the method for acquiring the underfloor image is not limited to the above method.
  • an imaging unit that images the current road surface under the floor may be provided under the floor of the vehicle 1, and the periphery monitoring device may use an image output by the imaging unit as the underfloor image.
  • the vehicle 1 includes a steering system 13, a brake system 18, a steering angle sensor 19, an accelerator sensor 20, a shift sensor 21, a wheel speed sensor 22, an in-vehicle network 23, an ECU 24, and two acceleration sensors 26 (26a, 26b).
  • the ECU 24 is an example of a periphery monitoring device.
  • the monitor device 11, the steering system 13, the imaging units 16, the brake system 18, the steering angle sensor 19, the accelerator sensor 20, the shift sensor 21, the wheel speed sensor 22, the ECU 24, and the acceleration sensors 26 (26a, 26b) constitute the periphery monitoring system 100.
  • the monitor device 11, the steering system 13, the brake system 18, the steering angle sensor 19, the accelerator sensor 20, the shift sensor 21, the wheel speed sensor 22, the ECU 24, and the acceleration sensors 26 are electrically connected via the in-vehicle network 23, which is an electric telecommunication line.
  • the in-vehicle network 23 is configured as, for example, a CAN (Controller Area Network).
  • the acceleration sensor 26a is a sensor that detects acceleration in the front-rear direction
  • the acceleration sensor 26b is a sensor that detects acceleration in the left-right direction.
  • the brake system 18 includes an actuator 18a and a brake sensor 18b.
  • the brake system 18 applies a braking force to the wheel 3 via the actuator 18a.
  • the brake sensor 18b transmits, for example, the position of a brake pedal as a movable part of the braking operation unit 6 to the ECU 24.
  • the steering angle sensor 19 is, for example, a sensor that detects the steering amount of the steering unit 4.
  • the steering angle sensor 19 transmits the steering angle information such as the steering angle of the steering unit 4 by the driver and the steering angle at the time of automatic steering to the ECU 24.
  • the accelerator sensor 20 is a displacement sensor that detects the position of an accelerator pedal as a movable part of the acceleration operation unit 5, for example.
  • the accelerator sensor 20 transmits the position of the acceleration operation unit 5 to the ECU 24.
  • the shift sensor 21 is, for example, a sensor that detects the position of the movable part of the shift operation unit 7 and transmits the position of the movable part of the shift operation unit 7 to the ECU 24 as shift information.
  • the wheel speed sensor 22 is a sensor that detects the amount of rotation of the wheel 3 or the number of rotations per unit time.
  • the wheel speed sensor 22 transmits the wheel speed pulse number indicating the detected rotation speed to the ECU 24 as wheel speed information.
  • the ECU 24 can receive, via the in-vehicle network 23, detection results from sensors such as the torque sensor 13b, the brake sensor 18b, the steering angle sensor 19, the accelerator sensor 20, the shift sensor 21, the wheel speed sensor 22, and the acceleration sensors 26, as well as instruction signals (control signals, operation signals, input signals, data) from the operation input unit 10 and the like.
  • the ECU 24 receives an image from the imaging unit 16.
  • the ECU 24 may receive an image from the imaging unit 16 via the in-vehicle network 23.
  • the ECU 24 is, for example, a computer.
  • the ECU 24 includes a CPU (Central Processing Unit) 24a, a ROM (Read Only Memory) 24b, a RAM (Random Access Memory) 24c, a display control unit 24d, a voice control unit 24e, and an SSD (Solid State Drive) 24f.
  • the CPU 24a, ROM 24b, and RAM 24c may be integrated in the same package.
  • the CPU 24a reads a program stored in a nonvolatile storage device such as the ROM 24b, and executes various arithmetic processes and controls according to the program.
  • the CPU 24a executes, for example, image processing related to an image to be displayed on the display screen 8.
  • the ROM 24b stores a program and parameters necessary for executing the program.
  • the RAM 24c temporarily stores various types of data used in computations by the CPU 24a.
  • Among the arithmetic processing in the ECU 24, the display control unit 24d mainly executes processing of captured images acquired from the imaging units 16 and output to the CPU 24a, data conversion of display images acquired from the CPU 24a and displayed on the display screen 8, and the like.
  • Among the arithmetic processing in the ECU 24, the voice control unit 24e mainly executes processing of audio acquired from the CPU 24a and output to the audio output device 9.
  • the SSD 24f is a rewritable nonvolatile storage unit, and maintains data acquired from the CPU 24a even when the power of the ECU 24 is turned off.
  • FIG. 7 is a block diagram showing a functional configuration of the ECU 24 as the periphery monitoring device according to the present embodiment.
  • the CPU 24a executes a program stored in advance in the ROM 24b, thereby realizing an acquisition unit 401, an angle calculation unit 402, a storage processing unit 403, a reception unit 405, and a display processing unit 406.
  • the program for realizing these functional configurations may be provided via any recording medium other than the ROM 24b that can be read by the computer.
  • the ECU 24 implements a ring buffer 404 on the RAM 24c.
  • Part or all of the acquisition unit 401, the angle calculation unit 402, the storage processing unit 403, the reception unit 405, and the display processing unit 406 may be realized by a hardware circuit or by a combination of a hardware circuit and software (a program).
  • the acquisition unit 401 acquires various information from various sensors provided in the vehicle 1.
  • the acquisition unit 401 according to the present embodiment acquires images output from the imaging units 16a to 16d, which are provided in the vehicle 1 and image the periphery of the vehicle 1, and acceleration data output from the acceleration sensors 26a and 26b provided in the vehicle 1. Further, the acquisition unit 401 acquires steering angle information from the steering angle sensor 19 and wheel speed information from the wheel speed sensor 22.
  • the acquisition unit 401 associates with one another the images, acceleration data, steering angle information, and wheel speed information whose acquisition times are substantially the same.
  • the angle calculation unit 402 calculates the tilt angle (pitch angle and roll angle) of the vehicle 1 based on the acceleration data acquired from the acceleration sensors 26a and 26b.
  • the pitch angle is an angle indicating the inclination of the vehicle 1 about its left-right axis
  • the roll angle is an angle indicating the inclination around the longitudinal axis of the vehicle 1.
  • the angle calculation unit 402 associates the roll angle and pitch angle calculated from the acceleration data with an image associated with the acceleration data. This makes it possible to recognize the roll angle and pitch angle of the vehicle 1 when the image is captured.
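  • The disclosure does not give the formula, but a common way to recover tilt from the two acceleration sensors described above is to treat each axis reading as a gravity component. A minimal sketch, assuming quasi-static motion and readings in m/s²:

        import math

        G = 9.80665  # standard gravity, m/s^2

        def tilt_from_acceleration(a_front_rear, a_left_right):
            # Assumption: the vehicle is quasi-static, so sensor 26a
            # (front-rear) and sensor 26b (left-right) mainly measure
            # components of gravity.
            pitch = math.asin(max(-1.0, min(1.0, a_front_rear / G)))
            roll = math.asin(max(-1.0, min(1.0, a_left_right / G)))
            return pitch, roll  # radians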
  • the storage processing unit 403 includes a correction unit 411, an estimation unit 412, and a storage unit 413, and generates and stores an image to be displayed on the display screen 8.
  • the correction unit 411 performs rotation correction on the image captured by the imaging unit 16a and showing the surrounding environment in front of the vehicle 1.
  • the correction unit 411 performs rotation correction according to the roll angle associated with the image, with the position coordinate in the image corresponding to the center of the lens used for imaging by the imaging unit 16a as the origin.
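  • As a sketch of such a correction, one could rotate the image about the lens-center pixel with OpenCV; the sign convention and interpolation settings below are assumptions, not taken from the disclosure:

        import cv2
        import numpy as np

        def rotation_correct(image, roll_rad, lens_center_xy):
            # Rotate about the pixel corresponding to the lens center so the
            # horizon is leveled; negating the roll angle is one plausible
            # sign convention and depends on the image axes.
            h, w = image.shape[:2]
            M = cv2.getRotationMatrix2D(lens_center_xy,
                                        -np.degrees(roll_rad), 1.0)
            return cv2.warpAffine(image, M, (w, h), flags=cv2.INTER_LINEAR)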
  • the estimation unit 412 estimates the current position of the vehicle 1.
  • the position of the vehicle 1 is a predetermined position of the vehicle 1 (for example, the center of the vehicle 1).
  • the method for estimating the position of the vehicle 1 is not limited to a specific method.
  • the estimation unit 412 calculates the amount of movement of the vehicle 1 from the previously imaged position based on the acceleration data, the steering angle information, and the wheel speed information acquired by the acquisition unit 401. Then, the estimation unit 412 estimates the position of the vehicle 1 from the movement amount.
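  • A minimal dead-reckoning sketch of this kind of update, using a kinematic bicycle model; the model choice and parameter names are assumptions, not the disclosed algorithm:

        import math

        def dead_reckon_step(x, y, yaw, wheel_speed_mps,
                             tire_angle_rad, wheelbase_m, dt):
            # Integrate one control cycle of motion from wheel speed and
            # steering; the new (x, y, yaw) is the estimated vehicle pose.
            x += wheel_speed_mps * math.cos(yaw) * dt
            y += wheel_speed_mps * math.sin(yaw) * dt
            yaw += (wheel_speed_mps / wheelbase_m) * math.tan(tire_angle_rad) * dt
            return x, y, yaw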
  • Alternatively, the estimation unit 412 may calculate an optical flow using an image captured in the past and the most recently captured image, calculate the movement amount of the vehicle 1 based on the calculated optical flow, and estimate the position of the vehicle 1 from that movement amount.
  • the vehicle 1 may include a GPS (not shown), and the estimation unit 412 may estimate the position of the vehicle 1 by using a signal from the GPS.
  • the estimation unit 412 estimates the current direction of the vehicle 1.
  • Although the method of estimating the orientation of the vehicle 1 is not limited to a specific method, the orientation of the vehicle 1 can be estimated by a method similar to that used for its position.
  • the storage unit 413 stores the image corrected by the correction unit 411 in the ring buffer 404 together with the position information and the direction information of the vehicle 1.
  • the present embodiment does not limit the interval at which images are stored in the ring buffer 404, provided that an image showing the road surface being passed over at any given timing can be obtained from the previously stored images.
  • the position information and the orientation information stored in the ring buffer 404 are used as a travel history for calculating the passing trajectory 105 of the front wheel 3F. That is, the ring buffer 404 corresponds to a storage unit that stores a travel history of the vehicle 1.
  • the periphery monitoring device obtains discrete data indicating the passing trajectory 105 of the front wheel 3F by calculating the position of the front wheel 3F for each piece of position information.
  • the passing trajectory 105 of the front wheel 3F is preferably accurate to some extent, and for this purpose, the granularity of the stored position information and orientation information is desirably fine to some extent. Therefore, the storage interval is set to a value that is not too long.
  • an image, position information, and orientation information may be stored every time a distance of 0.1 m is traveled.
  • the travel history may be stored separately from the position information and the orientation information associated with the image, and in this case, the image storage interval is not limited to these.
  • FIG. 8 is a diagram showing the structure of the ring buffer 404.
  • the ring buffer 404 stores an image and the position information and direction information of the vehicle 1 at the time of capturing the image in association with each other.
  • the ring buffer 404 is a storage area logically arranged in a ring shape.
  • the image requested to be stored is overwritten and stored in the oldest updated area.
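  • A minimal sketch of such a buffer; Python's collections.deque with maxlen gives exactly this oldest-entry-overwrite behavior (the capacity and field names are illustrative):

        from collections import deque

        class RingBuffer:
            # Fixed capacity; appending when full silently drops the oldest
            # entry, mirroring the overwrite behavior of the ring buffer 404.
            def __init__(self, capacity=100):
                self._buf = deque(maxlen=capacity)

            def save(self, image, position, orientation):
                self._buf.append({"image": image,
                                  "position": position,
                                  "orientation": orientation})

            def entries(self):
                return list(self._buf)  # oldest first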
  • the accepting unit 405 accepts an instruction signal (control signal) from the operation input unit 10 or the like.
  • the receiving unit 405 according to the present embodiment receives a display mode switching operation from these instruction signals.
  • the display processing unit 406 performs display processing on the display screen 8 in a display mode according to the operation received by the receiving unit 405.
  • the display processing unit 406 includes an image acquisition unit 421, a passage locus calculation unit 422, a prediction locus calculation unit 423, and an output unit 424.
  • the image acquisition unit 421 acquires an image indicating a road surface under the floor of the vehicle 1, that is, an underfloor image.
  • the image acquisition unit 421 reads from the ring buffer 404 an image showing a road surface on which the vehicle 1 is currently passing.
  • the image acquisition unit 421 acquires the image captured at the first timing and stored in the ring buffer 404 as the underfloor image at the second timing, at which the vehicle 1 is passing over the road surface shown in that image.
  • the second timing is a timing after the first timing.
  • the image acquisition unit 421 selects, from a plurality of images stored in the ring buffer 404, an image showing a road surface on which the vehicle 1 is currently passing. Which image shows the road surface on which the vehicle 1 is currently passing can be determined based on position information associated with each image, for example.
  • an image is selected as follows.
  • the direction of the optical axis and the angle of view of the imaging unit 16 are known or can be acquired.
  • the image acquisition unit 421 arranges an object indicating the vehicle 1 on a virtual plan view, and further maps the imaging range of each stored image on the plan view.
  • the image acquisition unit 421 can specify the position to which the imaging range of each image is mapped based on the position information and orientation information associated with each image, and on the direction of the optical axis and the angle of view of the imaging unit 16.
  • the image acquisition unit 421 selects an image including an object indicating the vehicle 1 in the imaging range as an image showing a road surface on which the vehicle 1 is currently passing.
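  • A minimal sketch of this containment test in a 2D world frame; the footprint extents below are illustrative guesses rather than calibrated values:

        import math

        def footprint_contains(entry, current_xy,
                               near_m=0.5, far_m=3.5, half_width_m=1.5):
            # Approximate the stored frame's imaging range as a rectangle
            # ahead of the camera and test whether the current vehicle
            # position lies inside it.
            ex, ey = entry["position"]
            yaw = entry["orientation"]
            dx, dy = current_xy[0] - ex, current_xy[1] - ey
            forward = dx * math.cos(yaw) + dy * math.sin(yaw)
            lateral = -dx * math.sin(yaw) + dy * math.cos(yaw)
            return near_m <= forward <= far_m and abs(lateral) <= half_width_m

        def select_underfloor_image(ring_entries, current_xy):
            # Prefer the newest stored frame whose footprint covers the vehicle.
            for entry in reversed(ring_entries):
                if footprint_contains(entry, current_xy):
                    return entry["image"]
            return None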
  • the image acquisition unit 421 may select two or more images from the plurality of images stored in the ring buffer 404 and combine the selected images, where combining means joining a plurality of images seamlessly.
  • the timing at which two or more images before composition are taken may be different.
  • For example, the image acquisition unit 421 may combine part or all of the image captured by the imaging unit 16a at time t2 with part or all of the image captured by the imaging unit 16a at time t3 after time t2, to generate the underfloor image at the current time t4 after time t3.
  • the two or more images to be combined may also have been captured by different imaging units 16.
  • For example, the image acquisition unit 421 may combine part or all of the image captured by the imaging unit 16a at time t5 with part or all of the image captured by the imaging unit 16b at the same time t5, to generate the underfloor image at the current time t6 after time t5.
  • the image acquisition unit 421 may perform processing other than synthesis on the image before synthesis or the image after synthesis.
  • the processing includes, for example, clipping, masking, filtering on a part or the whole of the image, correction, or viewpoint conversion.
  • the correction is, for example, distortion correction or gamma correction.
  • the viewpoint conversion is, for example, generating a bird's-eye view image from an image captured by the imaging unit 16.
  • the image acquisition unit 421 may process the selected image even when only one image is selected as the underfloor image. The processing can be executed at an arbitrary timing from when the image is acquired until it is output to the display screen 8.
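  • As an example of the viewpoint conversion mentioned above, a road-plane homography can warp a camera image to a top-down view; the four point correspondences would come from camera calibration, which is assumed here:

        import cv2
        import numpy as np

        def birds_eye_view(image, src_pts, dst_pts, out_size):
            # src_pts: four road-plane points in the camera image (pixels);
            # dst_pts: where those points should land in the top-down view.
            H = cv2.getPerspectiveTransform(np.float32(src_pts),
                                            np.float32(dst_pts))
            return cv2.warpPerspective(image, H, out_size)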
  • the passing locus calculation unit 422 calculates the passing locus 105 of the front wheel 3F based on the traveling history accumulated in the ring buffer 404.
  • the passage locus calculating unit 422 calculates the passage locus 105 for each of the two front wheels 3F.
  • the calculation method of the passage locus 105 of the front wheel 3F is not limited to a specific method.
  • As an example, the passing trajectory calculation unit 422 reads from the ring buffer 404 a plurality of pieces of position information and orientation information stored during a recent predetermined period. Then, the passing trajectory calculation unit 422 calculates, for each piece of read position information, the corresponding position of the front wheel 3F.
  • the position where the front wheel 3F is attached in the vehicle 1 is known or can be acquired.
  • the position of the front wheel 3F is uniquely determined by the combination of the position information, the direction information of the vehicle 1, and the position where the front wheel 3F is attached in the vehicle 1.
  • the passage trajectory calculation unit 422 can acquire the position of the front wheel 3F in chronological order by calculating the position of the front wheel 3F in order from the oldest timing when the corresponding position information is stored. Note that the passing locus calculation unit 422 may calculate the position of the front wheel 3F in the reverse order of the time series. A column in which the positions of the front wheels 3F are arranged in time series order or in the reverse order of the time series corresponds to discrete data indicating the passage trajectory 105 of the front wheels 3F.
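  • A minimal sketch of deriving those discrete front-wheel positions from the stored poses; the coordinate conventions and offset parameters are assumptions:

        import math

        def front_wheel_positions(history, mount_forward_m, mount_lateral_m):
            # history: iterable of (x, y, yaw) vehicle poses, oldest first.
            # The known wheel mounting offset is rotated by each stored yaw
            # and added to the pose, giving discrete points of trajectory 105.
            points = []
            for x, y, yaw in history:
                c, s = math.cos(yaw), math.sin(yaw)
                points.append((x + mount_forward_m * c - mount_lateral_m * s,
                               y + mount_forward_m * s + mount_lateral_m * c))
            return points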
  • the predicted trajectory calculation unit 423 calculates the predicted trajectories 101 and 106 of the wheels 3 based on the steering information acquired by the acquisition unit 401. In particular, in the present embodiment, the predicted trajectory calculation unit 423 calculates the predicted trajectory 106 of the rear wheel 3R based on the steering information.
  • the calculation method of the predicted trajectory 106 of the rear wheel 3R is not limited to a specific method.
  • FIG. 9 is a diagram for explaining an example of a calculation method of the predicted trajectory 106 of the rear wheel 3R.
  • When the driver changes the tire angle of the front wheel 3F by operating the steering unit 4, the vehicle 1 turns about a turning center G1 located at the intersection of the line orthogonal to the direction 3Fa of the front wheel 3F and the extension of the rear wheel axle 38 that supports the rear wheel 3R.
  • the rear wheel 3R also moves along an arc centered on the turning center G1.
  • the predicted trajectory calculation unit 423 calculates, as the predicted trajectory 106 of the rear wheel 3R, the segment of the arc drawn by the rear wheel 3R about the turning center G1 that the rear wheel 3R will follow from the present until a predetermined short time has elapsed.
  • As an example, the predicted trajectory calculation unit 423 calculates the tire angle of the front wheel 3F from the steering information, and calculates the predicted trajectory 106 of the rear wheel 3R based on the calculated tire angle and the relationship shown in FIG. 9.
  • Alternatively, the acquisition unit 401 may acquire the tire angle of the front wheel 3F itself as the steering information, and the predicted trajectory calculation unit 423 may use the acquired tire angle for the calculation of the predicted trajectory 106 of the rear wheel 3R.
  • When the rear wheel 3R is also steered, the predicted trajectory calculation unit 423 may calculate the predicted trajectory 106 of the rear wheel 3R in consideration of the tire angle of the rear wheel 3R.
  • the predicted trajectory 106 of the rear wheel 3R can be obtained by calculating an arc in which the rear wheel 3R moves when turning is performed during normal traveling.
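  • A minimal sketch of this arc construction in vehicle coordinates (x forward, y left, rear axle at the origin), assuming front-wheel steering only; the arc length and sampling density are illustrative:

        import math

        def rear_predicted_arc(tire_angle_rad, wheelbase_m,
                               arc_length_m=2.0, steps=20):
            # The turning center G1 sits on the rear-axle line at signed
            # distance R = wheelbase / tan(tire_angle); the rear axle then
            # traces an arc of radius R about it.
            if abs(tire_angle_rad) < 1e-6:
                return [(i * arc_length_m / steps, 0.0)
                        for i in range(steps + 1)]
            R = wheelbase_m / math.tan(tire_angle_rad)
            points = []
            for i in range(steps + 1):
                phi = (i * arc_length_m / steps) / R
                points.append((R * math.sin(phi), R * (1.0 - math.cos(phi))))
            return points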
  • the predicted trajectory calculation unit 423 calculates the predicted trajectory 101 of the front wheel 3F in the same procedure as the predicted trajectory 106 of the rear wheel 3R.
  • the predicted trajectory calculation unit 423 calculates, as the predicted trajectory 101 of the front wheel 3F, the segment of the arc drawn by the front wheel 3F about the turning center G1 that the front wheel 3F will follow from the present until a predetermined short time has elapsed.
  • the output unit 424 displays the underfloor image, the contour lines 102 to 104, the passing trajectory 105 of the front wheel 3F, and the predicted trajectories 101 and 106 on the display screen 8.
  • FIG. 10 is a flowchart illustrating a procedure of image storage processing in the periphery monitoring device according to the embodiment.
  • the process of FIG. 10 is executed for each control cycle.
  • the control cycle is sufficiently shorter than the interval at which the determination in S106 becomes Yes.
  • the imaging unit 16 images the surrounding environment of the vehicle 1 (S101).
  • the imaging unit 16a images an area including the road surface in the traveling direction of the vehicle 1.
  • the acquisition unit 401 acquires an image from the imaging unit 16 and acceleration data from the acceleration sensor 26 (S102). Then, the angle calculation unit 402 calculates the roll angle and pitch angle of the vehicle 1 from the acceleration data (S103). Then, the correction unit 411 performs rotation correction corresponding to the roll angle on the captured image (S104).
  • the estimation unit 412 calculates the movement amount of the vehicle 1 based on the acceleration data, the steering angle information, and the wheel speed information, and estimates the current position and orientation of the vehicle 1 (S105).
  • the storage unit 413 determines whether or not the position of the vehicle 1 has changed by a predetermined distance (for example, 0.1 m) or more from the position at which an image was last saved (S106).
  • When the storage unit 413 determines that the position of the vehicle 1 has not changed by the predetermined distance or more (S106: No), no image is saved in this control cycle.
  • Note that the storage method is not limited to this; for example, an image may be saved every time the steering angle changes by a predetermined angle or more.
  • When the storage unit 413 determines that the position of the vehicle 1 has changed by the predetermined distance or more from the position at which an image was last saved (S106: Yes), the storage unit 413 overwrites and saves the corrected current image in the least recently updated area of the ring buffer 404 (S107). At that time, the storage unit 413 associates the position information and orientation information at the time of imaging with the saved image.
  • the storage unit 413 updates the position information and orientation information at the time of imaging of each image stored in the ring buffer 404 to position information and orientation information based on the current position and orientation (S108).
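  • One plausible reading of this update (an assumption, since the disclosure does not give the math) is to re-express each stored world pose relative to the current vehicle pose:

        import math

        def rebase_pose(stored_pose, current_pose):
            # Both poses are (x, y, yaw) in the same world frame; the result
            # is the stored pose expressed in the current vehicle's frame.
            sx, sy, syaw = stored_pose
            cx, cy, cyaw = current_pose
            dx, dy = sx - cx, sy - cy
            rel_x = dx * math.cos(cyaw) + dy * math.sin(cyaw)
            rel_y = -dx * math.sin(cyaw) + dy * math.cos(cyaw)
            return rel_x, rel_y, syaw - cyaw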
  • FIG. 11 is a flowchart illustrating a procedure of processing for controlling display in the periphery monitoring device according to the embodiment.
  • the reception unit 405 determines whether or not an operation for starting the underfloor display mode has been received (S201). In one example, when the occupant touches the switching button 85, the operation input unit 10 detects the touch and notifies the ECU 24. When this notification is received while the display mode is the normal display mode, the reception unit 405 determines that an operation for starting the underfloor display mode has been received.
  • The condition for starting the underfloor display mode is not limited to this.
  • If the reception unit 405 does not determine that an operation for starting the underfloor display mode has been received (S201: No), the process proceeds to S212 described below.
  • If the reception unit 405 determines that an operation for starting the underfloor display mode has been received (S201: Yes), the image acquisition unit 421 acquires an image showing the road surface currently being passed over from the ring buffer 404 (S202).
  • the passage locus calculation unit 422 calculates the passage locus 105 of the front wheel 3F based on the past position information and orientation information stored in the ring buffer 404 (S203). Further, the predicted trajectory calculation unit 423 acquires the steering angle information via the acquisition unit 401 (S204). Then, the predicted trajectory calculation unit 423 calculates the predicted trajectory 101 of the front wheel 3F and the predicted trajectory 106 of the rear wheel 3R based on the acquired steering angle information (S205).
  • The passing trajectory 105 and the predicted trajectories 101 and 106 can be expressed using an arbitrary coordinate system.
  • the passage trajectory 105 and the predicted trajectories 101 and 106 are expressed in a coordinate system based on the current position and orientation of the vehicle 1.
  • the output unit 424 superimposes the contour line 102 of the vehicle 1 on the image acquired by the image acquisition unit 421 (S206). At this time, the output unit 424 aligns the contour line 102 with the image so that the contour line 102 corresponds, within the image, to the position over which the vehicle 1 is currently passing. Alignment here means adjusting the position and orientation of the contour line 102 so that the correspondence between the image and the contour line 102 matches the correspondence between the current vehicle 1 and the road surface currently under its floor.
  • The alignment method is not limited to a specific method.
  • As an example, the output unit 424 specifies, within the region shown in the acquired image, the position where the vehicle 1 currently exists and the current orientation of the vehicle 1, based on the current position and orientation of the vehicle 1, the position information and orientation information associated with the image, and the direction of the optical axis and the angle of view of the imaging unit 16.
  • the direction of the optical axis and the angle of view of the imaging unit 16 are known or can be acquired.
  • the output unit 424 may adjust the position and orientation of the contour line 102 of the vehicle 1 with the image as the reference, or may adjust the position and orientation of the image with the contour line 102 of the vehicle 1 as the reference.
  • the output unit 424 may adjust the image so that the contour line 102 of the vehicle 1 is located at the center of the display area 81 and is oriented to face the front of the display area 81.
  • the output unit 424 may perform processing such as viewpoint conversion on the image or the outline 102 of the vehicle 1 as necessary.
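  • A minimal sketch of one such overlay, assuming the underfloor image has already been converted to a top-down view with the vehicle origin at a known pixel and forward pointing up (all parameter names are illustrative):

        import cv2
        import numpy as np

        def draw_vehicle_contour(top_down_img, outline_m, m_per_px, anchor_px):
            # outline_m: contour 102 vertices in vehicle coordinates
            # (x forward, y left, meters); anchor_px: pixel of the vehicle
            # origin in the top-down image.
            pts = [(int(round(anchor_px[0] - y / m_per_px)),  # left -> image left
                    int(round(anchor_px[1] - x / m_per_px)))  # forward -> up
                   for x, y in outline_m]
            cv2.polylines(top_down_img, [np.array(pts, dtype=np.int32)],
                          isClosed=True, color=(0, 255, 255), thickness=2)
            return top_down_img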
  • the output unit 424 superimposes the contour 103 of the front wheel 3F and the contour 104 of the rear wheel 3R on the image (S207).
  • the output unit 424 performs alignment in S207 as in S206. Since the positions at which the front wheels 3F and the rear wheels 3R are attached in the vehicle 1 are known or can be acquired, the output unit 424 determines the positions at which the contour line 103 of the front wheel 3F and the contour line 104 of the rear wheel 3R are superimposed based on, for example, the position of the contour line 102 of the vehicle 1 and the mounting positions of the front wheels 3F and rear wheels 3R in the vehicle 1.
  • the output unit 424 may rotate and superimpose the contour lines 103 and 104 of each wheel 3 by an angle corresponding to the steering angle information.
  • the contour line 103 of the front wheel 3F is rotated by an angle corresponding to the steering angle information and displayed.
  • When the rear wheel 3R is also steered, the output unit 424 may rotate and superimpose the contour line 104 of the rear wheel 3R as well.
  • The contour line 102 of the vehicle 1, the contour line 103 of the front wheel 3F, and the contour line 104 of the rear wheel 3R may be grouped in advance, and the grouped contour lines 102 to 104 may be superimposed together.
  • the output unit 424 superimposes the passing trajectory 105 of the front wheel 3F on the image (S208), and superimposes the predicted trajectory 101 of the front wheel 3F and the predicted trajectory 106 of the rear wheel 3R on the image (S209).
  • the output unit 424 aligns the image and each locus using the same method as in S207.
  • the output unit 424 outputs the image on which the display objects have been superimposed to the display screen 8 (S210).
  • the reception unit 405 determines whether or not an operation for ending the underfloor display mode has been received (S211).
  • In one example, when the occupant touches the switching button 85 in the underfloor display mode, the operation input unit 10 detects the touch and notifies the ECU 24.
  • When this notification is received, the reception unit 405 determines that an operation for ending the underfloor display mode has been received.
  • the processing from S202 to S211 loops at a predetermined short cycle until the underfloor display mode ends. Since the underfloor image is updated by the processing of S210 in each loop, the occupant can visually recognize the current road surface under the floor of the vehicle 1 as an image on the display screen 8 while the vehicle 1 is traveling. The contour line 104 of the rear wheel 3R is also updated in each loop. In addition, position information is accumulated as a travel history during travel, and the passing trajectory 105 of the front wheel 3F is updated in each loop according to the accumulated travel history. Further, the steering angle information is acquired in each loop, and the predicted trajectory 106 of the rear wheel 3R is updated according to the acquired steering angle information.
  • Thus, the occupant can confirm, through the display screen 8 and in substantially real time, the passing trajectory of the front wheel 3F, the position of the rear wheel 3R, and the predicted trajectory of the rear wheel 3R, which change as the vehicle 1 travels.
  • the display processing unit 406 displays the display content of the normal display mode on the display screen 8 (S212). Then, the process returns to S201.
  • the traveling history for calculating the passing trajectory 105 of the front wheel 3F has been described as including position information and direction information.
  • the travel history is not limited to this.
  • For example, the travel history may include only position information.
  • In this case, the passing trajectory calculation unit 422 may calculate, as the direction of the vehicle 1 at each timing, the vector connecting two pieces of position information stored at consecutive timings.
  • the traveling history may include the position of the front wheel 3F.
  • In this case, the estimation unit 412 estimates the position of the front wheel 3F based on the position of the vehicle 1 and the mounting position of the front wheel 3F in the vehicle 1, and the storage unit 413 saves the estimated position of the front wheel 3F in the ring buffer 404.
  • the passing trajectory calculation unit 422 then obtains discrete data indicating the passing trajectory 105 of the front wheel 3F by reading out from the ring buffer 404 the front wheel positions stored during a recent predetermined period.
  • the above technique can also be applied when the vehicle 1 moves backward.
  • When the vehicle 1 moves backward, the traveling direction of the vehicle 1 is its rearward direction, so the image acquisition unit 421 can acquire the underfloor image from images captured by the imaging unit 16c.
  • the periphery monitoring device may display the passing trajectory of the rear wheel 3R and the display object indicating the position of the front wheel 3F on the display screen 8 so as to be superimposed on the image. Further, when the vehicle 1 moves backward, the periphery monitoring device may superimpose the predicted trajectory of the front wheel 3F on the image and display it on the display screen 8.
  • the ring buffer 404 stores the travel history of the vehicle 1.
  • the image acquisition unit 421 acquires an image indicating a road surface under the floor of the vehicle 1.
  • the passage locus calculation unit 422 calculates the passage locus 105 of the front wheel 3F based on the travel history.
  • the output unit 424 superimposes and displays the trajectory 105 of the front wheel 3F, the contour 104 of the rear wheel 3R indicating the position of the rear wheel 3R, and an image showing the road surface under the floor of the vehicle 1 on the display screen 8.
  • the driver can perform a driving operation such that the rear wheel 3R passes on the trajectory through which the front wheel 3F has passed. Therefore, the convenience of the periphery monitoring device is high.
  • the predicted trajectory calculation unit 423 calculates the predicted trajectory 106 of the rear wheel 3R based on the steering information. Then, the output unit 424 further superimposes and displays the predicted trajectory 106 of the rear wheel 3R on the display screen 8. As a result, the driver can more easily perform driving such that the rear wheel 3R is allowed to pass on the locus through which the front wheel 3F has passed.
  • the predicted trajectory calculation unit 423 calculates the predicted trajectory 101 of the front wheel 3F based on the steering information. Then, the output unit 424 further superimposes and displays the predicted trajectory 101 of the front wheel 3F on the display screen 8. Since the driver can confirm on the display screen 8 the position to which the front wheel 3F will advance, the driver can drive the vehicle 1 more safely.
  • the ring buffer 404 stores an image captured at the first timing by the imaging unit 16 that images the road surface in the traveling direction of the vehicle 1.
  • the image acquisition unit 421 acquires the image captured at the first timing and stored in the ring buffer 404 as the image showing the road surface under the floor of the vehicle 1, at the second timing at which the vehicle 1 is passing over the road surface shown in that image.
  • the second timing is a timing after the first timing.
  • DESCRIPTION OF SYMBOLS: 1... vehicle; 2... vehicle body; 2a... passenger compartment; 2b... seat; 2c, 2d, 2e, 2f... end portions; 2g... door mirror; 2h... door; 3... wheel; 3F... front wheel; 3R... rear wheel; 4... steering unit; 5... acceleration operation unit; 6... braking operation unit; 7... shift operation unit; 8... display screen; 9... audio output device; 10... operation input unit; 11... monitor device; 13... steering system; 13a... actuator; 13b... torque sensor; 16, 16a, 16b, 16c, 16d... imaging unit; 18... brake system; 18a... actuator; 18b... brake sensor; 19... steering angle sensor; 20... accelerator sensor; 21... shift sensor; 22... wheel speed sensor; 23... in-vehicle network; 24... ECU; 24a... CPU; 24b... ROM; 24c... RAM; 24d... display control unit; 24e... voice control unit; 24f... SSD; 26, 26a, 26b... acceleration sensor; 38... rear wheel axle; 38a... center point; 38b... arc; 81, 82, 83, 84... display area; 85... switching button; 100... periphery monitoring system; 101, 106... predicted trajectory; 102, 103, 104... contour line; 105... passing trajectory; 401... acquisition unit; 402... angle calculation unit; 403... storage processing unit; 404... ring buffer; 405... reception unit; 406... display processing unit; 411... correction unit; 412... estimation unit; 413... storage unit; 421... image acquisition unit; 422... passing trajectory calculation unit; 423... predicted trajectory calculation unit; 424... output unit; 501... road surface; 502, 503... imaging area.

Abstract

An embodiment of the invention relates to a periphery monitoring device comprising: a storage unit that stores a travel history of a vehicle; an image acquisition unit that acquires a first image showing the road surface under the floor of the vehicle; a first calculation unit that calculates, based on the travel history, a first trajectory through which the front wheels of the vehicle have passed; and an output unit that superimposes and displays, on a display screen provided inside the passenger compartment of the vehicle, the first image, the first trajectory, and display objects indicating the positions of the rear wheels of the vehicle.
PCT/JP2017/010537 2016-08-05 2017-03-15 Dispositif de surveillance de périphérie WO2018025441A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-154549 2016-08-05
JP2016154549A JP2018020724A (ja) 2016-08-05 2016-08-05 周辺監視装置

Publications (1)

Publication Number Publication Date
WO2018025441A1 2018-02-08

Family

ID=61073175

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/010537 WO2018025441A1 (fr) 2016-08-05 2017-03-15 Dispositif de surveillance de périphérie

Country Status (2)

Country Link
JP (1) JP2018020724A (fr)
WO (1) WO2018025441A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112309157A (zh) * 2019-07-23 2021-02-02 丰田自动车株式会社 图像显示装置

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7319593B2 (ja) * 2020-02-13 2023-08-02 トヨタ自動車株式会社 車両周辺監視装置

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014067299A (ja) * 2012-09-26 2014-04-17 Aisin Seiki Co Ltd 車両運転支援装置
JP2016021653A (ja) * 2014-07-14 2016-02-04 アイシン精機株式会社 周辺監視装置、及びプログラム
JP2016101872A (ja) * 2014-11-28 2016-06-02 アイシン精機株式会社 車両周辺監視装置

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4696691B2 (ja) * 2005-05-27 2011-06-08 アイシン・エィ・ダブリュ株式会社 駐車支援方法及び駐車支援装置

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014067299A (ja) * 2012-09-26 2014-04-17 Aisin Seiki Co Ltd 車両運転支援装置
JP2016021653A (ja) * 2014-07-14 2016-02-04 アイシン精機株式会社 周辺監視装置、及びプログラム
JP2016101872A (ja) * 2014-11-28 2016-06-02 アイシン精機株式会社 車両周辺監視装置

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112309157A (zh) * 2019-07-23 2021-02-02 丰田自动车株式会社 图像显示装置

Also Published As

Publication number Publication date
JP2018020724A (ja) 2018-02-08

Similar Documents

Publication Publication Date Title
US9902323B2 (en) Periphery surveillance apparatus and program
JP6060910B2 (ja) 周辺監視装置、及びプログラム
US9216765B2 (en) Parking assist apparatus, parking assist method and program thereof
JP6015314B2 (ja) 駐車目標位置を算出する装置、駐車目標位置を算出する方法およびプログラム
US10077045B2 (en) Parking assist system, method, and non-transitory computer readable medium storing program
JP6380002B2 (ja) 周辺監視装置
US9505436B2 (en) Parking assist system
JP6447060B2 (ja) 車両周辺監視装置
US9902427B2 (en) Parking assistance device, parking assistance method, and non-transitory computer readable medium storing program
US9919735B2 (en) Control system and control method for vehicle
JP6642306B2 (ja) 周辺監視装置
WO2018150642A1 (fr) Dispositif de surveillance d'environnement
CN107791951B (zh) 显示控制装置
US10353396B2 (en) Vehicle periphery monitoring device
JP6629156B2 (ja) 駐車支援装置
WO2018198531A1 (fr) Dispositif d'aide au stationnement
CN109314770B (zh) 周边监控装置
JP2018144567A (ja) 運転支援装置
WO2018025441A1 (fr) Dispositif de surveillance de périphérie
US20200140011A1 (en) Parking assistance apparatus
JP2019138655A (ja) 走行支援装置
JP2017211814A (ja) 駐車支援装置
JP2019018616A (ja) 駐車支援用表示制御装置、駐車支援システム、方法及びプログラム
JP2018006943A (ja) 車両周辺監視装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17836548

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17836548

Country of ref document: EP

Kind code of ref document: A1