WO2019026348A1 - Display control device, display control system, display control method, and program - Google Patents

Display control device, display control system, display control method, and program Download PDF

Info

Publication number
WO2019026348A1
WO2019026348A1 PCT/JP2018/014137 JP2018014137W WO2019026348A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
information
time
storage
display control
Prior art date
Application number
PCT/JP2018/014137
Other languages
English (en)
Japanese (ja)
Inventor
清水 義之
一郎 石田
拓之 照内
林 直人
昇 勝俣
Original Assignee
株式会社Jvcケンウッド
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2017148385A external-priority patent/JP6763357B2/ja
Priority claimed from JP2017148469A external-priority patent/JP6763358B2/ja
Priority claimed from JP2017148336A external-priority patent/JP6787272B2/ja
Priority claimed from JP2017148384A external-priority patent/JP6763356B2/ja
Application filed by 株式会社Jvcケンウッド filed Critical 株式会社Jvcケンウッド
Priority to CN201880004577.6A priority Critical patent/CN109997356B/zh
Priority to EP18840930.4A priority patent/EP3550830B1/fr
Publication of WO2019026348A1 publication Critical patent/WO2019026348A1/fr
Priority to US16/424,691 priority patent/US11117520B2/en

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/002Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles specially adapted for covering the peripheral part of the vehicle, e.g. for viewing tyres, bumpers or the like
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/27Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/04Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/59Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/173Reversing assist
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/16Type of output information
    • B60K2360/176Camera images
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/28Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/80Arrangements for controlling instruments
    • B60K35/81Arrangements for controlling instruments for controlling displays
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/105Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/303Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/60Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
    • B60R2300/607Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/802Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying vehicle exterior blind spot views
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/806Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/8093Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation
    • G06T2207/30192Weather; Meteorology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30261Obstacle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30264Parking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/08Detecting or categorising vehicles

Definitions

  • the present invention relates to a display control device, a display control system, a display control method, and a program.
  • the overhead view image of the vehicle is displayed mainly to assist parking when the vehicle backs into a parking space.
  • the driver visually confirms the surrounding situation, including ranges that are blind spots of the mirrors, by means of the overhead view video.
  • when exiting the parking space, the driver checks the surroundings visually or by mirror.
  • since the vehicle moves forward when exiting, the operation is easier than when moving backward.
  • however, when the surrounding confirmation conditions change, for example a change in the surrounding situation, a change of driver, or a change in visibility, it may be desirable to confirm the surroundings with the overhead view image.
  • if the overhead view video is displayed whenever the vehicle moves forward out of the parking space, regardless of the surrounding conditions, it will be displayed even in situations where there is no need for it.
  • the present invention has been made in view of the above, and its object is to display the bird's-eye view image appropriately according to a change in the surrounding confirmation conditions at the time of exit from the parking space.
  • a display control device according to the present invention includes: a video data acquisition unit that acquires video data from a plurality of imaging units that capture the periphery of a vehicle; an overhead view video generation unit that performs viewpoint conversion processing and synthesis processing on the video data acquired by the video data acquisition unit to generate an overhead view video; an information acquisition unit that acquires and stores first information on the surrounding confirmation conditions of the vehicle when the vehicle backs into a parking space, and acquires second information on the surrounding confirmation conditions when the vehicle moves forward out of the parked state to exit; a comparison unit that compares the first information and the second information to determine whether the surrounding confirmation conditions have changed between entry and exit of the vehicle; and a display control unit that, when it is determined that the surrounding confirmation conditions have changed at the time of exit, causes a display unit to display the overhead view video generated by the overhead view video generation unit.
  • a display control system according to the present invention includes the display control device described above and at least one of: the plurality of imaging units from which the video data acquisition unit acquires video data; an obstacle detection unit from which the adjacent information acquisition unit acquires the first obstacle information and the second obstacle information; and a display unit on which the display control unit displays the overhead view video.
  • a display control method according to the present invention includes: a video data acquisition step of acquiring video data from a plurality of imaging units that capture the surroundings of a vehicle; an overhead view video generation step of performing viewpoint conversion processing and synthesis processing on the video data acquired in the video data acquisition step to generate an overhead view video; an adjacent information acquisition step of acquiring first information on the surrounding confirmation conditions of the vehicle when the vehicle backs into a parking space, and second information on the surrounding confirmation conditions when the vehicle moves forward out of the parked state to exit; a comparison step of comparing the first information and the second information to determine whether the surrounding confirmation conditions have changed between entry and exit of the vehicle; and a display control step of, when it is determined that the surrounding confirmation conditions have changed at the time of exit, displaying the overhead view video generated in the overhead view video generation step on a display unit.
  • a program according to the present invention causes a computer operating as a display control device to execute: a video data acquisition step of acquiring video data from a plurality of imaging units that capture the surroundings of a vehicle; an overhead view video generation step of performing viewpoint conversion processing and synthesis processing on the video data acquired in the video data acquisition step to generate an overhead view video; an adjacent information acquisition step of acquiring first information on the surrounding confirmation conditions of the vehicle when the vehicle backs into a parking space, and second information on the surrounding confirmation conditions when the vehicle moves forward out of the parked state to exit; a comparison step of comparing the first information and the second information to determine whether the surrounding confirmation conditions have changed between entry and exit of the vehicle; and, when it is determined in the comparison step that the surrounding confirmation conditions have changed at the time of exit, a display control step of displaying the overhead view video generated in the overhead view video generation step on a display unit.
  • ADVANTAGE OF THE INVENTION: according to the present invention, the bird's-eye view image can be displayed appropriately according to a change in the surrounding confirmation conditions at the time of exit from the parking space.
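The comparison logic of the claims can be sketched informally as follows. The `SurroundingConditions` record and its fields are illustrative assumptions; the patent leaves the concrete representation of the first and second information open.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SurroundingConditions:
    """Hypothetical surrounding-confirmation conditions recorded for one event."""
    adjacent_left: bool    # adjacent vehicle in the section on the left
    adjacent_right: bool   # adjacent vehicle in the section on the right
    adjacent_front: bool   # adjacent vehicle in the facing section
    driver_id: str         # driver at the time of the event

def should_display_overhead_view(first: SurroundingConditions,
                                 second: SurroundingConditions) -> bool:
    """Show the overhead view on exit only if the conditions changed
    between entry (first information) and exit (second information)."""
    return first != second

# first information: recorded when the vehicle backs into the parking section
at_entry = SurroundingConditions(False, True, False, "driver_a")
# second information: acquired when the vehicle moves forward to exit
at_exit = SurroundingConditions(True, True, False, "driver_a")

print(should_display_overhead_view(at_entry, at_exit))  # an adjacent vehicle appeared on the left
```

In this sketch an unchanged record suppresses the display, matching the stated object of not showing the overhead view when there is no need for it.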
  • FIG. 1 is a block diagram showing a configuration example of a display control system according to the first embodiment.
  • FIG. 2 is a diagram for explaining a parking lot, and shows the state at the time of entry into the parking space.
  • FIG. 3 is a diagram for explaining the parking section, and shows the state at the time of exit.
  • FIG. 4 is a view showing an example of a bird's-eye view image generated by the display control device of the display control system according to the first embodiment.
  • FIG. 5 is a flowchart showing an example of the flow of processing in the display control device of the display control system according to the first embodiment.
  • FIG. 6 is a flowchart showing another example of the flow of processing in the display control device of the display control system according to the first embodiment.
  • FIG. 7 is a view showing an example of a bird's-eye view image generated by the display control device of the display control system according to the second embodiment.
  • FIG. 8 is a view showing an example of a bird's-eye view image generated by the display control device of the display control system according to the third embodiment.
  • FIG. 9 is a view showing an example of a bird's-eye view image generated by the display control device of the display control system according to the fourth embodiment.
  • FIG. 10 is a diagram for explaining a parking area of a home parking lot.
  • FIG. 11 is a diagram for explaining a parking section for parallel parking, and shows the state at the time of entry.
  • FIG. 12 is a diagram for explaining a parking section for parallel parking, and shows the state at the time of exit.
  • FIG. 13 is a diagram for explaining a parking lot, and shows the state at the time of entry.
  • FIG. 14 is a diagram for explaining the parking section, and shows the state at the time of exit.
  • FIG. 15 is a flowchart illustrating another example of the flow of processing in the display control device of the display control system according to the fifth embodiment.
  • FIG. 16 is a view showing an example of a bird's-eye view image generated by the display control apparatus of the display control system according to the eighth embodiment.
  • FIG. 17 is a diagram showing a parking lot for explaining processing in the display control device of the display control system according to the ninth embodiment.
  • FIG. 18 is a flowchart showing an example of the flow of processing in the display control device of the display control system according to the ninth embodiment.
  • FIG. 19 is a diagram for explaining a parking section of a home parking lot.
  • FIG. 20 is a diagram for explaining a parking section for parallel parking.
  • FIG. 21 is a block diagram showing a configuration example of a display control system according to the tenth embodiment.
  • FIG. 22 is a view showing an example of a bird's-eye view image generated by the display control device of the display control system according to the tenth embodiment.
  • FIG. 23 is a flowchart showing an example of the flow of processing in the display control device of the display control system according to the tenth embodiment.
  • FIG. 24 is a flowchart illustrating another example of the flow of processing in the display control device of the display control system according to the tenth embodiment.
  • FIG. 25 is a flowchart illustrating an example of the flow of processing in the display control device of the display control system according to the eleventh embodiment.
  • FIG. 26 is a block diagram showing a configuration example of a display control system according to the twelfth embodiment.
  • FIG. 27 is a flowchart showing an example of the flow of processing in the display control device of the display control system according to the twelfth embodiment.
  • FIG. 28 is a flowchart illustrating another example of the flow of processing in the display control device of the display control system according to the twelfth embodiment.
  • FIG. 29 is a flowchart showing an example of the flow of processing in the display control device of the display control system according to the thirteenth embodiment.
  • FIG. 30 is a flow chart showing an example of the flow of processing in the display control apparatus of the display control system according to the fourteenth embodiment.
  • FIG. 31 is a flowchart showing an example of the flow of processing in the display control device of the display control system according to the fifteenth embodiment.
  • FIG. 32 is a block diagram showing a configuration example of a display control system according to the sixteenth embodiment.
  • FIG. 33 is a view showing an example of a bird's-eye view image generated by the display control apparatus of the display control system according to the sixteenth embodiment.
  • FIG. 34 is a view showing an example of a bird's-eye view image generated by the display control apparatus of the display control system according to the seventeenth embodiment.
  • FIG. 35 is a view showing an example of a bird's-eye view image generated by the display control apparatus of the display control system according to the eighteenth embodiment.
  • FIG. 1 is a block diagram showing a configuration example of a display control system according to the first embodiment.
  • FIG. 2 is a diagram for explaining a parking lot, and shows the state at the time of entry into the parking space.
  • FIG. 3 is a diagram for explaining the parking section, and shows the state at the time of exit.
  • the display control system 1 appropriately displays a bird's-eye view image according to a change in the surrounding conditions (surrounding confirmation conditions) at the time of exit from the parking space.
  • the display control system 1 is mounted on a vehicle V1.
  • the display control system 1 may be a portable device that can be used in the vehicle V1, in addition to the one placed on the vehicle V1.
  • the vehicle V1 is parked in the parking section P1.
  • the parking section PA, the parking section PB, and the parking section PC are arranged side by side facing the parking section P1.
  • the parking section PB faces the front of the parking section P1 across the aisle.
  • the parking section PA is on the left of the parking section PB as viewed from the vehicle V1.
  • the parking section PC is on the right of the parking section PB as viewed from the vehicle V1.
  • the parking section PD is on the left side of the parking section P1.
  • the parking section PE is on the right side of the parking section P1.
  • the display control system 1 will be described with reference to FIG.
  • the display control system 1 includes a front camera (imaging unit) 11, a rear camera (imaging unit) 12, a left side camera (imaging unit) 13, a right side camera (imaging unit) 14, a sensor unit (obstacle detection unit) 21, a display panel (display unit) 31, and a display control device 40.
  • the front camera 11 is a camera for the overhead view video.
  • the front camera 11 is disposed at the front of the vehicle V1 and captures the area around the front of the vehicle V1.
  • the front camera 11 captures an imaging range of, for example, about 180°.
  • the imaging range includes a range in front of the vehicle V1 wider than the display range of the overhead view video 100.
  • the front camera 11 outputs the captured video to the video data acquisition unit 41 of the display control device 40.
  • the rear camera 12 is a camera for the overhead view video.
  • the rear camera 12 is disposed at the rear of the vehicle V1 and captures the area around the rear of the vehicle V1.
  • the rear camera 12 captures an imaging range of, for example, about 180°.
  • the imaging range includes a range behind the vehicle V1 wider than the display range of the overhead view video 100.
  • the rear camera 12 outputs the captured video to the video data acquisition unit 41 of the display control device 40.
  • the left side camera 13 is a camera for the overhead view video.
  • the left side camera 13 is disposed on the left side of the vehicle V1 and captures the area around the left side of the vehicle V1.
  • the left side camera 13 captures an imaging range of, for example, about 180°.
  • the imaging range includes a range on the left side of the vehicle V1 wider than the display range of the overhead view video 100.
  • the left side camera 13 outputs the captured video to the video data acquisition unit 41 of the display control device 40.
  • the right side camera 14 is a camera for the overhead view video.
  • the right side camera 14 is disposed on the right side of the vehicle V1 and captures the area around the right side of the vehicle V1.
  • the right side camera 14 captures an imaging range of, for example, about 180°.
  • the imaging range includes a range on the right side of the vehicle V1 wider than the display range of the overhead view video 100.
  • the right side camera 14 outputs the captured video to the video data acquisition unit 41 of the display control device 40.
  • the front camera 11, the rear camera 12, the left side camera 13, and the right side camera 14 capture images in all directions around the vehicle V1.
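The synthesis of the four camera videos into one overhead view can be pictured with a symbolic sketch. The grid size, the centred own-vehicle icon, and the nearest-side seam rule below are illustrative assumptions; the patent only specifies viewpoint conversion processing and synthesis processing in general terms.

```python
def composite_overhead(width=9, height=9, car_w=3, car_h=3):
    """Symbolic synthesis of an overhead view: each cell is filled from the
    viewpoint-converted video of the camera on its nearest side (F = front,
    B = rear, L = left, R = right), and the own-vehicle icon V is overlaid
    in the centre of the view."""
    view = []
    for y in range(height):
        row = ""
        for x in range(width):
            # cells covered by the own-vehicle icon
            if (abs(2 * x - (width - 1)) < car_w and
                    abs(2 * y - (height - 1)) < car_h):
                row += "V"
            else:
                # seam placement: the camera on the nearest side of the
                # vehicle supplies this cell of the composite
                dist = {"F": y, "B": height - 1 - y,
                        "L": x, "R": width - 1 - x}
                row += min(dist, key=dist.get)
        view.append(row)
    return view

for row in composite_overhead():
    print(row)
```

The printout shows the front camera filling the top band, the rear camera the bottom band, and the side cameras the flanks, with the vehicle icon in the middle, roughly the layout of the overhead view video 100 described above.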
  • the sensor unit 21 includes a plurality of sensors installed around the vehicle V1.
  • the sensor unit 21 can detect an obstacle present in the vicinity of the vehicle V1.
  • the sensor unit 21 detects, as obstacles present in the vicinity of the vehicle V1, adjacent vehicles present in sections such as the parking section PA or the parking section PE adjacent to the parking section P1 in which the vehicle V1 is parked.
  • the sensor unit 21 includes a front center sensor, a front left sensor, a front right sensor, a rear center sensor, a rear left sensor, a rear right sensor, a left direction sensor, and a right direction sensor. Since the sensors are configured similarly, only the front center sensor will be described, and description of the other sensors is omitted.
  • the front center sensor is disposed at the front center of the vehicle V1 and detects an obstacle at the front center of the vehicle V1.
  • the front center sensor detects an adjacent vehicle present in the parking section PB.
  • the front center sensor is, for example, an infrared sensor, an ultrasonic sensor, a millimeter-wave radar, or the like, and may be configured by a combination of these.
  • the front center sensor detects, for example, an adjacent vehicle at a distance of about 5 m from the vehicle V1.
  • the front center sensor detects, for example, an adjacent vehicle in a range of about 40° centered on the center of the sensor in the vertical direction.
  • the detection range of the front center sensor may overlap with part of the detection range of the front left sensor and the front right sensor.
  • the front center sensor outputs obstacle information indicating the presence or absence of an adjacent vehicle, which is a detection result, to the adjacent information acquisition unit (information acquisition unit) 43 of the display control device 40.
  • Such a sensor unit 21 can detect adjacent vehicles in all directions of the vehicle V1.
  • based on the detection result of the sensor unit 21, at least one of the adjacent vehicles on the left and right sides, at the left and right front, and in front of the vehicle V1 can be detected. More specifically, from the detection result of the sensor unit 21, an adjacent vehicle present in the parking sections PA to PE can be detected. More specifically, the parking section in which the adjacent vehicle is detected is identified from the sensor that detected the adjacent vehicle and the horizontal presence range of the adjacent vehicle detected by the sensor, both of which are included in the detection result.
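  • The identification of the parking section from the detecting sensor described above can be sketched roughly as follows. This is an illustrative Python sketch only; the sensor names and the sensor-to-section assignment are assumptions for this example, not part of the disclosure.

```python
# Hypothetical mapping from the sensor that detected an adjacent vehicle
# to the parking section in which that vehicle is assumed to be present
# (section names follow the PA..PE example layout of the description).
SENSOR_TO_SECTION = {
    "front_left": "PA",
    "front_center": "PB",
    "front_right": "PC",
    "left_side": "PD",
    "right_side": "PE",
}

def detected_sections(detections):
    """Return the set of parking sections with a detected adjacent
    vehicle, given {sensor_name: detected?} obstacle information."""
    return {SENSOR_TO_SECTION[s] for s, hit in detections.items() if hit}
```

For example, detections on the left and right side sensors would identify occupied sections PD and PE.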
  • the display panel 31 is, for example, a display device shared with other systems including a navigation system.
  • the display panel 31 is a monitor for confirming the periphery of the vehicle V1 at a necessary timing.
  • the display panel 31 can take various forms as long as the surroundings of the vehicle V1 can be confirmed.
  • As an example, the display panel 31 may be a display shared as an electronic rearview mirror or may have the function of an instrument panel.
  • the display panel 31 is a display that includes, for example, a liquid crystal display (LCD) or an organic electro-luminescence (EL) display.
  • the display panel 31 is disposed at a position easily visible by the driver. In the present embodiment, the display panel 31 is disposed on a dashboard, an instrument panel, a center console or the like in front of the driver of the vehicle V1.
  • the display panel 31 displays the overhead view video 100 of the vehicle V1 based on the video signal output from the display control unit 48 of the display control device 40.
  • the display control device 40 provides information for assisting parking. More specifically, the display control device 40 generates and displays the overhead image 100 at the time of storage and at the time of leaving.
  • the display control device 40 is, for example, an arithmetic processing device (control unit) configured of a CPU (Central Processing Unit), a processor for video processing, and the like.
  • the display control device 40 loads the program stored in the storage unit 49 into the memory and executes an instruction included in the program.
  • the display control device 40 includes a video data acquisition unit 41, a host vehicle information acquisition unit 42, an adjacent information acquisition unit 43, an overhead image generation unit 44, a comparison unit 45, a display control unit 48, and a storage unit 49 that serves as an internal memory.
  • the display control device 40 may be configured of one or more devices.
  • the video data acquisition unit 41 acquires video data obtained by imaging the periphery of the vehicle V1. More specifically, the video data acquisition unit 41 acquires video data output from the front camera 11, the rear camera 12, the left side camera 13 and the right side camera 14. The video data acquisition unit 41 outputs the acquired video data to the adjacent information acquisition unit 43 and the overhead video generation unit 44.
  • the host vehicle information acquisition unit 42 acquires, from a CAN (Controller Area Network) or from sensors that sense the state of the vehicle V1, vehicle information such as gear operation information of the vehicle V1 that serves as a parking start trigger or a parking end trigger, that is, as a parking assistance display trigger.
  • the host vehicle information acquisition unit 42 acquires, as vehicle information, operation information of a steering operation performed at the time of parking acquired from CAN, various sensors, and the like.
  • the host vehicle information acquisition unit 42 outputs the acquired vehicle information to the adjacent information acquisition unit 43 and the display control unit 48.
  • the adjacent information acquisition unit 43 acquires the first obstacle information (first information) when the vehicle V1 moves backward and is stored, and the second obstacle information (second information) when the vehicle V1 moves forward from the stored state and leaves. More specifically, the adjacent information acquisition unit 43 acquires the first obstacle information from the sensor unit 21 when the vehicle V1 moves backward and is stored. The adjacent information acquisition unit 43 causes the storage unit 49 to store the first obstacle information at the time of storage. Further, the adjacent information acquisition unit 43 acquires the second obstacle information from the sensor unit 21 when the vehicle V1 moves forward from the stored state and leaves.
  • whether the vehicle V1 has moved backward and been stored, and whether it has moved forward and left the stored state, are determined based on the gear operation information of the vehicle V1 and the engine on/off information acquired from the host vehicle information acquisition unit 42.
  • at the time of leaving, the adjacent information acquisition unit 43 outputs the adjacent vehicle information stored in the storage unit 49 to the comparison unit 45.
  • the first obstacle information is information on obstacles around the vehicle V1 when the vehicle V1 moves backward and is stored.
  • the first obstacle information is information including the presence or absence of adjacent vehicles present in the parking sections PA to PE when the vehicle V1 moves backward and is stored.
  • the second obstacle information is information on obstacles around the vehicle V1 when the vehicle V1 advances from the storage state and exits.
  • the second obstacle information is information including the presence or absence of adjacent vehicles present in the parking sections PA to PE when the vehicle V1 moves forward from the stored state and leaves.
  • that the vehicle V1 has moved backward and been stored is detected, for example, when the shift position is set to "parking" or "neutral" after the shift position was set to "reverse" and the vehicle V1 traveled rearward, when the speed has remained zero for 5 seconds or more, or when the engine is stopped or the side brake or foot brake is operated. Alternatively, that the vehicle V1 has moved backward and been stored may be detected by an arbitrary trigger such as a user operation.
  • the shift position is "parking” or “neutral”, or that the speed is zero for a time of 5 seconds or more, or that the engine is stopped.
  • the vehicle V1 moving forward from the storage state may be detected by an arbitrary trigger such as a user operation.
  • the overhead video generation unit 44 performs viewpoint conversion processing and synthesis processing on the peripheral video data acquired by the video data acquisition unit 41, and generates overhead video 100.
  • the overhead image generation unit 44 generates the overhead image 100 at the time of storage. It also generates the overhead image 100 at the time of leaving when it is determined that the surrounding situation of the vehicle V1 has changed since the time of storage.
  • the overhead image generation unit 44 generates the overhead image 100 when the comparison unit 45 determines that the number of adjacent vehicles has increased when the vehicle V1 is leaving.
  • the overhead view video generation unit 44 outputs the generated overhead view video 100 to the display control unit 48.
  • the overhead image generation unit 44 includes a viewpoint conversion processing unit 441, a clipping processing unit 442, and a combining processing unit 443.
  • the viewpoint conversion processing unit 441 performs viewpoint conversion processing on the surrounding video data acquired by the video data acquisition unit 41 so as to look down on the vehicle V1 from above. More specifically, the viewpoint conversion processing unit 441 generates an image on which viewpoint conversion processing has been performed, based on the surrounding video data captured by the front camera 11, the rear camera 12, the left side camera 13 and the right side camera 14.
  • the method of viewpoint conversion processing may be any known method and is not limited.
  • the viewpoint conversion processing unit 441 outputs the peripheral video data subjected to the viewpoint conversion processing to the clipping processing unit 442.
  • the clipping processing unit 442 performs clipping processing on a video within a predetermined range from the peripheral video data subjected to the viewpoint conversion processing. Which range is to be cut out is registered and stored in advance.
  • the clipping processing unit 442 outputs the video data of the video subjected to the clipping processing to the combining processing unit 443.
  • the combining processing unit 443 performs combining processing to combine the video data subjected to the clipping processing.
  • the composition processing unit 443 generates the overhead image 100 in which the vehicle icon 110 is displayed on the composite image.
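  • The generation of the overhead image 100 by the units 441 to 443 can be viewed as a chain of stages. The following is a minimal Python sketch of that chain; each stage is injected as a placeholder callable, since the actual viewpoint conversion, clipping, and combining algorithms are not specified here.

```python
def generate_overhead_image(frames, convert, clip, combine, draw_icon):
    """frames: per-camera video data, e.g. {"front": ..., "rear": ...,
    "left": ..., "right": ...}. The injected callables mirror the
    viewpoint conversion unit 441, the clipping unit 442, and the
    combining unit 443; draw_icon overlays the vehicle icon 110."""
    converted = {name: convert(f) for name, f in frames.items()}   # unit 441
    clipped = {name: clip(f) for name, f in converted.items()}     # unit 442
    composite = combine(clipped)                                   # unit 443
    return draw_icon(composite)  # vehicle icon 110 at the center
```

With trivial string-based stages, the pipeline order (convert, then clip, then combine, then icon overlay) can be verified directly.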
  • the overhead view image 100 will be described with reference to FIG. FIG. 4 is a view showing an example of a bird's-eye view image generated by the display control device of the display control system according to the first embodiment.
  • the overhead image 100 displays a range of about 2 m from the vehicle V1.
  • the adjacent vehicle Vd present in the parking section PD and the adjacent vehicle Ve present in the parking section PE are included in the display range.
  • the overhead image 100 includes a front image 101, a rear image 102, a left side image 103, and a right side image 104. A vehicle icon 110 indicating the position and direction of the vehicle V1 is located at the central portion surrounded by these four images.
  • the vehicle icon 110 is disposed at the center with the front-rear direction parallel to the front-rear direction of the overhead view video 100.
  • the comparison unit 45 compares the first obstacle information and the second obstacle information, and compares whether the obstacle of the vehicle V1 is increased when the vehicle V1 is leaving.
  • the comparison unit 45 compares the first obstacle information with the second obstacle information to determine whether the number of adjacent vehicles has increased when the vehicle V1 is leaving. More specifically, the comparison unit 45 compares the presence or absence of vehicles in the adjacent parking sections at the time of storage with their presence or absence at the time of leaving; when there is a parking section in which no adjacent vehicle was detected in the first obstacle information but an adjacent vehicle is detected in the second obstacle information, the comparison unit 45 determines that the number of adjacent vehicles has increased.
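  • The comparison performed by the comparison unit 45 reduces to a set difference over parking-section occupancy. A minimal Python sketch, assuming for illustration that each piece of obstacle information is represented as the set of occupied section names:

```python
def added_sections(first_info: set, second_info: set) -> set:
    """Sections occupied at leaving but not at storage, i.e. the
    parking sections in which an adjacent vehicle has increased."""
    return second_info - first_info

def vehicles_increased(first_info: set, second_info: set) -> bool:
    """True means the overhead image 100 should be displayed at leaving."""
    return bool(added_sections(first_info, second_info))
```

Using the example of the description (PA, PC, PD occupied at storage; PE newly occupied at leaving), the difference is {"PE"} and the display is triggered.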
  • the display control unit 48 causes the display panel 31 to display the overhead view video 100 generated by the overhead view video generation unit 44 at the time of storage.
  • the display control unit 48 causes the display panel 31 to display the overhead image 100 generated by the overhead image generation unit 44 when the comparison unit 45 determines that the obstacles around the vehicle have increased when the vehicle V1 is leaving.
  • the display control unit 48 causes the display panel 31 to display the overhead image 100 when the comparison unit 45 determines that the number of adjacent vehicles has increased when the vehicle V1 is leaving. More specifically, if the display control unit 48 determines that a vehicle that was not present in an adjacent parking section at the time of storage is present in that section at the time of leaving, the display control unit 48 causes the display panel 31 to display the overhead image 100 at the time of leaving.
  • the storage unit 49 is used for temporary storage of data in the display control device 40 or the like.
  • the storage unit 49 is, for example, a semiconductor memory device such as a random access memory (RAM), a read only memory (ROM), a flash memory, or a storage device such as a hard disk or an optical disk. Alternatively, it may be an external storage device wirelessly connected via a communication device (not shown).
  • FIG. 5 is a flowchart showing an example of the flow of processing in the display control device of the display control system according to the first embodiment.
  • FIG. 6 is a flowchart showing another example of the flow of processing in the display control device of the display control system according to the first embodiment.
  • when the display control system 1 is activated, the display control device 40 acquires video data by the video data acquisition unit 41.
  • the display control device 40 causes the host vehicle information acquisition unit 42 to acquire vehicle information.
  • the display control device 40 determines whether or not the reverse is started (step S11).
  • the display control device 40 determines the presence or absence of a reverse trigger based on the host vehicle information acquired by the host vehicle information acquisition unit 42.
  • the reverse trigger is detected, for example, when the shift position is set to "reverse" or when the traveling direction of the vehicle V1 changes to rearward. If the display control device 40 determines that reverse movement has not started (No in step S11), it performs the processing of step S11 again. If it determines that reverse movement has started (Yes in step S11), the process proceeds to step S12.
  • if it is determined in step S11 that reverse movement has started (Yes in step S11), the display control device 40 starts the overhead image display (step S12). More specifically, the display control device 40 generates the overhead image 100 by the overhead image generation unit 44 and causes the display control unit 48 to display it on the display panel 31. The display control device 40 proceeds to step S13.
  • the display control device 40 determines whether the storage is completed (step S13). More specifically, based on the host vehicle information acquired by the host vehicle information acquisition unit 42, the display control device 40 determines that the storage is completed when, for example, it detects that the shift position is "parking" or "neutral", that the speed has been zero for 5 seconds or more, or that the engine has stopped or the side brake or foot brake has been operated. If the display control device 40 determines that the storage is completed, it determines that the overhead image display is to be ended (Yes in step S13) and proceeds to step S14. If it determines that the storage is not completed, it determines that the image display is not to be ended (No in step S13) and executes the processing of step S13 again.
  • when it is determined that the storage is completed (Yes in step S13), the display control device 40 ends the overhead image display (step S14) and proceeds to step S15.
  • the display control device 40 causes the adjacent information acquisition unit 43 to acquire the first obstacle information and causes the storage unit 49 to store the information (step S15). Then, the display control device 40 ends the process.
  • the first obstacle information indicating the surrounding condition of the vehicle V1 at the time of storage will be described with reference to FIG.
  • an adjacent vehicle Va exists in the parking area PA
  • an adjacent vehicle Vc exists in the parking area PC
  • an adjacent vehicle Vd exists in the parking area PD.
  • Adjacent vehicles do not exist in the parking section PB and the parking section PE.
  • the first obstacle information is stored in the storage unit 49 in step S15.
  • the display control device 40 determines the presence or absence of the advance trigger (step S21).
  • the display control device 40 determines the presence or absence of the advance trigger based on the host vehicle information acquired by the host vehicle information acquisition unit 42.
  • the forward trigger is detected, for example, when the engine is started, when the shift position is set to "forward", or when the side brake or foot brake is released. If the display control device 40 determines that there is no forward trigger (No in step S21), it executes the processing of step S21 again. If it determines that there is a forward trigger (Yes in step S21), the process proceeds to step S22.
  • if a forward trigger is detected in step S21, the display control device 40 determines whether the forward movement is a departure from storage (step S22). More specifically, based on the host vehicle information acquired by the host vehicle information acquisition unit 42, the display control device 40 determines that the forward movement is a departure from storage if, immediately before the forward trigger was detected, the shift position was, for example, "parking" or "neutral", the speed had been zero for 5 seconds or more, the engine had been stopped, or the side brake or foot brake had been operated. Alternatively, when the first obstacle information is stored in the storage unit 49, the display control device 40 may determine that the forward movement is a departure from storage.
  • if the display control device 40 determines that the forward movement is not a departure from storage (No in step S22), it executes the processing of step S21 again. If it determines that the forward movement is a departure from storage (Yes in step S22), the process proceeds to step S23.
  • the display control device 40 causes the adjacent information acquisition unit 43 to acquire the second obstacle information and causes the comparison unit 45 to compare the first obstacle information with the second obstacle information (step S23).
  • the display control device 40 obtains the change in the surrounding situation of the vehicle V1 between the time of storage and the time of leaving through the comparison by the comparison unit 45.
  • the display control device 40 proceeds to step S24.
  • second obstacle information indicating the peripheral situation of the vehicle V1 at the time of leaving will be described.
  • the adjacent vehicle Va exists in the parking area PA
  • the adjacent vehicle Vc exists in the parking area PC
  • the adjacent vehicle Vd exists in the parking area PD
  • the adjacent vehicle Ve exists in the parking area PE.
  • These pieces of information are detected by the sensor unit 21 and acquired by the adjacent information acquisition unit 43 as second obstacle information.
  • from the comparison, the comparison unit 45 finds that the adjacent vehicle Ve in the parking section PE, which did not exist at the time of storage, has been added.
  • the display control device 40 determines whether the number of adjacent vehicles has increased (step S24). If the display control device 40 determines, as a result of the comparison by the comparison unit 45, that the number of adjacent vehicles has not increased (No in step S24), the process ends. In this case, the overhead image 100 is not displayed. If it determines that the number of adjacent vehicles has increased (Yes in step S24), the process proceeds to step S25.
  • the display control device 40 determines whether forward movement has started (step S25). Based on the host vehicle information acquired by the host vehicle information acquisition unit 42, the display control device 40 determines that forward movement has started when, for example, it detects that the speed has become greater than zero or that the vehicle V1 has traveled forward. If the display control device 40 determines that the vehicle V1 has not started moving forward (No in step S25), it performs the processing of step S25 again. If it determines that the vehicle V1 has started moving forward (Yes in step S25), the process proceeds to step S26.
  • when it is determined in step S25 that forward movement has started (Yes in step S25), the display control device 40 starts the overhead image display (step S26). More specifically, the display control device 40 generates the overhead image 100 by the overhead image generation unit 44 and causes the display control unit 48 to display it on the display panel 31. The display control device 40 proceeds to step S27.
  • step S26 is only one example of the timing at which the overhead image 100 is displayed on the display panel 31; the timing is arbitrary as long as it is appropriate for checking the surrounding situation at the time of leaving, for example, when the vehicle V1 starts the engine, when the shift position is set to "drive", or when the vehicle V1 starts to move forward.
  • the overhead view image 100 displayed in step S26 will be described with reference to FIG.
  • the adjacent vehicle Vd present in the parking section PD and the adjacent vehicle Ve present in the parking section PE are included in the display range.
  • the display control device 40 determines whether the end condition of the overhead image display is satisfied (step S27). More specifically, based on the host vehicle information acquired by the host vehicle information acquisition unit 42, the display control device 40 determines that the end condition of the overhead image display is satisfied when, for example, it detects that the vehicle has traveled a predetermined distance or more from the position where forward movement started, or that the vehicle speed has reached a predetermined speed or more. If the display control device 40 determines that the end condition is satisfied, it determines that the overhead image display is to be ended (Yes in step S27) and proceeds to step S28. If it determines that the end condition is not satisfied, it determines that the image display is not to be ended (No in step S27) and executes the processing of step S27 again.
  • if it is determined in step S27 that the end condition of the overhead image display is satisfied (Yes in step S27), the display control device 40 ends the overhead image display (step S28), deletes the first obstacle information stored in the storage unit 49, and then ends the process.
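  • Steps S23 to S28 can be condensed into a small decision routine. The following Python sketch is illustrative only; the forward-start check and the display actions are passed in as placeholder callables rather than real vehicle interfaces.

```python
def leaving_display_flow(first_info, second_info, forward_started, show, hide):
    """Sketch of the leaving-time flow: compare stored and current
    obstacle information (sets of occupied parking sections) and show
    the overhead image only when adjacent vehicles have increased."""
    increased = bool(second_info - first_info)   # steps S23/S24
    if not increased:
        return False                             # overhead image 100 not displayed
    if forward_started():                        # step S25
        show()                                   # step S26: display overhead image 100
        hide()                                   # steps S27/S28: end condition met
    return increased
```

The routine returns whether the display was triggered, mirroring the Yes/No branches of the flowchart.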
  • as described above, when the comparison unit 45 determines that the number of adjacent vehicles has increased when the vehicle V1 is leaving, the overhead image 100 generated by the overhead image generation unit 44 is displayed on the display panel 31.
  • in this way, the overhead image 100 can be appropriately displayed according to the change in the surrounding situation at the time of leaving.
  • the overhead image 100 is displayed only when the change in the surrounding situation at the time of leaving makes the display necessary. As described above, according to the present embodiment, it is possible to suppress the display of the overhead image when there is no need to display it or when the driver wants to check the route by navigation.
  • FIG. 7 is a view showing an example of a bird's-eye view image generated by the display control device of the display control system according to the second embodiment.
  • the basic configuration of the display control system 1 is the same as that of the display control system 1 of the first embodiment.
  • the same components as those of the display control system 1 are denoted by the same reference numerals or the corresponding reference numerals, and the detailed description thereof is omitted.
  • the display control system 1 is different from the first embodiment in that the increased adjacent vehicles are highlighted in the overhead view image 100 to be generated.
  • when the comparison unit 45 determines that the number of adjacent vehicles has increased when the vehicle V1 is leaving, the combining processing unit 443 generates the overhead image 100 in which the increased adjacent vehicle is emphasized. For example, the combining processing unit 443 highlights the increased adjacent vehicle by coloring it or by surrounding it with a thick line.
  • the display control unit 48 causes the display panel 31 to display the overhead image 100 in which the adjacent vehicle determined by the comparison unit 45 to have increased is highlighted.
  • the overhead view image 100 will be described with reference to FIG.
  • the increased adjacent vehicle Ve is colored and highlighted.
  • as described above, when the comparison unit 45 determines that the number of adjacent vehicles has increased when the vehicle V1 is leaving, the overhead image 100 in which the increased adjacent vehicle is highlighted is displayed on the display panel 31.
  • the present embodiment makes it easy to grasp the adjacent vehicles that have increased between the time of storage and the time of leaving.
  • in the present embodiment, it is easy to recognize in the overhead image 100 a point that has changed since the time of storage, so the driver can easily confirm the points to keep in mind when leaving.
  • FIG. 8 is a view showing an example of a bird's-eye view image generated by the display control device of the display control system according to the third embodiment.
  • the basic configuration of the display control system 1 is the same as that of the display control system 1 of the first embodiment.
  • the display control system 1 is different from the first embodiment in that a notification icon 120 for notifying the direction in which the increased adjacent vehicle is present is displayed in the overhead image 100 to be generated.
  • when the comparison unit 45 determines that the number of adjacent vehicles has increased when the vehicle V1 is leaving, the combining processing unit 443 generates the overhead image 100 displaying the notification icon 120 indicating the direction in which the increased adjacent vehicle exists.
  • when the comparison unit 45 determines that the number of adjacent vehicles has increased when the vehicle V1 is leaving, the display control unit 48 causes the display panel 31 to display the overhead image 100 including the notification icon 120 indicating the direction of the increased adjacent vehicle.
  • the overhead image 100 will be described with reference to FIG. In the present embodiment, it is assumed that the adjacent vehicle is not present in the parking section PC at the time of storage, and the adjacent vehicle Vc is present in the parking section PC at the time of leaving.
  • a notification icon 120 indicating the direction in which the increased adjacent vehicle Vc exists is displayed.
  • as described above, the overhead image 100 displaying the notification icon 120 indicating the direction in which the increased adjacent vehicle is present is displayed on the display panel 31.
  • thus, it is easy to recognize a point that has changed since the time of storage, so the driver can easily confirm the points to keep in mind when leaving.
  • FIG. 9 is a view showing an example of a bird's-eye view image generated by the display control device of the display control system according to the fourth embodiment.
  • the basic configuration of the display control system 1 is the same as that of the display control system 1 of the first embodiment.
  • the display control system 1 is different from the first embodiment in that the overhead image 100 is displayed in which the display range is changed so that the increased adjacent vehicle is included in the display range in the overhead image 100 to be generated.
  • in the present embodiment, the clipping processing unit 442 performs the clipping processing on a clipping range such that the increased adjacent vehicle is included in the display range, from the peripheral video data subjected to the viewpoint conversion processing.
  • the increased adjacent vehicle may be only partially included in the display range of the overhead image 100.
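  • Enlarging the clipping range so that the increased adjacent vehicle falls inside the display range can be sketched as a rectangle union. The Python sketch below is illustrative; the (x0, y0, x1, y1) rectangle representation in the converted top-view coordinate system is an assumption, not part of the disclosure.

```python
def expand_clip_range(clip, vehicle_box):
    """clip and vehicle_box are (x0, y0, x1, y1) rectangles. Returns
    the smallest rectangle containing both, i.e. the clipping range
    enlarged just enough to show the increased adjacent vehicle."""
    return (min(clip[0], vehicle_box[0]), min(clip[1], vehicle_box[1]),
            max(clip[2], vehicle_box[2]), max(clip[3], vehicle_box[3]))
```

If the added vehicle already lies inside the current clipping range, the range is returned unchanged.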
  • when the comparison unit 45 determines that the number of adjacent vehicles has increased when the vehicle V1 is leaving, the display control unit 48 causes the display panel 31 to display the overhead image 100 whose display range is changed so as to include the adjacent vehicle determined to have increased.
  • the overhead view image 100 will be described with reference to FIG. In the present embodiment, it is assumed that the adjacent vehicle is not present in the parking section PC at the time of storage, and the adjacent vehicle Vc is present in the parking section PC at the time of leaving.
  • in the overhead image 100, the display range is expanded forward so that the increased adjacent vehicle Vc is displayed.
  • as described above, the overhead image 100 whose display range is changed so that the increased adjacent vehicle is displayed is displayed on the display panel 31.
  • since the increased adjacent vehicle is displayed in the overhead image 100, it is easy to grasp the adjacent vehicles that have increased between the time of storage and the time of leaving.
  • thus, it is easy to recognize a point that has changed since the time of storage, so the driver can easily confirm the points to keep in mind when leaving.
  • FIG. 13 is a diagram for explaining a parking lot, and shows a state at the time of storage.
  • FIG. 14 is a diagram for explaining the parking section, and shows a state at the time of leaving.
  • the basic configuration of the display control system 1 is the same as that of the display control system 1 of the first embodiment.
  • the sensor unit 21 can further detect the position in addition to the presence or absence of an obstacle present in the vicinity of the vehicle V1.
  • the sensor unit 21 detects, as obstacles existing in the vicinity of the vehicle V1, the positions of adjacent vehicles existing in the parking sections PA to PE adjacent to the parking section P1, which is the parking section for receiving the vehicle V1.
  • the front center sensor is disposed at the front center of the vehicle V1 and detects the position of an obstacle at the front center of the vehicle V1.
  • the front center sensor detects the position of the adjacent vehicle present in the parking section PB.
  • the front center sensor outputs obstacle position information indicating the position of the adjacent vehicle, which is a detection result, to the adjacent information acquisition unit 43 of the display control device 40.
  • the obstacle position information includes the presence or absence of an adjacent vehicle within the detection range of the front center sensor, the distance to the adjacent vehicle, and the horizontal range in which the adjacent vehicle is present.
  • the distance to the adjacent vehicle is the distance from the vehicle V1 to the closest part of the adjacent vehicle.
  • the distance La to the adjacent vehicle Va is the distance from the left front end of the vehicle V1 to the portion of the adjacent vehicle Va closest to the vehicle V1.
  • the distance Lb to the adjacent vehicle Vb is the distance from the front end of the vehicle V1 to the portion of the adjacent vehicle Vb closest to the vehicle V1.
  • the distance Lc to the adjacent vehicle Vc is the distance from the right front end of the vehicle V1 to the portion of the adjacent vehicle Vc closest to the vehicle V1.
  • the distance Ld to the adjacent vehicle Vd is the distance from the left side surface of the vehicle V1 to the right side surface of the adjacent vehicle Vd.
  • the distance Le to the adjacent vehicle Ve is the distance from the right side surface of the vehicle V1 to the left side surface of the adjacent vehicle Ve.
  • Such a sensor unit 21 can detect the positions of adjacent vehicles in all directions of the vehicle V1. In the present embodiment, based on the detection result of the sensor unit 21, it is possible to detect the position of an adjacent vehicle on at least one of the left side, the right side, the left front, the right front, and the front of the vehicle V1. More specifically, from the detection result of the sensor unit 21, the position of an adjacent vehicle present in the parking sections PA to PE can be detected. More specifically, the position of an adjacent vehicle is identified from the elements included in the detection result: the sensor that detected the adjacent vehicle, the distance to the adjacent vehicle detected by that sensor, and the horizontal range in which the adjacent vehicle is present as detected by that sensor.
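The identification described above, combining the detecting sensor, the measured distance, and the horizontal range, can be sketched as follows. This is an illustrative Python sketch only; the sensor names, field names, and units are assumptions for illustration and are not taken from the embodiment.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str              # which sensor fired (e.g. a hypothetical "front_center")
    distance_m: float        # distance to the nearest part of the adjacent vehicle
    horizontal_range: tuple  # assumed horizontal extent (left, right) in metres

def identify_positions(detections):
    """Identify adjacent-vehicle positions from per-sensor detections.

    A vehicle's position is identified by the combination of the sensor
    that detected it, the measured distance, and the horizontal extent.
    """
    positions = {}
    for d in detections:
        positions[d.sensor] = (d.distance_m, d.horizontal_range)
    return positions

dets = [Detection("front_center", 1.8, (-0.9, 0.9)),
        Detection("left_side", 0.7, (0.0, 4.2))]
pos = identify_positions(dets)
print(pos["front_center"][0])  # → 1.8, distance to the vehicle ahead
```

In this sketch each sensor contributes one detection, so the dictionary key doubles as the direction in which the adjacent vehicle was found.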
  • the adjacent information acquisition unit 43 acquires first obstacle position information when the vehicle V1 moves backward and is stored, and second obstacle position information when the vehicle V1 moves forward from the storage state and leaves. More specifically, the adjacent information acquisition unit 43 acquires the first obstacle position information from the sensor unit 21 when the vehicle V1 moves backward and is stored. The adjacent information acquisition unit 43 causes the storage unit 49 to store the first obstacle position information at the time of storage. Further, the adjacent information acquisition unit 43 acquires the second obstacle position information from the sensor unit 21 when the vehicle V1 moves forward from the storage state and leaves. The adjacent information acquisition unit 43 outputs the second obstacle position information at the time of leaving to the comparison unit 45.
  • the first obstacle position information is position information of an obstacle around the vehicle V1 when the vehicle V1 moves backward and is stored.
  • the first obstacle position information is position information of adjacent vehicles present in the parking sections PA to PE when the vehicle V1 moves backward and is stored.
  • the second obstacle position information is position information of an obstacle around the vehicle V1 when the vehicle V1 advances from the storage state and exits.
  • the second obstacle position information is position information of adjacent vehicles present in the parking sections PA to PE when the vehicle V1 moves forward from the storage state and leaves.
  • the overhead video generation unit 44 generates the overhead video 100 when the comparison unit 45 determines that the position of the adjacent vehicle changes in the direction approaching the vehicle V1 when the vehicle V1 is leaving.
  • the comparison unit 45 compares the first obstacle position information and the second obstacle position information, and determines whether the position of an obstacle around the vehicle V1 has changed when the vehicle V1 leaves. In the present embodiment, the comparison unit 45 compares the first obstacle position information and the second obstacle position information, and determines whether the position of an adjacent vehicle has changed in a direction approaching the vehicle V1 when the vehicle V1 leaves. More specifically, the comparison unit 45 compares the position of the adjacent vehicle in each adjacent parking section at the time of storage with the position of the adjacent vehicle in each adjacent parking section at the time of leaving, and when there is an adjacent vehicle whose distance from the vehicle V1 has become shorter than at the time of storage, determines that the position of the adjacent vehicle has changed in the direction approaching the vehicle V1.
  • the comparison unit 45 compares the position of the adjacent vehicle in each adjacent parking section at the time of storage with the position of the adjacent vehicle in each adjacent parking section at the time of leaving, and when there are only adjacent vehicles whose distance from the vehicle V1 has become longer than at the time of storage or has not changed, determines that the position of the adjacent vehicle has not changed in the direction approaching the vehicle V1.
  • the comparison unit 45 determines that there is no change when the position does not change, even if the adjacent vehicle has been replaced by a different vehicle between the time of storage and the time of leaving.
  • the case where the position does not change here may include, for example, a change of less than 5 cm in addition to the completely same position.
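The comparison described above can be sketched in a few lines of Python. The data layout (a mapping from parking section to measured distance) and the exact treatment of the tolerance are assumptions for illustration; the 5 cm figure is the example threshold mentioned above.

```python
UNCHANGED_TOLERANCE_M = 0.05  # changes under 5 cm count as "no change"

def has_approaching_vehicle(dist_at_storage, dist_at_leaving):
    """Return True if any adjacent vehicle is closer at leaving than at storage.

    Both arguments map a parking section (e.g. "PA") to the measured
    distance from the vehicle V1; a section is absent when it is empty.
    """
    for section, d_store in dist_at_storage.items():
        d_leave = dist_at_leaving.get(section)
        if d_leave is None:
            continue  # the adjacent vehicle has left; not an approach
        if d_store - d_leave >= UNCHANGED_TOLERANCE_M:
            return True  # position changed in the direction approaching V1
    return False

# Va (section PA) is 30 cm closer at leaving -> the overhead image is shown.
print(has_approaching_vehicle({"PA": 1.0, "PE": 0.8},
                              {"PA": 0.7, "PE": 0.8}))  # → True
```

Hedging the tolerance this way keeps a vehicle that merely settled a few centimetres from triggering the display.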
  • the display control unit 48 displays the overhead view video 100 generated by the overhead view video generation unit 44 on the display panel 31. More specifically, when the comparison unit 45 determines that there is an adjacent vehicle whose distance from the vehicle V1 is shorter than at the time of storage when the vehicle V1 leaves, the display control unit 48 displays the overhead image 100 on the display panel 31.
  • FIG. 15 is a flowchart illustrating another example of the flow of processing in the display control device of the display control system according to the fifth embodiment.
  • the display control device 40 performs the same process as the flowchart shown in FIG. In step S15, the display control device 40 causes the adjacent information acquisition unit 43 to acquire the first obstacle position information and causes the storage unit 49 to store the first obstacle position information.
  • the first obstacle position information indicating the surrounding situation of the vehicle V1 at the time of storage will be described with reference to FIG.
  • the adjacent vehicle Va exists in the parking area PA
  • the adjacent vehicle Vb exists in the parking area PB
  • the adjacent vehicle Vc exists in the parking area PC
  • the adjacent vehicle Vd exists in the parking area PD
  • the adjacent vehicle Ve exists in the parking area PE.
  • the position information of the adjacent vehicles Va to Ve is detected by the sensor unit 21 at the time of storage of the vehicle V1, and is acquired by the adjacent information acquisition unit 43 as the first obstacle position information.
  • the first obstacle position information is stored in the storage unit 49 in step S15.
  • steps SA21, SA22, and SA26 to SA29 perform the same processes as steps S21, S22, and S25 to S28 of the flowchart shown in FIG.
  • the display control device 40 causes the adjacent information acquisition unit 43 to acquire the second obstacle position information and causes the comparison unit 45 to compare the first obstacle position information and the second obstacle position information (step SA23).
  • the display control device 40 obtains a change in the surrounding condition of the vehicle V1 between the time of storage and the time of leaving by the comparison performed by the comparison unit 45.
  • the display control device 40 proceeds to step SA24.
  • the second obstacle position information indicating the surrounding situation of the vehicle V1 at the time of leaving will be described with reference to FIG.
  • the positions of the vehicle Va adjacent to the parking space PA and the vehicle Ve adjacent to the parking space PE have changed.
  • the distance La1 from the vehicle V1 to the adjacent vehicle Va at the time of leaving is shorter than the distance La from the vehicle V1 to the adjacent vehicle Va at the time of storage.
  • the distance Le1 from the vehicle V1 to the adjacent vehicle Ve at the time of leaving is shorter than the distance Le from the vehicle V1 to the adjacent vehicle Ve at the time of storage.
  • the positions of the adjacent vehicle Va and the adjacent vehicle Ve at the time of storage are indicated by broken lines.
  • the position information of the adjacent vehicles Va to Ve is detected by the sensor unit 21 and acquired by the adjacent information acquisition unit 43 as the second obstacle position information.
  • the first obstacle position information and the second obstacle position information are compared by the comparison unit 45, it can be understood that the positions of the adjacent vehicle Va and the adjacent vehicle Ve are changed.
  • the display control device 40 determines whether the position of the adjacent vehicle has changed (step SA24). If the display control device 40 determines that the position of the adjacent vehicle has not changed as a result of the comparison by the comparison unit 45 (No in step SA24), the process ends. In this case, the overhead image 100 is not displayed. If the display control device 40 determines that the position of the adjacent vehicle has changed as a result of the comparison by the comparison unit 45 (Yes in step SA24), the process proceeds to step SA25.
  • the display control device 40 determines whether the position of the adjacent vehicle has changed in the direction approaching the vehicle V1 (step SA25). If the display control device 40 compares the first obstacle position information and the second obstacle position information and determines that the position of the adjacent vehicle has not changed in the direction approaching the vehicle V1 (No in step SA25), the process ends. If the display control device 40 compares the first obstacle position information and the second obstacle position information and determines that the position of the adjacent vehicle has changed in the direction approaching the vehicle V1 (Yes in step SA25), the process proceeds to step SA26.
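The two decisions in steps SA24 and SA25 can be summarized in a small sketch. The representation of a change as a signed distance difference per parking section is an assumption for illustration, not part of the embodiment.

```python
def should_display_overhead(changes):
    """Decide, as in steps SA24-SA25, whether to display the overhead image.

    `changes` maps a parking section to the signed distance change of its
    adjacent vehicle (negative = the vehicle moved closer to V1).
    """
    position_changed = any(abs(c) > 0 for c in changes.values())
    if not position_changed:
        return False      # step SA24: No -> end, no overhead image
    approached = any(c < 0 for c in changes.values())
    return approached     # step SA25: only an approach triggers the display

print(should_display_overhead({"PA": -0.3, "PE": -0.2}))  # → True
print(should_display_overhead({"PA": 0.4}))               # → False
```

A vehicle that moved away (positive change) therefore passes step SA24 but fails step SA25, matching the flow above.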
  • the overhead image generation unit 44 generates the overhead image when the comparison unit 45 determines that the position of the adjacent vehicle changes in the direction approaching the vehicle V1 when the vehicle V1 is leaving.
  • the overhead image 100 is displayed on the display panel 31.
  • the overhead view video 100 can be appropriately displayed according to the change in the surrounding conditions at the time of leaving.
  • the overhead view video 100 is displayed only when it is necessary to display the overhead view video 100 according to the change in the surrounding conditions at the time of leaving. As described above, according to the present embodiment, it is possible to suppress the overhead view video from being displayed when there is no need to display it or when the driver wants to check the route by navigation.
  • the basic configuration of the display control system 1 is the same as that of the display control system 1 of the fifth embodiment.
  • the display control system 1 is different from the fifth embodiment in that the adjacent vehicle whose position is changed is highlighted in the overhead image 100 to be generated.
  • when the comparison unit 45 determines that the position of an adjacent vehicle has changed when the vehicle V1 leaves, the combining processing unit 443 generates the overhead image 100 in which the adjacent vehicle whose position has changed is emphasized. For example, the combining processing unit 443 highlights the adjacent vehicle whose position has changed by coloring it or by surrounding it with a thick line.
  • the display control unit 48 causes the display panel 31 to display the overhead image 100 in which the adjacent vehicle determined to have a changed position by the comparison unit 45 is highlighted.
  • the overhead view image 100 will be described with reference to FIG. 7 used in the description of the second embodiment. In the present embodiment, it is assumed that the position of the vehicle Ve adjacent to the parking section PE is changed. In the overhead image 100, the adjacent vehicle Ve whose position has been changed is highlighted and displayed.
  • when the comparison unit 45 determines that the position of an adjacent vehicle has changed when the vehicle V1 leaves, the overhead image 100 in which the adjacent vehicle whose position has changed is highlighted is displayed on the display panel 31.
  • the present embodiment makes it easy to grasp the adjacent vehicle whose position has changed between the time of storage and the time of leaving.
  • in the present embodiment, in the overhead view image 100, it is easy to recognize a point that has changed since the time of storage, so the driver can easily confirm the points to be kept in mind when leaving.
  • the basic configuration of the display control system 1 is the same as that of the display control system 1 of the fifth embodiment.
  • the display control system 1 is different from the fifth embodiment in that the display control system 1 displays a notification icon 120 for notifying the direction in which the adjacent vehicle whose position has changed is present in the overhead image 100 to be generated.
  • the combining processing unit 443 generates the overhead video 100 in which the notification icon 120 indicating the direction in which the adjacent vehicle whose position has changed is present is displayed.
  • the combining processing unit 443 displays the arrow-shaped notification icon 120.
  • the overhead view image 100 will be described with reference to FIG. 8 used for the description in the second embodiment. In the present embodiment, it is assumed that the position of the adjacent vehicle Vc in the parking section PC has changed. In the overhead view video 100, a notification icon 120 indicating the direction in which the adjacent vehicle Vc whose position has changed is present is displayed.
  • when the comparison unit 45 determines that the position of an adjacent vehicle has changed when the vehicle V1 leaves, the bird's-eye view image 100 displaying the notification icon 120 indicating the direction in which the adjacent vehicle whose position has changed is present is displayed on the display panel 31.
  • even when the adjacent vehicle whose position has changed is not displayed in the overhead image 100, it is possible to easily grasp the direction in which the adjacent vehicle whose position has changed between the time of storage and the time of leaving is present.
  • even if the changed point is not displayed in the overhead view video 100, it is easy to recognize a point that has changed since the time of storage, so the driver can easily confirm the points to be kept in mind when leaving.
  • FIG. 16 is a view showing an example of a bird's-eye view image generated by the display control apparatus of the display control system according to the eighth embodiment.
  • the basic configuration of the display control system 1 is the same as that of the display control system 1 of the fifth embodiment.
  • the display control system 1 differs from the fifth embodiment in that, in the overhead view image 100 to be generated, the overhead view image 100 is displayed in which the display range is changed such that the adjacent vehicle whose position has changed is included in the display range.
  • the clipping processing unit 442 performs clipping processing, on the peripheral video data subjected to the viewpoint conversion processing, with a clipping range such that the adjacent vehicle whose position has changed is included in the display range.
  • the adjacent vehicle whose position has changed may be only partially included in the display range of the overhead image 100.
  • when the comparison unit 45 determines that the position of an adjacent vehicle has changed when the vehicle V1 leaves, the display control unit 48 displays on the display panel 31 the overhead image 100 whose display range has been changed so as to include the adjacent vehicle determined to have a changed position.
  • the overhead view image 100 will be described with reference to FIG. In the present embodiment, it is assumed that the position of the adjacent vehicle Vc in the parking section PC has changed.
  • the overhead view video 100 is displayed in which the display range is expanded forward so that the adjacent vehicle Vc whose position has changed is displayed.
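The forward expansion of the display range just described can be sketched as a simple geometric rule. The base range of about 2 m, the margin, and the coordinate convention are assumptions for illustration only.

```python
def expand_display_range(base_front_m, vehicle_front_offset_m, margin_m=0.5):
    """Return the forward extent of the clipping range for the overhead image.

    base_front_m: default forward display range (e.g. about 2 m).
    vehicle_front_offset_m: forward distance to the far edge of the
    adjacent vehicle whose position has changed (e.g. Vc ahead of V1).
    margin_m: assumed extra margin so the vehicle is fully visible.
    """
    return max(base_front_m, vehicle_front_offset_m + margin_m)

print(expand_display_range(2.0, 3.0))  # → 3.5, range grows to include Vc
print(expand_display_range(2.0, 1.0))  # → 2.0, default range already suffices
```

The `max` keeps the default range whenever the changed vehicle already falls inside it, so the view never shrinks below the normal display range.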
  • when the comparison unit 45 determines that the position of an adjacent vehicle has changed when the vehicle V1 leaves, the overhead view image 100 whose display range has been changed so that the adjacent vehicle whose position has changed is displayed is shown on the display panel 31.
  • the adjacent vehicle whose position has been changed is displayed on the overhead image 100, it is possible to easily grasp the adjacent vehicle whose position has changed between the time of storage and the time of leaving.
  • it is easy to recognize a point that has changed since the time of storage, so the driver can easily confirm the points to be kept in mind when leaving.
  • FIG. 17 is a diagram showing a parking lot for explaining processing in the display control device of the display control system according to the ninth embodiment.
  • FIG. 18 is a flowchart showing an example of the flow of processing in the display control device of the display control system according to the ninth embodiment.
  • the basic configuration of the display control system 1 is the same as that of the display control system 1 of the fifth embodiment.
  • the display control system 1 is different from the fifth embodiment in that a bird's-eye view image is displayed when the position of the adjacent vehicle is changing in the leaving direction of the vehicle at the time of leaving.
  • the overhead video generation unit 44 generates the overhead video 100 when the comparison unit 45 determines that the position of the adjacent vehicle changes in the exit direction of the vehicle V1 when the vehicle V1 is exited.
  • a change in the exit direction of the vehicle V1 means that the position of the adjacent vehicle has changed toward the front of its parking section.
  • the positions of the adjacent vehicle Va and the adjacent vehicle Ve have changed forward, in other words, the positions have changed in the exit direction of the vehicle V1.
  • the front end of the adjacent vehicle Va at the time of leaving has moved forward by the distance Da from the front end of the adjacent vehicle Va at the time of storage.
  • the front end of the adjacent vehicle Ve at the time of leaving has moved forward by the distance De from the front end of the adjacent vehicle Ve at the time of storage.
  • the positions of the adjacent vehicle Va and the adjacent vehicle Ve at the time of storage are indicated by broken lines.
  • the comparison unit 45 compares the first obstacle position information and the second obstacle position information, and determines whether the position of an adjacent vehicle has changed in the exit direction of the vehicle V1 when the vehicle V1 leaves. More specifically, the comparison unit 45 compares the position of the adjacent vehicle in each adjacent parking section at the time of storage with the position of the adjacent vehicle in each adjacent parking section at the time of leaving, and when there is an adjacent vehicle whose position has changed forward, determines that the position of the adjacent vehicle has changed in the exit direction of the vehicle V1.
  • the comparison unit 45 compares the position of the adjacent vehicle in each adjacent parking section at the time of storage with the position of the adjacent vehicle in each adjacent parking section at the time of leaving, and when there are only adjacent vehicles whose position has changed backward or has not changed, determines that the position of the adjacent vehicle has not changed in the exit direction of the vehicle V1.
  • the comparison unit 45 determines that there is no change when the position does not change, even if the adjacent vehicle has been replaced by a different vehicle between the time of storage and the time of leaving.
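The exit-direction check described above can be sketched as follows. Representing each adjacent vehicle's front end as a coordinate that grows toward the exit direction, and reusing a small tolerance so negligible movements count as "no change", are assumptions for illustration.

```python
TOLERANCE_M = 0.05  # assumed tolerance; tiny movements count as unchanged

def changed_in_exit_direction(front_end_at_storage, front_end_at_leaving):
    """Return True if some adjacent vehicle's front end moved toward the
    exit direction of the vehicle V1 between storage and leaving.

    Both arguments map a parking section to the front-end coordinate of
    its adjacent vehicle; coordinates grow toward the exit direction.
    """
    for section, x_store in front_end_at_storage.items():
        x_leave = front_end_at_leaving.get(section)
        if x_leave is None:
            continue  # the adjacent vehicle has left the section
        if x_leave - x_store > TOLERANCE_M:  # moved forward by Da, De, ...
            return True
    return False

# Va moved forward by Da = 0.4 m and Ve by De = 0.3 m -> overhead image shown.
print(changed_in_exit_direction({"PA": 0.0, "PE": 0.0},
                                {"PA": 0.4, "PE": 0.3}))  # → True
```

A backward movement produces a negative difference and therefore never triggers the display, matching the rule above.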
  • the display control unit 48 causes the display panel 31 to display the overhead view image 100 when the comparison unit 45 determines that the position of an adjacent vehicle has changed in the exit direction of the vehicle V1 when the vehicle V1 leaves. More specifically, when it is determined that the position of the adjacent vehicle at the time of leaving has changed in the exit direction of the vehicle V1 with respect to the position of the adjacent vehicle at the time of storage, the display control unit 48 displays the overhead image 100 on the display panel 31.
  • steps SA31 to SA34 and steps SA36 to SA39 perform the same processes as steps SA21 to SA24 and steps SA26 to SA29 of the flowchart shown in FIG.
  • the display control device 40 determines whether the position of the adjacent vehicle has changed in the exit direction (step SA35). If the display control device 40 compares the first obstacle position information with the second obstacle position information and determines that the position of the adjacent vehicle has not changed in the exit direction of the vehicle V1 (No in step SA35), the process ends. If the display control device 40 compares the first obstacle position information with the second obstacle position information and determines that the position of the adjacent vehicle has changed in the exit direction of the vehicle V1 (Yes in step SA35), the process proceeds to step SA36.
  • the overhead-view video generation unit 44 generates the overhead image 100 when the comparison unit 45 determines that the position of the adjacent vehicle has changed in the exit direction of the vehicle V1 when the vehicle V1 leaves.
  • the overhead image 100 is displayed on the display panel 31.
  • the overhead view video 100 can be appropriately displayed according to the change in the surrounding conditions at the time of leaving. According to the present embodiment, it is possible to display the overhead view video 100 when the position of the adjacent vehicle has changed in the exit direction of the vehicle V1 compared with the time of storage.
  • the driver can appropriately check the periphery of the vehicle V1 with the overhead image 100, in addition to visual or mirror confirmation.
  • FIG. 21 is a block diagram showing a configuration example of a display control system according to the tenth embodiment.
  • the basic configuration of the display control system 1B is the same as that of the display control system 1 of the first embodiment.
  • the display control system 1B appropriately displays a bird's-eye view image according to a change of the driver (a surrounding confirmation condition) at the time of leaving.
  • the display control system 1B will be described with reference to FIG.
  • the display control system 1B includes a front camera (shooting unit) 11, a rear camera (shooting unit) 12, a left side camera (shooting unit) 13, a right side camera (shooting unit) 14, a driver's seat camera (driver detection unit) 21B, a display panel (display unit) 31, and a display control device 40B.
  • the driver's seat camera 21B is disposed toward the seating position of the driver's seat of the vehicle V1.
  • the driver's seat camera 21B captures an image of the driver sitting in the driver's seat so that the person can be recognized.
  • the video taken by the driver's seat camera 21B includes information that allows the driver to be recognized.
  • the driver's seat camera 21B outputs the captured video to the driver information acquisition unit (information acquisition unit) 43B of the display control device 40B.
  • the display control device 40B includes an image data acquisition unit 41, a host vehicle information acquisition unit 42, a driver information acquisition unit 43B, an overhead image generation unit 44B, a comparison unit 45B, a display control unit 48B, and a storage unit 49 which is an internal memory.
  • the driver information acquisition unit 43B acquires, from the driver's seat camera 21B, first driver information (first information) for recognizing the driver of the vehicle V1 when the vehicle V1 moves backward and is stored, and second driver information (second information) for recognizing the driver of the vehicle V1 when the vehicle V1 moves forward from the storage state and leaves. More specifically, the driver information acquisition unit 43B acquires, as the first driver information, video data captured by the driver's seat camera 21B when the vehicle V1 moves backward and is stored. The driver information acquisition unit 43B causes the storage unit 49 to store the first driver information at the time of storage.
  • the driver information acquisition unit 43B acquires, as second driver information, video data captured by the driver seat camera 21B.
  • the determination that the vehicle V1 has moved backward and been stored, and the determination that the vehicle V1 has moved forward and left the storage, are made based on the gear operation information of the vehicle V1 acquired from the host vehicle information acquisition unit 42 and the on/off information of the engine.
  • the driver information acquisition unit 43B causes the storage unit 49 to store the second driver information at the time of leaving.
  • the first driver information is video data obtained by capturing the face of the driver of the vehicle V1 when the vehicle V1 moves backward and is stored.
  • the second driver information is video data obtained by capturing the face of the driver of the vehicle V1 when the vehicle V1 advances from the storage state and exits.
  • when it is determined that the second driver information when the vehicle V1 leaves is different from the first driver information, the overhead image generation unit 44B generates the overhead image 100.
  • the overhead view video generation unit 44B generates the overhead view video 100 when the comparison unit 45B determines that the driver is different from the driver at the time of storage when the vehicle V1 is leaving.
  • the overhead view image 100 will be described with reference to FIG. FIG. 22 is a view showing an example of a bird's-eye view image generated by the display control device of the display control system according to the tenth embodiment.
  • the overhead image 100 displays a range of about 2 m from the vehicle V1.
  • the overhead image 100 includes the front image 101, the rear image 102, the left side image 103, and the right side image 104, and a vehicle icon 110 indicating the position and direction of the vehicle V1 is located at the central portion surrounded by the front image 101, the rear image 102, the left side image 103, and the right side image 104.
  • the vehicle icon 110 is disposed at the center with the front-rear direction parallel to the front-rear direction of the overhead view video 100.
  • the comparison unit 45B compares the first driver information and the second driver information, and compares the drivers at the time of storage and at the time of leaving of the vehicle V1.
  • the comparison unit 45B performs image processing to compare the image of the driver captured at the time of storage with the image of the driver captured at the time of leaving, and determines whether the driver at the time of storage and the driver at the time of leaving are the same person.
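The same-person decision described above can be sketched very schematically. A real implementation would use face recognition on the driver's seat camera images; here a stand-in feature vector ("embedding") and a Euclidean-distance threshold are assumptions purely for illustration and are not part of the embodiment.

```python
def same_driver(features_at_storage, features_at_leaving, threshold=0.5):
    """Treat the drivers as the same person when their assumed feature
    vectors are within `threshold` (Euclidean distance)."""
    dist = sum((a - b) ** 2 for a, b in
               zip(features_at_storage, features_at_leaving)) ** 0.5
    return dist <= threshold

stored = [0.1, 0.9, 0.3]   # stand-in for the first driver information
leaving = [0.8, 0.1, 0.5]  # stand-in for the second driver information
# A different driver -> the overhead image 100 is displayed at leaving.
print(not same_driver(stored, leaving))  # → True
```

Whatever recognition technique is used, only the boolean outcome (same driver or not) feeds the display decision that follows.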
  • the display control unit 48B causes the display panel 31 to display the overhead view video 100 generated by the overhead view video generation unit 44B at the time of leaving. If it is determined that the second driver information when the vehicle V1 leaves is different from the first driver information, the display control unit 48B displays the overhead video 100 generated by the overhead video generation unit 44B on the display panel 31. In the present embodiment, the display control unit 48B causes the display panel 31 to display the overhead image 100 when the comparison unit 45B determines that the driver at the time of storage and the driver at the time of leaving are different.
  • FIG. 23 is a flowchart showing an example of the flow of processing in the display control device of the display control system according to the tenth embodiment.
  • FIG. 24 is a flowchart illustrating another example of the flow of processing in the display control device of the display control system according to the tenth embodiment.
  • steps SB12 to SB15 are similar to the processes in steps S11 to S14 of the flowchart shown in FIG.
  • the display control device 40B causes the driver information acquisition unit 43B to acquire the first driver information and causes the storage unit 49 to store the first driver information (step SB11). Then, the display control device 40B proceeds to step SB12.
  • steps SB22, SB23, and SB26 to SB29 perform the same processes as steps S21, S22, and S25 to S28 of the flowchart shown in FIG.
  • the display control device 40B causes the driver information acquisition unit 43B to acquire the second driver information and causes the storage unit 49 to store the second driver information (step SB21). Then, the display control device 40B proceeds to step SB22.
  • the display control device 40B causes the comparison unit 45B to compare the first driver information with the second driver information (step SB24).
  • the display control device 40B causes the comparison unit 45B to compare and determine whether the drivers of the vehicle V1 at the time of storage and at the time of leaving are the same.
  • the display control device 40B proceeds to step SB25.
  • the display control device 40B determines whether the driver is different (step SB25). If the display control device 40B determines that the drivers are the same as a result of comparison by the comparison unit 45B (No in step SB25), the process ends. In this case, the overhead image 100 is not displayed. If the display control device 40B determines that the driver is different as a result of comparison by the comparison unit 45B (Yes in step SB25), the process proceeds to step SB26.
  • the display control device 40B displays the overhead image 100 generated by the overhead image generation unit 44B on the display panel 31.
  • the overhead view image 100 can be appropriately displayed according to the change of the driver at the time of leaving.
  • the driver can appropriately confirm the periphery of the vehicle V1 by the overhead image 100 in addition to the visual or mirror confirmation.
  • the overhead view video 100 is displayed only when it is necessary to display it according to the change of the driver at the time of leaving.
  • it can be suppressed that the overhead view video is displayed in a situation where there is no need to display it, or when it is desired to check the route by navigation.
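For illustration only, the determination of steps SB24 to SB25 (display the overhead image 100 only when the driver has changed between storage and leaving) reduces to an equality check between the first and second driver information. The following Python sketch uses a hypothetical function name and treats the driver information as an opaque value; the description does not prescribe an implementation.

```python
def should_display_overhead_view(first_driver_info, second_driver_info):
    """Steps SB24-SB25 (sketch): display the overhead image 100 only when the
    driver at the time of leaving differs from the driver at the time of storage."""
    return first_driver_info != second_driver_info
```

When the same driver both stores and retrieves the vehicle, the function returns False and the overhead image 100 is suppressed, matching the No branch of step SB25.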
  • FIG. 25 is a flowchart illustrating an example of the flow of processing in the display control device of the display control system according to the eleventh embodiment.
  • the basic configuration of the display control system 1B is the same as that of the display control system 1B of the tenth embodiment.
  • the display control system 1B is different from the tenth embodiment in that it displays the overhead image 100 when the driver's face position at the time of leaving is lower than the driver's face position at the time of storage by a predetermined amount or more.
  • the driver's seat camera 21B captures an image of the face position of the driver relative to the vehicle V1 in a recognizable manner.
  • the camera 21B for the driver's seat photographs a part of the vehicle V1 whose position is fixed and the face of the driver as an object to be photographed.
  • the camera 21B for the driver's seat captures a change in the face position of the driver in a recognizable manner.
  • the camera 21B for the driver's seat captures images in the same imaging range at the time of storage and at the time of leaving.
  • the driver's face position is acquired by performing image processing on the image captured by the driver's seat camera 21B.
  • the driver's face position referred to here may be replaced with the driver's eye position.
  • the driver information acquisition unit 43B acquires information on the face position of the driver.
  • the driver information acquisition unit 43B acquires information on the driver's face position at the time of storage as the first driver information, and acquires information on the driver's face position at the time of leaving as the second driver information.
  • the overhead video generation unit 44B generates the overhead video 100 when the comparison unit 45B determines that the driver's face position at the time of leaving is lower than the driver's face position at the time of storage by a predetermined amount or more.
  • the comparison unit 45B performs image processing to compare the driver's face position at the time of storage with the driver's face position at the time of leaving, and determines whether there is a difference of a predetermined amount or more in the vertical direction. For example, the predetermined amount is 5 cm.
  • the display control unit 48B causes the overhead image 100 to be displayed on the display panel 31 when the comparison unit 45B determines that the driver's face position at the time of leaving is lower than that at the time of storage by the predetermined amount or more.
  • steps SB31 to SB34 and steps SB36 to SB39 perform the same processes as steps SB21 to SB24 and steps SB26 to SB29 of the flowchart shown in FIG.
  • the display control device 40B determines whether or not the face position of the driver at the time of leaving is lower than the face position of the driver at the time of storage by the predetermined amount or more (step SB35).
  • the display control device 40B compares the first driver information and the second driver information, and if it determines that the driver's face position at the time of leaving is not lower than that at the time of storage by the predetermined amount or more (No at step SB35), the process ends.
  • the display control device 40B compares the first driver information and the second driver information, and if it determines that the driver's face position at the time of leaving is lower than that at the time of storage by the predetermined amount or more (Yes at step SB35), the process proceeds to step SB36.
  • when the comparison unit 45B determines that the driver's face position at the time of leaving is lower than the driver's face position at the time of storage, the overhead image 100 generated by the overhead image generation unit 44B is displayed on the display panel 31.
  • the overhead view video 100 can be appropriately displayed according to the change in the face position of the driver at the time of leaving. As a result, even when the driver's face position is lower than at the time of storage and the visible range from the driver's seat changes, the driver can appropriately check the area around the vehicle V1 by the overhead image 100 in addition to visual or mirror confirmation.
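For illustration only, the face-position determination of step SB35 can be sketched in Python with the 5 cm example value given in the description; the function name and the centimeter-valued height signal are assumptions.

```python
PREDETERMINED_DROP_CM = 5.0  # example vertical difference given in the description

def face_position_lowered(face_height_at_storage_cm, face_height_at_leaving_cm):
    """Step SB35 (sketch): True when the driver's face position at the time of
    leaving is lower than at the time of storage by the predetermined amount or more."""
    drop = face_height_at_storage_cm - face_height_at_leaving_cm
    return drop >= PREDETERMINED_DROP_CM
```

A face position that is higher at leaving, or lower by less than the threshold, does not trigger the display, matching the No branch of step SB35.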
  • FIG. 26 is a block diagram showing a configuration example of a display control system according to the twelfth embodiment.
  • the basic configuration of the display control system 1C is the same as that of the display control system 1 of the first embodiment.
  • the display control system 1C appropriately displays the overhead view image 100, for example, when the visibility (surrounding confirmation condition) around the vehicle V1 at the time of leaving is lower than the visibility around the vehicle V1 at the time of storage.
  • the display control system 1C will be described with reference to FIG.
  • the display control system 1C includes a front camera (shooting unit) 11, a rear camera (shooting unit) 12, a left side camera (shooting unit) 13, a right side camera (shooting unit) 14, an illuminance sensor 21C, a display panel (display unit) 31, and a display control device 40C.
  • the illuminance sensor 21C is disposed in front of or to the left or right of the vehicle V1, and measures the illuminance of the road surface in front of the vehicle V1.
  • the illuminance sensor 21C outputs the measurement result to the visibility information acquisition unit (information acquisition unit) 43C of the display control device 40C.
  • the display control device 40C includes a video data acquisition unit 41, a host vehicle information acquisition unit 42, a visibility information acquisition unit 43C, an overhead image generation unit 44C, a comparison unit 45C, a display control unit 48C, and a storage unit 49 that is an internal memory.
  • the display control device 40C may be configured of one or more devices.
  • the host vehicle information acquisition unit 42 outputs the acquired vehicle information to the visibility information acquisition unit 43C and the display control unit 48C.
  • the visibility information acquisition unit 43C acquires first visibility information (first information), which is information measured by the illuminance sensor 21C and indicating the visibility around the vehicle V1 when the vehicle V1 moves backward and is stored, and second visibility information (second information), which is information indicating the visibility around the vehicle V1 when the vehicle V1 advances from the storage state and leaves. More specifically, the visibility information acquisition unit 43C acquires, as the first visibility information, the illuminance information measured by the illuminance sensor 21C when the vehicle V1 moves backward and is stored.
  • the visibility information acquisition unit 43C causes the storage unit 49 to store the first visibility information at the time of storage.
  • the visibility information acquisition unit 43C acquires the illuminance information measured by the illuminance sensor 21C as second visibility information.
  • the determination that the vehicle V1 has moved backward and been stored, and the determination that the vehicle V1 has advanced and left the storage state, are made based on the gear operation information of the vehicle V1 acquired from the host vehicle information acquisition unit 42 and the on/off information of the engine.
  • the visibility information acquisition unit 43C outputs the second visibility information at the time of leaving to the comparison unit 45C.
  • the first visibility information is illuminance information of the road surface in front of or to the left or right of the vehicle V1 when the vehicle V1 moves backward and is stored.
  • the second visibility information is illuminance information of the road surface in front of or to the left or right of the vehicle V1 when the vehicle V1 moves forward from the storage state and exits.
  • when the second visibility information is from the illuminance sensor 21C disposed in front of the vehicle V1, the illuminance information measured while the headlights of the vehicle V1 and the like are not lit is used.
  • when the second visibility information is from the illuminance sensor 21C disposed on the left or right side of the vehicle V1, the illuminance information before leaving is used regardless of the state of the headlights of the vehicle V1 and the like.
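For illustration only, the storage/leaving determination described above (gear operation information plus engine on/off information) can be sketched as follows; the event function names and the string-valued gear signal are assumptions made for the example, not part of the description.

```python
def stored_event(gear, engine_on):
    """Sketch: vehicle V1 is judged to have been stored when it moved backward
    into the space (reverse gear) and the engine has been turned off."""
    return gear == "reverse" and not engine_on

def leaving_event(gear, engine_on, currently_stored):
    """Sketch: vehicle V1 is judged to be leaving when, from the stored state,
    the engine is on and a forward gear is selected."""
    return currently_stored and engine_on and gear == "drive"
```

The first visibility information would be captured on a `stored_event`, and the second on a `leaving_event`, mirroring the timing described for the visibility information acquisition unit 43C.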
  • the overhead video generation unit 44C generates the overhead video 100 when it is determined that the second visibility information when the vehicle V1 is leaving indicates lower visibility than the first visibility information.
  • the overhead view video generation unit 44C generates the overhead view video 100 when the comparison unit 45C determines that the illuminance at the time of leaving is lower than the illuminance at the time of storage. More specifically, in the present embodiment, the overhead image generation unit 44C generates the overhead image 100 when the comparison unit 45C determines that the illuminance at the time of storage is daytime illuminance and the illuminance at the time of leaving is nighttime illuminance.
  • the comparison unit 45C compares the first visibility information and the second visibility information, and compares the visibility around the vehicle V1 at the time of storage with that at the time of leaving.
  • the comparison unit 45C compares the illuminance around the vehicle V1 at the time of storage with the illuminance around the vehicle V1 at the time of leaving, and determines whether the illuminance at the time of leaving is lower than the illuminance at the time of storage. More specifically, the comparison unit 45C compares the illuminance around the vehicle V1 at the time of storage with the illuminance around the vehicle V1 at the time of leaving, and determines whether the illuminance at the time of storage is daytime illuminance and the illuminance at the time of leaving is nighttime illuminance.
  • the daytime illuminance is 2000 lux or more, and the nighttime illuminance is 500 lux or less.
  • if the comparison unit 45C determines that the second visibility information when the vehicle V1 is leaving indicates lower visibility than the first visibility information, the display control unit 48C causes the overhead image 100 generated by the overhead view video generation unit 44C to be displayed on the display panel 31.
  • the display control unit 48C causes the display panel 31 to display the overhead view image 100 when the comparison unit 45C determines that the illuminance at the time of leaving is lower than the illuminance at the time of storage. More specifically, the display control unit 48C causes the overhead image 100 to be displayed on the display panel 31 when the comparison unit 45C determines that the illuminance at the time of storage is daytime illuminance and the illuminance at the time of leaving is nighttime illuminance.
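For illustration only, the day-to-night determination of the comparison unit 45C can be sketched with the example thresholds named in the description (daytime: 2000 lux or more; nighttime: 500 lux or less); the function name and lux-valued signals are assumptions.

```python
DAYTIME_LUX = 2000.0   # daytime illuminance: 2000 lux or more (example value)
NIGHTTIME_LUX = 500.0  # nighttime illuminance: 500 lux or less (example value)

def day_to_night(lux_at_storage, lux_at_leaving):
    """Step SC24 (sketch): storage illuminance is daytime illuminance and
    leaving illuminance is nighttime illuminance."""
    return lux_at_storage >= DAYTIME_LUX and lux_at_leaving <= NIGHTTIME_LUX
```

A vehicle stored at night and retrieved at night, or stored in daytime and retrieved in daytime, does not trigger the overhead image, matching the No branch of step SC24.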
  • FIG. 27 is a flowchart showing an example of the flow of processing in the display control device of the display control system according to the twelfth embodiment.
  • FIG. 28 is a flowchart illustrating another example of the flow of processing in the display control device of the display control system according to the twelfth embodiment.
  • steps SC11 to SC14 are similar to the processes of steps S11 to S14 of the flowchart shown in FIG.
  • the display control device 40C causes the visibility information acquisition unit 43C to acquire the first visibility information and causes the storage unit 49 to store the first visibility information (Step SC15). Then, the display control device 40C ends the process.
  • step SC21, step SC22, and step SC25 to step SC28 perform the same processes as step S21, step S22, and step S25 to step S28 of the flowchart shown in FIG.
  • the display control device 40C acquires the second visibility information by the visibility information acquisition unit 43C and compares it with the first visibility information (step SC23). More specifically, the display control device 40C acquires the second visibility information by the visibility information acquisition unit 43C. The display control device 40C causes the comparison unit 45C to compare the first visibility information with the second visibility information. The display control device 40C determines the presence or absence of a change in visibility around the vehicle V1 between the time of storage and the time of leaving by the comparison by the comparison unit 45C. The display control device 40C proceeds to step SC24.
  • the display control device 40C determines whether the illuminance at the time of storage is daytime illuminance and the illuminance at the time of leaving is nighttime illuminance (step SC24). In the present embodiment, the display control device 40C ends the process when it determines that the illuminance at the time of storage is not daytime illuminance or the illuminance at the time of leaving is not nighttime illuminance (No in step SC24). In this case, the overhead image 100 is not displayed. The display control device 40C proceeds to step SC25 when it determines that the illuminance at the time of storage is daytime illuminance and the illuminance at the time of leaving is nighttime illuminance (Yes in step SC24).
  • the overhead image 100 generated by the overhead image generating unit 44C is displayed on the display panel 31.
  • the overhead image 100 can be appropriately displayed according to the change in the illuminance around the vehicle V1 at the time of leaving.
  • the driver can appropriately check the area around the vehicle V1 with the overhead image 100 in addition to visual or mirror confirmation.
  • the overhead view video 100 is displayed only when it is necessary to display it according to the change in the illuminance around the vehicle V1 at the time of leaving. As described above, according to the present embodiment, it is possible to suppress the display of the overhead view video when there is no need to display it or when it is desired to check the route by navigation.
  • FIG. 29 is a flowchart showing an example of the flow of processing in the display control device of the display control system according to the thirteenth embodiment.
  • the basic configuration of the display control system 1C is the same as that of the display control system 1C of the twelfth embodiment.
  • the display control system 1C is different from the twelfth embodiment in that the overhead image 100 is displayed when the illuminance at the time of storage and the illuminance at the time of leaving differ by a predetermined illuminance difference or more.
  • the overhead image generation unit 44C generates the overhead image 100 when the comparison unit 45C determines that the illuminance at the time of storage and the illuminance at the time of leaving differ by the predetermined illuminance difference or more.
  • the comparison unit 45C compares the illuminance around the vehicle V1 at the time of storage with the illuminance around the vehicle V1 at the time of leaving, and determines whether they differ by a predetermined illuminance difference or more. For example, the predetermined illuminance difference is 5000 lux.
  • the display control unit 48C causes the display panel 31 to display the overhead image 100 when the comparison unit 45C determines that the illuminance at the time of storage and the illuminance at the time of leaving differ by the predetermined illuminance difference or more.
  • steps SC31 to SC33 and steps SC35 to SC38 perform the same processes as steps SC21 to SC23 and steps SC25 to SC28 of the flowchart shown in FIG.
  • the display control device 40C determines, as a result of comparison by the comparison unit 45C, whether the illuminance at the time of storage and the illuminance at the time of leaving differ by the predetermined illuminance difference or more (step SC34). In the present embodiment, the display control device 40C ends the process when it determines that they do not differ by the predetermined illuminance difference or more (No in step SC34). In this case, the overhead image 100 is not displayed.
  • the predetermined illuminance difference referred to here is a drop from a high illuminance state to a low illuminance state.
  • when the comparison unit 45C determines that the illuminance at the time of storage and the illuminance at the time of leaving differ by the predetermined illuminance difference or more, the overhead image 100 generated by the overhead image generator 44C is displayed on the display panel 31.
  • the overhead image 100 can be appropriately displayed when the illuminance at the time of storage and the illuminance at the time of leaving differ by the predetermined illuminance difference or more.
  • the driver can appropriately confirm the periphery of the vehicle V1 with the overhead image 100 in addition to visual or mirror confirmation.
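For illustration only, the step SC34 determination of the thirteenth embodiment, including the note that only a drop from a high to a low illuminance state counts, can be sketched with the 5000 lux example value; the function name and lux-valued signals are assumptions.

```python
PREDETERMINED_DIFF_LUX = 5000.0  # example predetermined illuminance difference

def illuminance_dropped(lux_at_storage, lux_at_leaving):
    """Step SC34 (sketch): True only for a drop of at least the predetermined
    difference from a high to a low illuminance state; a rise between storage
    and leaving does not trigger the overhead image."""
    return (lux_at_storage - lux_at_leaving) >= PREDETERMINED_DIFF_LUX
```

Using the signed difference rather than an absolute value encodes the direction requirement: a vehicle stored at night and retrieved in bright daytime yields a negative difference and does not trigger the display.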
  • FIG. 30 is a flow chart showing an example of the flow of processing in the display control apparatus of the display control system according to the fourteenth embodiment.
  • the display control system 1C is different from the twelfth embodiment in that the overhead image 100 is displayed when the visibility is lower than at the time of storage due to the precipitation at the time of leaving.
  • the visibility information acquisition unit 43C acquires first visibility information that is information indicating the weather at the current position at the time of storage and second visibility information that is information indicating the weather at the current position at the time of leaving. For example, the visibility information acquisition unit 43C may acquire the weather at the current position based on at least one of detection information from a precipitation sensor disposed in the vehicle V1, operation information of a wiper, and weather information acquired via a communication network. Alternatively, for example, the visibility information acquisition unit 43C may perform image processing on the peripheral video data acquired by the video data acquisition unit 41 to acquire the weather at the current position.
  • the overhead image generation unit 44C generates the overhead image 100 when the comparison unit 45C determines that the visibility is lower than that at the time of storage due to a change in weather. In the present embodiment, the overhead image generation unit 44C generates the overhead image 100 when the comparison unit 45C determines that the visibility at the time of leaving is lower than that at the time of storage due to precipitation.
  • the comparison unit 45C compares the weather at the current position at the time of storage with the weather at the current position at the time of leaving, and determines whether the visibility is lower than at the time of storage due to a change in the weather. In the present embodiment, the comparison unit 45C compares the weather at the current position at the time of storage with the weather at the current position at the time of leaving, and determines whether the visibility at the time of leaving is lower than at the time of storage due to precipitation.
  • the display control unit 48C causes the display panel 31 to display the overhead view image 100 when the comparison unit 45C determines that the visibility is lower than that at the time of storage due to a change in weather.
  • the display control unit 48C causes the display panel 31 to display the overhead image 100 when the comparison unit 45C determines that the visibility is lower than that at the time of storage due to the precipitation at the time of leaving.
  • steps SC41 to SC43 and steps SC45 to SC48 perform the same processes as steps SC21 to SC23 and steps SC25 to SC28 of the flowchart shown in FIG.
  • the display control device 40C determines whether the visibility is lower than that at the time of storage due to the precipitation at the time of leaving (step SC44).
  • if the display control device 40C determines that the visibility is not lowered from the time of storage by the precipitation at the time of leaving (No at step SC44), the process ends. In this case, the overhead image 100 is not displayed.
  • if the display control device 40C determines that the visibility is lower than that at the time of storage due to the precipitation at the time of leaving (Yes at step SC44), the process proceeds to step SC45.
  • when the comparison unit 45C determines that the visibility is lower than that at the time of storage due to the precipitation at the time of leaving, the overhead image 100 generated by the overhead image generation unit 44C is displayed on the display panel 31.
  • the present embodiment can appropriately display the overhead view video 100 when the visibility is lower than that at the time of storage due to the precipitation at the time of leaving.
  • the driver can appropriately confirm the surroundings of the vehicle V1 with the overhead image 100 in addition to visual or mirror confirmation.
  • FIG. 31 is a flowchart showing an example of the flow of processing in the display control device of the display control system according to the fifteenth embodiment.
  • the display control system 1C is different from the twelfth embodiment in that the overhead image 100 is displayed when the visibility is lower than at the time of storage due to the freezing of the window glass at the time of leaving.
  • the window glass includes window glass of at least one of front, rear, left and right of the vehicle V1.
  • Window glass includes a windshield.
  • the window glass refers to a portion through which the outside of the vehicle V1 is visible.
  • the visibility information acquisition unit 43C acquires first visibility information, which is information indicating the frozen state of the window glass of the vehicle V1 at the time of storage, and second visibility information, which is information indicating the frozen state of the window glass of the vehicle V1 at the time of leaving.
  • the visibility information acquisition unit 43C may acquire information indicating the frozen state of the window glass based on at least one of the operation information of the defroster and the defogger disposed in the vehicle V1, the temperature information of the outside air measured by a thermometer, and weather information acquired via a communication network.
  • the visibility information acquisition unit 43C may perform image processing on the peripheral video data acquired by the video data acquisition unit 41 to acquire information indicating a frozen state of the window glass.
  • the overhead video generation unit 44C generates the overhead image 100 when the comparison unit 45C determines that the visibility is lower than that at the time of storage due to the freezing of the window glass at the time of leaving.
  • the comparing unit 45C determines whether the visibility is lower than that at the time of storage due to the freezing of the window glass at the time of leaving.
  • the display control unit 48C causes the display panel 31 to display the overhead image 100 when the comparison unit 45C determines that the visibility is lower than that at the time of storage due to the freezing of the window glass at the time of leaving.
  • step SC51 to step SC53 and step SC55 to step SC58 perform the same processes as step SC21 to step SC23 and step SC25 to step SC28 of the flowchart shown in FIG.
  • the display control device 40C determines whether the visibility is lower than that at the time of storage due to the freezing of the window glass at the time of leaving (step SC54). If the display control device 40C determines that the visibility is not lower than that at the time of storage due to the freezing of the window glass at the time of leaving (No at step SC54), the process ends. In this case, the overhead image 100 is not displayed. If the display control device 40C determines that the visibility is lower than that at the time of storage due to the freezing of the window glass at the time of leaving (Yes at step SC54), the process proceeds to step SC55.
  • when the comparison unit 45C determines that the visibility is lower than that at the time of storage due to the freezing of the window glass at the time of leaving, the overhead image 100 generated by the overhead image generating unit 44C is displayed on the display panel 31.
  • the present embodiment can appropriately display the overhead view image 100 when the visibility is lower than that at the time of storage due to the freezing of the window glass at the time of leaving.
  • the driver can appropriately confirm the periphery of the vehicle V1 with the overhead image 100 in addition to the visual or mirror confirmation.
  • even when the defroster and the defogger are operated, it may be difficult to improve the visibility of the glass on the side of the rear seat depending on conditions such as the operating time and temperature.
  • in such a case, according to the present embodiment, since the overhead view video 100 is displayed, the periphery of the vehicle V1 can be appropriately confirmed.
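For illustration only, the step SC54 determination of the fifteenth embodiment can be sketched as follows; the function name and the boolean frozen-state signals are assumptions made for the example.

```python
def visibility_lowered_by_frozen_glass(frozen_at_storage, frozen_at_leaving):
    """Step SC54 (sketch): the window glass froze between the time of storage
    and the time of leaving, so the visibility at leaving is lower than at storage."""
    return (not frozen_at_storage) and frozen_at_leaving
```

Glass that was already frozen at storage, or remains unfrozen, does not trigger the overhead image 100, matching the No branch of step SC54.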
  • FIG. 32 is a block diagram showing a configuration example of a display control system according to the sixteenth embodiment.
  • FIG. 33 is a view showing an example of a bird's-eye view image generated by the display control apparatus of the display control system according to the sixteenth embodiment.
  • the display control system 1C2 is different from the twelfth embodiment in that the adjacent vehicle which is the detected obstacle is highlighted in the overhead image 100 to be generated.
  • the display control system 1C2 further includes a sensor unit 50C2.
  • Sensor unit 50C2 includes a plurality of sensors installed around vehicle V1.
  • the sensor unit 50C2 can detect an obstacle in all directions adjacent to the vehicle V1.
  • the sensor unit 50C2 detects an adjacent vehicle present in an adjacent parking area as an obstacle adjacent to the vehicle V1.
  • the sensor unit 50C2 is configured in the same manner as the sensor unit 21 of the first embodiment.
  • the display control device 40C2 further includes an adjacent information acquisition unit 46C2.
  • the adjacent information acquisition unit 46C2 acquires obstacle information from the sensor unit 50C2 when the vehicle V1 advances from the storage state and exits.
  • the adjacent information acquisition unit 46C2 causes the storage unit 49C2 to store the obstacle information at the time of leaving, and outputs the obstacle information to the comparison unit 45C2.
  • when the comparison unit 45C2 determines that the illuminance when the vehicle V1 is leaving is lower than that at the time of storage, the combination processing unit 443C2 of the overhead image generation unit 44C2 generates the overhead image 100 in which the detected adjacent vehicle is emphasized. For example, the combination processing unit 443C2 highlights the detected adjacent vehicle by coloring it or surrounding it with a thick line. Alternatively, in response to the determination result by the comparison unit 45C2, the display control unit 48C2 may cause the combining processing unit 443C2 to generate the overhead view image 100 in which the detected adjacent vehicle is emphasized.
  • the display control unit 48C2 causes the display panel 31 to display the overhead image 100 in which the detected adjacent vehicle is highlighted.
  • the overhead view image 100 will be described with reference to FIG.
  • the detected adjacent vehicle Ve is highlighted and displayed.
  • when the comparison unit 45C2 determines that the illuminance when the vehicle V1 is leaving is lower than that at the time of storage, the overhead image 100 in which the detected adjacent vehicle is highlighted is displayed on the display panel 31.
  • the present embodiment makes it easy to grasp the detected adjacent vehicle when the illuminance has decreased between the time of storage and the time of leaving. According to the present embodiment, since it is easy to recognize the adjacent vehicle in the overhead view video 100, the points the driver should note at the time of leaving can be easily confirmed even if the illuminance is lowered.
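For illustration only, the emphasis applied by the combination processing unit 443C2 can be sketched as a list of overlay elements for the overhead image; the element structure, highlight style keys, and function name are assumptions, since the description states only that a detected adjacent vehicle is colored or surrounded with a thick line when the illuminance has dropped.

```python
def compose_overhead_elements(base_elements, adjacent_vehicles, illuminance_dropped):
    """Sketch of the combining processing: when the illuminance at leaving has
    dropped from that at storage, each detected adjacent vehicle is emphasized,
    e.g. colored and surrounded with a thick outline."""
    elements = list(base_elements)
    for vehicle in adjacent_vehicles:
        overlay = {"id": vehicle}
        if illuminance_dropped:
            overlay["highlight"] = {"outline": "thick", "fill": "emphasis_color"}
        elements.append(overlay)
    return elements
```

When the illuminance has not dropped, the adjacent vehicle is still composed into the image but without emphasis, matching the normal overhead image 100.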
  • FIG. 34 is a view showing an example of a bird's-eye view image generated by the display control apparatus of the display control system according to the seventeenth embodiment.
  • the basic configuration of the display control system 1C2 is the same as that of the display control system 1C2 of the sixteenth embodiment.
  • the display control system 1C2 is different from the sixteenth embodiment in that a notification icon 120 for notifying of the direction in which the detected adjacent vehicle is present is displayed in the overhead image 100 to be generated.
  • when the comparison unit 45C2 determines that the illuminance when the vehicle V1 is leaving is lower than that at the time of storage, the combining processing unit 443C2 generates the overhead image 100 displaying the notification icon 120 indicating the direction in which the detected adjacent vehicle is present. For example, the combining processing unit 443C2 causes the notification icon 120 to be displayed in an arrow shape.
  • the display control unit 48C2 causes the display panel 31 to display the overhead image 100 including the notification icon 120 indicating the direction of the detected adjacent vehicle when the comparison unit 45C2 determines that the illuminance when the vehicle V1 is leaving is lower than that at the time of storage.
  • the overhead view image 100 will be described with reference to FIG. In this embodiment, it is assumed that an adjacent vehicle on the right front outside the display range is detected.
  • a notification icon 120 indicating a direction in which the detected adjacent vehicle is present is displayed.
  • a notification icon indicating the direction in which the detected adjacent vehicle exists is displayed on the display panel 31.
  • the direction in which the adjacent vehicle exists can be easily grasped.
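For illustration only, deriving the arrow direction of the notification icon 120 from the detected adjacent vehicle's position relative to the vehicle V1 can be sketched as follows; the coordinate convention (+x = right, +y = front) and the function name are assumptions not given in the description.

```python
def notification_icon_direction(own_x, own_y, adjacent_x, adjacent_y):
    """Sketch: map the adjacent vehicle's position relative to vehicle V1 to a
    coarse direction label for the arrow-shaped notification icon 120."""
    lateral = "right" if adjacent_x >= own_x else "left"
    longitudinal = "front" if adjacent_y >= own_y else "rear"
    return f"{longitudinal}-{lateral}"
```

In the example of FIG. 34, an adjacent vehicle detected at the right front outside the display range would yield "front-right", and the icon would point in that direction.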
  • FIG. 35 is a view showing an example of a bird's-eye view image generated by the display control apparatus of the display control system according to the eighteenth embodiment.
  • The basic configuration of the display control system 1C2 is the same as that of the display control system 1C2 of the sixteenth embodiment.
  • The display control system 1C2 differs from the twelfth embodiment in that the generated overhead image 100 has its display range changed so that the detected adjacent vehicle is included in the display range.
  • The clipping processing unit 442 clips, from the peripheral video data subjected to the viewpoint conversion processing, a clipping range such that the detected adjacent vehicle is included in the display range.
  • The detected adjacent vehicle may be only partially included in the display range of the overhead view video 100.
  • When the comparison unit 45C2 determines that the illuminance at the time of exit of the vehicle V1 is lower than at the time of storage, the display control unit 48C2 displays, on the display panel 31, the overhead view image 100 whose display range has been changed to include the detected adjacent vehicle.
  • The overhead view image 100 will be described with reference to FIG. 35. In the present embodiment, it is assumed that the adjacent vehicle Vc at the right front is detected. As the overhead view video 100, an overhead view video 100 whose display range is widened forward is displayed so that the detected adjacent vehicle Vc appears.
  • In this way, the overhead image 100 whose display range is changed so that the detected adjacent vehicle is displayed is shown on the display panel 31.
  • When it is determined that the illuminance at the time of exit of the vehicle V1 is lower than at the time of storage, the points that have changed since the time of storage are easy to recognize, so the driver can easily confirm the points to be aware of at the time of exit.
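The clipping-range change described for this embodiment can be sketched as a simple rectangle union: the clip rectangle is expanded just enough to cover the detected vehicle's bounding box. This is a hedged illustration; the coordinate convention and the names are assumptions, not from the patent.

```python
def widen_clip_range(clip, vehicle_box):
    """Expand a clipping rectangle (x0, y0, x1, y1) so the detected
    adjacent vehicle's bounding box falls inside the display range.
    Coordinates are in the viewpoint-converted image plane."""
    x0, y0, x1, y1 = clip
    vx0, vy0, vx1, vy1 = vehicle_box
    # Take the union of the two rectangles: the clip grows only on the
    # sides where the vehicle lies outside the current display range.
    return (min(x0, vx0), min(y0, vy0), max(x1, vx1), max(y1, vy1))
```

When the vehicle box is already inside the clip, the range is returned unchanged, which matches the behavior of widening the display only toward the detected vehicle.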
  • Each component of the illustrated display control system is functionally conceptual, and may not necessarily be physically configured as illustrated. That is, the specific form of each device is not limited to the illustrated one, and all or a part thereof is functionally or physically dispersed or integrated in an arbitrary unit according to the processing load and use condition of each device, etc. May be
  • the configuration of the display control system is realized, for example, as software or a program loaded in a memory.
  • the above embodiment has been described as a functional block realized by cooperation of these hardware or software. That is, these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof.
  • FIG. 10 is a diagram for explaining a parking area of a home parking lot.
  • The parking section P1 is a space for parking one vehicle V1.
  • The sensor unit 21 detects an obstacle present in the parking section P1, such as a bicycle B, as an obstacle adjacent to the vehicle V1.
  • The adjacent information acquisition unit 43 acquires information on obstacles in the parking section P1 as the first obstacle information when the vehicle V1 reverses into storage, and acquires information on obstacles in the parking section P1 as the second obstacle information when the vehicle V1 moves forward out of storage at the time of exit.
  • The display control unit 48 causes the display panel 31 to display the overhead image 100 when the comparison unit 45 determines, at the time of exit of the vehicle V1, that the obstacles in the parking section P1 have increased.
  • The present embodiment is not limited to vehicles in adjacent parking sections, and can display the overhead image when obstacles around the vehicle have increased.
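The comparison performed here — first obstacle information at storage versus second obstacle information at exit — amounts to checking whether any obstacle appears in the second set but not the first. A minimal sketch, assuming obstacles are identified by hypothetical labels (the patent does not specify a representation):

```python
def obstacles_increased(first_obstacle_info, second_obstacle_info):
    """True when an obstacle present at exit was absent at storage,
    e.g. a bicycle placed in the parking section P1 while parked."""
    # Set difference: anything in the exit-time set that was not
    # recorded at storage counts as an increase.
    return bool(set(second_obstacle_info) - set(first_obstacle_info))
```

An obstacle that was removed while parked does not trigger the display in this sketch, matching the "obstacles have increased" condition.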
  • FIG. 19 is a diagram for explaining a parking section of a home parking lot.
  • The sensor unit 21 detects the position of an obstacle present in the parking section P1, such as the bicycle B, as the position of an obstacle adjacent to the vehicle V1.
  • The adjacent information acquisition unit 43 acquires the position information of obstacles in the parking section P1 as the first obstacle position information when the vehicle V1 reverses into storage, and acquires the position information of obstacles in the parking section P1 as the second obstacle position information when the vehicle V1 moves forward out of storage at the time of exit.
  • The display control unit 48 causes the display panel 31 to display the overhead view image 100 when the comparison unit 45 determines, at the time of exit of the vehicle V1, that the position of an obstacle in the parking section P1 has changed.
  • The present embodiment is not limited to vehicles in adjacent parking sections, and can display the overhead image when the position of an obstacle around the vehicle changes.
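The position comparison can be sketched as a per-obstacle displacement check between the first and second obstacle position information. This is an assumed illustration: the 0.3 m threshold, the (x, y) tuple layout, and the label keys are not specified in the patent.

```python
def obstacle_position_changed(first_pos, second_pos, tol_m=0.3):
    """True when any obstacle tracked at both storage and exit has
    moved farther than tol_m metres. Positions are (x, y) tuples in
    metres keyed by an obstacle label (hypothetical representation)."""
    for obstacle, (x0, y0) in first_pos.items():
        if obstacle in second_pos:
            x1, y1 = second_pos[obstacle]
            # Euclidean displacement between storage and exit positions.
            if ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 > tol_m:
                return True
    return False
```

The tolerance absorbs sensor noise so that a stationary bicycle is not reported as having moved.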
  • FIG. 11 is a diagram for explaining a parking section for parallel parking, and shows the state at the time of storage.
  • FIG. 12 is a diagram for explaining a parking section for parallel parking, and shows the state at the time of exit.
  • the parking section PF is adjacent to the rear side of the parking section P1.
  • the parking section PG is adjacent to the front side of the parking section P1.
  • At the time of storage, the adjacent vehicle Vf is present in the parking section PF, and no adjacent vehicle is present in the parking section PG.
  • At the time of exit, the adjacent vehicle Vf is present in the parking section PF, and the adjacent vehicle Vg is present in the parking section PG.
  • the sensor unit 21 detects an adjacent vehicle present in the parking section PF and the parking section PG as an obstacle adjacent to the vehicle V1.
  • It is determined, based on map information of a navigation system (not shown), that the parking section P1 is a section for parallel parking.
  • When the comparison unit 45 determines that the number of adjacent vehicles in the parking section PF or the parking section PG has increased at the time of exit of the vehicle V1, the display control unit 48 displays the overhead image 100 on the display panel 31.
  • This embodiment can detect an adjacent vehicle in the appropriate direction and display the bird's-eye view image when the number of adjacent vehicles increases.
  • FIG. 20 is a diagram for explaining a parking section for parallel parking.
  • At the time of exit, the position of the adjacent vehicle Vf has changed in the direction away from the vehicle V1 compared with the time of storage.
  • At the time of exit, the position of the adjacent vehicle Vg has changed in the direction closer to the vehicle V1 compared with the time of storage.
  • When the comparison unit 45 determines that the position of an adjacent vehicle in the parking section PF or the parking section PG has changed at the time of exit of the vehicle V1, the display control unit 48 displays the overhead view image 100 on the display panel 31.
  • This embodiment can detect an adjacent vehicle in the appropriate direction and display the bird's-eye view image when the position of an adjacent vehicle changes.
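The movement of the adjacent vehicles Vf and Vg can be sketched as classifying the change in gap distance between storage and exit. The function name, metre units, and the 0.2 m noise tolerance are assumptions for illustration only.

```python
def gap_change(dist_at_storage_m, dist_at_exit_m, tol_m=0.2):
    """Classify how the gap to an adjacent vehicle changed between the
    time of storage and the time of exit; tol_m absorbs sensor noise
    (assumed value, not from the patent)."""
    delta = dist_at_exit_m - dist_at_storage_m
    if delta < -tol_m:
        return "closer"   # like Vg: a case for displaying the overhead video
    if delta > tol_m:
        return "farther"  # like Vf: vehicle moved away
    return "unchanged"
```

Either classification other than "unchanged" would count as a position change that triggers the overhead view image 100.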
  • The adjacent information acquisition unit 43 may perform image processing on the video data acquired by the video data acquisition unit 41 to recognize obstacles around the vehicle V1 from the photographed objects, and acquire the result as the first obstacle information and the second obstacle information, or as the first obstacle position information and the second obstacle position information.
  • Although the first obstacle information or the first obstacle position information has been described as being acquired when the vehicle V1 reverses into storage, the present invention is not limited thereto.
  • The first obstacle information or the first obstacle position information may be acquired as information on obstacles around the vehicle V1 at the time of completion of storage, based on information acquired during the parking operation in which the vehicle V1 reverses into storage.
  • Although the driver information acquisition unit 43B has been described as acquiring the first driver information and the second driver information from the video data captured by the driver seat camera 21B, the present invention is not limited to this.
  • Whether the driver is different may be determined based on identification information acquired from an electronic device, such as a smart key or an information terminal, that the driver carries and that holds identification information for identifying each individual. More specifically, for example, the electronic device brought into the vehicle by the driver is used as a detection unit for the driver's seat.
  • The driver information acquisition unit 43B may acquire the identification information from the electronic device, acquiring the identification information at the time of storage as the first driver information and the identification information at the time of exit as the second driver information.
  • Whether the driver is different may be determined based on driver's seat information including at least one of weight information of the driver's seat and position information indicating the preset seat position.
  • More specifically, for example, a weight sensor that detects driver's seat information including the weight acting on the driver's seat when the driver is seated is used as the driver's seat detection unit.
  • The driver information acquisition unit 43B may acquire the driver's seat information at the time of storage, detected by the weight sensor, as the first driver information, and the driver's seat information at the time of exit as the second driver information.
  • Alternatively, a sensor that detects position information including the preset position of the driver's seat is used as the driver's seat detection unit.
  • The driver information acquisition unit 43B may acquire the position information of the driver's seat at the time of storage, detected by the sensor, as the first driver information, and the position information of the driver's seat at the time of exit as the second driver information.
  • The comparison unit 45B compares the driver's seat information at the time of storage with the driver's seat information at the time of exit, and determines whether the driver at the time of storage and the driver at the time of exit are the same.
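The comparison unit 45B's judgment from driver's seat information could be sketched like this. The 2 kg weight tolerance and the (weight, preset position) tuple layout are assumptions for illustration; the patent specifies only that weight and/or preset position are compared.

```python
def same_driver(seat_at_storage, seat_at_exit, weight_tol_kg=2.0):
    """Judge whether the driver at exit matches the driver at storage
    from (seat weight in kg, seat preset position) pairs."""
    weight0, preset0 = seat_at_storage
    weight1, preset1 = seat_at_exit
    # Weight may vary slightly (clothing, posture), so compare within a
    # tolerance; the preset position must match exactly.
    return abs(weight1 - weight0) <= weight_tol_kg and preset0 == preset1
```

When this returns False, the drivers are judged to differ, which is the condition under which the overhead image is shown to the new driver.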
  • Position information of the rearview mirror may be used as information on the driver's face position. More specifically, for example, a sensor that detects the position information of the rearview mirror is used as the driver's seat detection unit.
  • The driver information acquisition unit 43B may acquire the position information of the rearview mirror at the time of storage, detected by the sensor, as the first driver information, and the position information of the rearview mirror at the time of exit as the second driver information.
  • Although the first driver information has been described as being acquired when the vehicle V1 reverses into storage, the present invention is not limited to this.
  • The first driver information may be acquired at the time of completion of storage based on information acquired during the parking operation in which the vehicle V1 reverses into storage.
  • Although the visibility information acquisition unit 43C has been described as acquiring the first visibility information and the second visibility information from the illuminance sensor 21C, the present invention is not limited to this.
  • The visibility information acquisition unit 43C may perform image processing on the video data acquired by the video data acquisition unit 41, recognize the illuminance around the vehicle V1, and acquire the first visibility information and the second visibility information.
  • When it is determined that the illuminance at the time of exit of the vehicle V1 is lower than at the time of storage, the overhead video generation unit 44C may generate an overhead video with enhanced visibility based on image data obtained by photographing the surroundings of the vehicle V1 with an infrared light camera (not shown).
  • When it is determined that the illuminance at the time of exit of the vehicle V1 is lower than at the time of storage, the overhead video generation unit 44C may perform image processing on the video data acquired by the video data acquisition unit 41 to generate an overhead video with enhanced visibility.
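One simple form of the image processing mentioned above — enhancing visibility when the exit-time illuminance is lower — is a brightness gain applied to each pixel. This is a sketch under assumed names and an assumed gain of 1.8; the patent does not specify a particular enhancement method.

```python
def enhance_visibility(frame, illum_at_storage, illum_at_exit, gain=1.8):
    """Brighten a grayscale frame (list of pixel rows, values 0-255)
    only when the illuminance at exit is lower than at storage."""
    if illum_at_exit >= illum_at_storage:
        return frame  # visibility not degraded: return the frame as-is
    # Multiply each pixel by the gain, clamping to the 8-bit range.
    return [[min(255, int(px * gain)) for px in row] for row in frame]
```

In a real system this would run on the camera video before viewpoint conversion, or gamma correction and contrast stretching could be used instead of a flat gain.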
  • In the above description, the overhead view video 100 is displayed when the visibility at the time of exit is lower than at the time of storage due to precipitation, but the overhead view video 100 may also be generated and displayed when the visibility is reduced due to the occurrence of fog.
  • Although the first visibility information has been described as being acquired when the vehicle V1 reverses into storage, the present invention is not limited to this.
  • The first visibility information may be acquired at the time of completion of storage based on information acquired during the parking operation in which the vehicle V1 reverses into storage.
  • It may be determined that the vehicle V1 has been stored when it is determined, from map information of a navigation system (not shown) and current position information of the vehicle, that the current position of the vehicle is a parking lot, or when image processing is performed on the video data acquired by the video data acquisition unit 41 and it is determined from the processed video that the current position of the vehicle is a parking lot.
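The storage determination above combines two alternative sources. A minimal sketch, where the boolean inputs are assumptions standing in for the navigation-map lookup, the image-processing result, and a stopped-vehicle check (the last is an added assumption, not stated in the patent):

```python
def storage_completed(map_says_parking_lot, video_says_parking_lot, vehicle_stopped):
    """Judge that the vehicle V1 has been stored when it has stopped
    and either the navigation map or image processing of the
    surrounding video identifies the current position as a parking
    lot."""
    return vehicle_stopped and (map_says_parking_lot or video_says_parking_lot)
```

Using either source alone also fits the description; the OR simply means one positive identification is enough.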
  • The display control unit 48 may display an overhead video corresponding to the predicted traveling direction of the vehicle V1.
  • For example, the surrounding situation of the vehicle V1 at the time of exit is the situation shown in FIG. 14, and the predicted traveling direction of the vehicle V1 is the right direction; in this case, the bird's-eye view image is displayed.
  • The predicted traveling direction may be acquired from an arrow or sign indicating the traveling direction in the parking lot, detected by performing image processing on the video data acquired by the video data acquisition unit 41, or from navigation information.
  • 1 display control system, 11 front camera (photographing unit), 12 rear camera (photographing unit), 13 left side camera (photographing unit), 14 right side camera (photographing unit), 21 sensor unit (obstacle detection unit), 31 display panel (display unit), 40 display control device, 41 video data acquisition unit, 42 own vehicle information acquisition unit, 43 adjacent information acquisition unit (information acquisition unit), 44 overhead video generation unit, 45 comparison unit, 48 display control unit, 49 storage unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention comprises: a video data acquisition unit 41 that acquires video data from a plurality of cameras capturing the surroundings of a vehicle; a bird's-eye view video generation unit 44 that generates a bird's-eye view video by performing viewpoint conversion processing and synthesis on the video data; an adjacent information acquisition unit 53 that acquires first obstacle information on obstacles around the vehicle while the vehicle reverses into a parking space, and second obstacle information on obstacles around the vehicle while the vehicle moves forward out of the parking space; a comparison unit 45 that compares the first obstacle information with the second obstacle information to determine whether the number of obstacles has increased when the vehicle leaves the parking space; and a display control unit 48 that causes a display panel to display a bird's-eye view video when the vehicle leaves the parking space, if the comparison unit 45 determines that the number of obstacles around the vehicle has increased at the time the vehicle leaves the parking space.
PCT/JP2018/014137 2017-07-31 2018-04-02 Display control device, display control system, display control method, and program WO2019026348A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201880004577.6A CN109997356B (zh) 2017-07-31 2018-04-02 显示控制装置、显示控制系统、显示控制方法以及程序
EP18840930.4A EP3550830B1 (fr) 2017-07-31 2018-04-02 Dispositif de commande d'affichage, système de commande d'affichage, procédé de commande d'affichage, et programme
US16/424,691 US11117520B2 (en) 2017-07-31 2019-05-29 Display control device, display control system, display control method, and program

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
JP2017148385A JP6763357B2 (ja) 2017-07-31 2017-07-31 表示制御装置、表示制御システム、表示制御方法およびプログラム
JP2017-148469 2017-07-31
JP2017-148336 2017-07-31
JP2017148469A JP6763358B2 (ja) 2017-07-31 2017-07-31 表示制御装置、表示制御システム、表示制御方法およびプログラム
JP2017148336A JP6787272B2 (ja) 2017-07-31 2017-07-31 表示制御装置、表示制御システム、表示制御方法およびプログラム
JP2017148384A JP6763356B2 (ja) 2017-07-31 2017-07-31 表示制御装置、表示制御システム、表示制御方法およびプログラム
JP2017-148385 2017-07-31
JP2017-148384 2017-07-31

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/424,691 Continuation US11117520B2 (en) 2017-07-31 2019-05-29 Display control device, display control system, display control method, and program

Publications (1)

Publication Number Publication Date
WO2019026348A1 true WO2019026348A1 (fr) 2019-02-07

Family

ID=65232537

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/014137 WO2019026348A1 (fr) 2017-07-31 2018-04-02 Display control device, display control system, display control method, and program

Country Status (4)

Country Link
US (1) US11117520B2 (fr)
EP (1) EP3550830B1 (fr)
CN (1) CN109997356B (fr)
WO (1) WO2019026348A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116309884A (zh) * 2023-05-24 2023-06-23 成都陆拓信息技术有限公司 一种三维空间区域视频盲区识别方法

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6939494B2 (ja) * 2017-12-11 2021-09-22 トヨタ自動車株式会社 画像表示装置
JP7125893B2 (ja) * 2018-11-26 2022-08-25 本田技研工業株式会社 走行制御装置、制御方法およびプログラム
JP7309480B2 (ja) * 2019-06-27 2023-07-18 フォルシアクラリオン・エレクトロニクス株式会社 車載装置、及び車載装置の制御方法
CN112849028B (zh) * 2021-01-22 2023-04-07 合众新能源汽车股份有限公司 一种应用超声波雷达泊车方法及系统

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010064546A (ja) * 2008-09-09 2010-03-25 Toyota Industries Corp 駐車支援装置
JP2012056428A (ja) * 2010-09-08 2012-03-22 Aisin Seiki Co Ltd 運転支援装置
JP2013123922A (ja) * 2011-12-13 2013-06-24 Mitsubishi Motors Corp 運転支援装置
JP2015076645A (ja) 2013-10-04 2015-04-20 本田技研工業株式会社 車両周辺表示装置

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3803021B2 (ja) * 2000-10-02 2006-08-02 松下電器産業株式会社 運転支援装置
JP2004330891A (ja) * 2003-05-08 2004-11-25 Fujitsu Ten Ltd 利便性向上装置
CN101727756B (zh) * 2008-10-16 2012-07-25 财团法人工业技术研究院 交通工具移动图像辅助导引方法与系统
JP4831374B2 (ja) * 2009-03-27 2011-12-07 アイシン・エィ・ダブリュ株式会社 運転支援装置、運転支援方法、及び運転支援プログラム
DE102010044219A1 (de) * 2010-11-22 2012-05-24 Robert Bosch Gmbh Verfahren zur Erfassung der Umgebung eines Fahrzeugs
JP5786941B2 (ja) * 2011-08-25 2015-09-30 日産自動車株式会社 車両用自律走行制御システム
CN103782591B (zh) * 2011-08-26 2017-02-15 松下知识产权经营株式会社 驾驶辅助装置
US9697735B2 (en) * 2011-12-15 2017-07-04 Panasonic Intellectual Property Management Co., Ltd. Drive assistance device
JP5776838B2 (ja) * 2012-03-15 2015-09-09 トヨタ自動車株式会社 運転支援装置
KR101376210B1 (ko) * 2012-08-06 2014-03-21 현대모비스 주식회사 어라운드 뷰 모니터 시스템 및 모니터링 방법
JP6231345B2 (ja) * 2013-10-18 2017-11-15 クラリオン株式会社 車両用発進支援装置
JP6429452B2 (ja) * 2013-11-22 2018-11-28 ルネサスエレクトロニクス株式会社 車載用画像処理装置及び半導体装置
CN104385987B (zh) * 2014-11-14 2017-01-11 东风汽车有限公司 一种汽车监控方法及系统
CN106797452A (zh) * 2015-03-03 2017-05-31 日立建机株式会社 车辆的周围监视装置


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3550830A4

Also Published As

Publication number Publication date
EP3550830B1 (fr) 2021-09-08
CN109997356A (zh) 2019-07-09
EP3550830A1 (fr) 2019-10-09
US11117520B2 (en) 2021-09-14
US20190275942A1 (en) 2019-09-12
CN109997356B (zh) 2021-10-08
EP3550830A4 (fr) 2020-02-19

Similar Documents

Publication Publication Date Title
CN109997356B (zh) 显示控制装置、显示控制系统、显示控制方法以及程序
US10909765B2 (en) Augmented reality system for vehicle blind spot prevention
US10688868B2 (en) On-vehicle display control device, on-vehicle display system, on-vehicle display control method, and non-transitory storage medium
CN109314765B (zh) 车辆用显示控制装置、显示系统、显示控制方法以及程序
US10872249B2 (en) Display control device, display control system, display control method, and non-transitory storage medium
US20180330619A1 (en) Display device and display method for displaying pictures, and storage medium
US10946744B2 (en) Vehicular projection control device and head-up display device
US10531016B2 (en) On-vehicle display control device, on-vehicle display system, on-vehicle display control method, and non-transitory storage medium
CN109987025A (zh) 用于夜晚环境的车辆驾驶辅助系统及方法
JP2005057536A (ja) 映像提示装置
KR102061210B1 (ko) 차량 측 전방 사각 영역 감시 장치 및 그 방법
JP6763358B2 (ja) 表示制御装置、表示制御システム、表示制御方法およびプログラム
JP2020088604A (ja) 走行制御装置、制御方法およびプログラム
JP6825522B2 (ja) 表示制御装置、表示制御システム、表示制御方法およびプログラム
JP6763357B2 (ja) 表示制御装置、表示制御システム、表示制御方法およびプログラム
JP6787272B2 (ja) 表示制御装置、表示制御システム、表示制御方法およびプログラム
JP6825521B2 (ja) 表示制御装置、表示制御システム、表示制御方法およびプログラム
JP2019062414A (ja) 俯瞰映像生成装置、俯瞰映像生成システム、俯瞰映像生成方法およびプログラム
JP2019029863A (ja) 表示制御装置、表示制御システム、表示制御方法およびプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18840930

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2018840930

Country of ref document: EP

Effective date: 20190702

NENP Non-entry into the national phase

Ref country code: DE