WO2017086057A1 - Display device for vehicles and display method for vehicles - Google Patents

Display device for vehicles and display method for vehicles Download PDF

Info

Publication number
WO2017086057A1
WO2017086057A1 (PCT/JP2016/080091)
Authority
WO
WIPO (PCT)
Prior art keywords
image
type
vehicle
overhead image
type overhead
Prior art date
Application number
PCT/JP2016/080091
Other languages
French (fr)
Japanese (ja)
Inventor
Noboru Katsumata
Original Assignee
JVCKENWOOD Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2016146181A external-priority patent/JP6699427B2/en
Application filed by JVCKENWOOD Corporation
Priority to CN201680051297.1A priority Critical patent/CN107950023B/en
Priority to EP16866059.5A priority patent/EP3379827B1/en
Publication of WO2017086057A1 publication Critical patent/WO2017086057A1/en
Priority to US15/935,143 priority patent/US20180208115A1/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00: Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22: Real-time viewing arrangements for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23: Real-time viewing arrangements for viewing an area outside the vehicle with a predetermined field of view
    • B60R1/27: Real-time viewing arrangements providing all-round vision, e.g. using omnidirectional cameras
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00: Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00: General purpose image data processing
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to vehicle display technology, and more particularly to a vehicle display device and a vehicle display method for displaying an image.
  • a technology is widely used in which cameras are provided at a plurality of locations on a vehicle, and the captured images are synthesized by converting the viewpoints to obtain a bird's-eye view image.
  • Such a display is displayed at the time of entering a garage, for example, and is used for the purpose of confirming the periphery of the vehicle and grasping the position. For example, when there is an obstacle in the vicinity of the vehicle when entering the garage, the obstacle is also displayed in the bird's-eye view image, but since the viewpoint conversion processing is performed, it is difficult to grasp the distance to the obstacle. Therefore, when an obstacle is detected, an original image obtained by imaging the obstacle detection area is displayed (for example, see Patent Document 1).
  • the present invention has been made in view of such circumstances, and an object of the present invention is to provide a technique that makes it easier for the driver to grasp the situation in the vicinity of the vehicle, particularly the presence of an object that becomes an obstacle.
  • a vehicle display device includes: a first acquisition unit that acquires a first type image obtained by imaging the periphery of a vehicle; a first image generation unit that generates a first type overhead image by converting the viewpoint of the first type image acquired by the first acquisition unit as viewed from above the vehicle; a display control unit that displays the first type overhead image generated by the first image generation unit; a second acquisition unit that acquires a second type image obtained by imaging, from a position higher than that of the first type image, a range farther from the vehicle than the first type image; a second image generation unit that generates a second type overhead image by converting the viewpoint of the second type image acquired by the second acquisition unit as viewed from above the vehicle; and an object detection unit that detects objects around the vehicle.
  • when the object detection unit detects an object, the display control unit displays, in addition to the first type overhead image, the second type overhead image generated by the second image generation unit that corresponds to the direction of the detected object.
  • a vehicle display method includes: acquiring a first type image obtained by imaging the periphery of the vehicle; generating a first type overhead image by converting the viewpoint of the acquired first type image as viewed from above the vehicle; displaying the generated first type overhead image; acquiring a second type image obtained by imaging, from a position higher than that of the first type image, a range farther from the vehicle than the first type image; generating a second type overhead image by converting the viewpoint of the acquired second type image as viewed from above the vehicle; detecting an object around the vehicle; and displaying, in addition to the first type overhead image, the second type overhead image generated corresponding to the direction of the detected object.
  • any combination of the above-described constituent elements, and any conversion of the expression of the present embodiment between a method, an apparatus, a system, a recording medium, a computer program, and the like, are also effective as aspects of the present embodiment.
  • FIGS. 1A and 1B are views showing the appearance of a vehicle according to Embodiment 1 of the present invention. FIG. 2 is a diagram showing the configuration of the vehicle display device according to Embodiment 1.
  • FIG. 3 is a perspective view showing an imaging range formed around the vehicle in FIGS. 1A and 1B. FIG. 4 is a diagram showing the first type overhead image generated by the first image generation unit in FIG. 2.
  • FIG. 5 is a perspective view showing another imaging range formed around the vehicle in FIGS. 1A and 1B. FIGS. 6 and 7 are diagrams showing overhead images generated by the display control unit in FIG. 2.
  • Embodiment 1 of the present invention relates to a vehicle display device that generates bird's-eye view images by viewpoint conversion of images captured by a plurality of imaging units installed on a vehicle, and displays the generated bird's-eye view images.
  • in such a bird's-eye view image, an object that exists in the vicinity of the vehicle is displayed, but an object that exists away from the vehicle, such as an obstacle located 1 m or more away, is not displayed.
  • the vehicle display device executes the following process in order to display an object that exists away from the vehicle.
  • the vehicle includes a plurality of first imaging units at a lower portion of the vehicle, and the vehicle display device generates a first type overhead image from a first type image captured by each of the plurality of first imaging units.
  • the vehicle is also provided with a plurality of second imaging units at positions higher than the first imaging units, and the vehicle display device generates a second type overhead image from the second type images captured by each of the plurality of second imaging units.
  • the vehicle is also provided with a sensor, and the vehicle display device detects the presence of an object based on the detection result of the sensor.
  • the vehicular display device displays the first type overhead image when the presence of the object is not detected.
  • when the presence of an object is detected, the vehicular display device displays the second type overhead image in addition to the first type overhead image, so that the two types of overhead images are displayed together.
  • the second imaging unit is installed at a position higher than the first imaging unit, where it can capture images farther away than the first imaging unit can. For this reason, the second type overhead image can display objects farther away than the first type overhead image.
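The claim that a higher camera can see farther along the ground follows from simple pinhole geometry. The sketch below is purely illustrative and not taken from the patent; the function name, angle conventions, and example heights are assumptions.

```python
import math

def max_ground_distance(height_m, tilt_deg, half_fov_deg):
    """Distance to the farthest ground point a downward-tilted camera can see.

    The upper edge of the field of view makes an angle of
    (tilt_deg - half_fov_deg) below the horizontal; that ray meets the
    ground at height_m / tan(angle). If the edge ray points at or above
    the horizon, the visible ground range is unbounded.
    """
    edge_deg = tilt_deg - half_fov_deg
    if edge_deg <= 0:
        return math.inf  # top ray reaches the horizon
    return height_m / math.tan(math.radians(edge_deg))

# With the same tilt and field of view, a roof-level camera (~1.5 m)
# sees farther along the ground than a bumper-level camera (~0.6 m).
bumper = max_ground_distance(0.6, 45.0, 30.0)
roof = max_ground_distance(1.5, 45.0, 30.0)
```

The visible range scales linearly with camera height when the tilt and field of view are held fixed, which is why the second imaging units near the roof cover a farther range than the first imaging units on the bumpers.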
  • when no object exists away from the vehicle, the display of the second type overhead image is unnecessary. Therefore, the display is switched depending on whether or not such an object exists.
  • FIGS. 1A to 1B show the appearance of the vehicle 100 according to the first embodiment of the present invention.
  • FIG. 1A shows a top view of the vehicle 100
  • FIG. 1B shows a side view.
  • a first front imaging unit 10 is installed in a front portion of the vehicle 100, for example, a bumper, a bonnet, or the like.
  • a first rear imaging unit 12 is installed in a rear portion of the vehicle 100, such as a bumper or a trunk.
  • a first left side imaging unit 14 is installed on a left side portion of the vehicle, for example, a lower portion of the left door mirror.
  • a first right side imaging unit 16 is installed on the right side of the vehicle so as to be symmetrical with the first left side imaging unit 14.
  • the first front imaging unit 10 to the first right imaging unit 16 are collectively referred to as a first imaging unit.
  • the second front imaging unit 18 is arranged in the vehicle near the roof front of the vehicle 100, and the second rear imaging unit 20 is arranged in the vehicle near the roof rear of the vehicle 100.
  • the second front imaging unit 18 and the second rear imaging unit 20 are arranged at positions higher than the first imaging units and oriented so that they can capture distant areas. Therefore, the second front imaging unit 18 and the second rear imaging unit 20 can image objects farther away than the first imaging units can.
  • the second front imaging unit 18 and the second rear imaging unit 20 are installed in the passenger compartment near the roof of the vehicle 100, at positions higher than the first front imaging unit 10 and the first rear imaging unit 12.
  • the second front imaging unit 18 and the second rear imaging unit 20 are collectively referred to as a second imaging unit. Further, in FIGS. 1A to 1B, the second imaging units are disposed only in the front and rear of the vehicle 100, but may be disposed in the front and rear and right and left.
  • the front sensor 22 is disposed at the front of the vehicle 100, like the first front imaging unit 10, and the rear sensor 24 is disposed at the rear of the vehicle 100, like the first rear imaging unit 12.
  • the front sensor 22 is disposed in the vicinity of the first front imaging unit 10, and the rear sensor 24 is disposed in the vicinity of the first rear imaging unit 12.
  • FIG. 2 shows a configuration of the vehicle display device 50 according to the first embodiment of the present invention.
  • the vehicle display device 50 is connected to a first front imaging unit (first front camera) 10, a first rear imaging unit (first rear camera) 12, a first left side imaging unit (first left side camera) 14, a first right side imaging unit (first right side camera) 16, a second front imaging unit (second front camera) 18, a second rear imaging unit (second rear camera) 20, a front sensor 22, a rear sensor 24, and a display panel 52.
  • the vehicle display device 50 includes a first acquisition unit 30, a first image generation unit 32, a display control unit 34, a second acquisition unit 36, a second image generation unit 38, and an object detection unit 40.
  • the first front imaging unit 10, the first rear imaging unit 12, the first left side imaging unit 14, and the first right side imaging unit 16 are arranged as shown in FIG.
  • FIG. 3 is a perspective view showing an imaging range formed around the vehicle 100.
  • the first front imaging unit 10 forms a front imaging region 60 so as to go forward from the first front imaging unit 10, and captures an image in the front imaging region 60.
  • the first rear imaging unit 12 forms a rear imaging region 62 so as to go rearward from the first rear imaging unit 12, and captures an image in the rear imaging region 62.
  • the first left side imaging unit 14 forms a left side imaging region 64 so as to go to the left from the first left side imaging unit 14, and captures an image in the left side imaging region 64.
  • the first right side imaging unit 16 forms a right side imaging region 66 so as to go right from the first right side imaging unit 16, and captures an image in the right side imaging region 66.
  • the hatched front imaging area 60 in FIG. 3 indicates, within the range that the first front imaging unit 10 can capture, the range from the hatched area up to the position immediately below the installation position of the first front imaging unit 10 on the vehicle 100 that is cut out by the first image generation unit 32 and subjected to viewpoint conversion processing.
  • likewise, the rear imaging area 62 indicates, within the range that the first rear imaging unit 12 can capture, the range from the hatched area up to the position immediately below the installation position of the first rear imaging unit 12 that is cut out by the first image generation unit 32 and subjected to viewpoint conversion processing.
  • the left side imaging area 64 indicates, within the range that the first left side imaging unit 14 can capture, the range from the hatched area up to the position immediately below the installation position of the first left side imaging unit 14 that is cut out by the first image generation unit 32 and subjected to viewpoint conversion processing.
  • the right side imaging area 66 indicates, within the range that the first right side imaging unit 16 can capture, the range from the hatched area up to the position immediately below the installation position of the first right side imaging unit 16 that is cut out by the first image generation unit 32 and subjected to viewpoint conversion processing.
  • in this way, the images captured by the first imaging units together cover the entire periphery of the vehicle 100.
  • the first front imaging unit 10, the first rear imaging unit 12, the first left side imaging unit 14, and the first right side imaging unit 16 capture an image as described above.
  • the image is a moving image, but may be a still image taken continuously.
  • the first front imaging unit 10, the first rear imaging unit 12, the first left side imaging unit 14, and the first right side imaging unit 16 output the captured images to the first acquisition unit 30.
  • the first acquisition unit 30 acquires an image (hereinafter referred to as a “first type image”) from each of the first front imaging unit 10, the first rear imaging unit 12, the first left side imaging unit 14, and the first right side imaging unit 16. That is, the first acquisition unit 30 acquires first type images obtained by imaging the periphery of the vehicle 100.
  • the first type image acquired by the first acquisition unit 30 is processed by the first image generation unit 32.
  • the first image generation unit 32 executes processing of the first type image acquired by the first acquisition unit 30.
  • the first image generation unit 32 performs processing for converting the viewpoint of the first type image as viewed from above the vehicle 100 and generating the first type overhead image.
  • a known technique may be used for the viewpoint conversion and the overhead image generation processing. For example, each pixel of the image is projected onto a three-dimensional curved surface in a virtual three-dimensional space, and a necessary region of the three-dimensional curved surface is cut out according to a virtual viewpoint above the vehicle 100. The cut-out region corresponds to the viewpoint-converted image.
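The patent leaves the viewpoint conversion to known techniques. In the special case where all imaged points lie on a flat ground plane, the projection reduces to a planar homography, which can be sketched with numpy alone; the function name, the nearest-neighbour sampling, and the 3x3 matrix convention below are assumptions for illustration, not the patent's method.

```python
import numpy as np

def warp_to_overhead(img, H, out_shape):
    """Inverse warp with a 3x3 homography H mapping camera pixels to
    overhead pixels: for every overhead pixel, back-project through
    H^-1 to find the source camera pixel (nearest neighbour)."""
    H_inv = np.linalg.inv(H)
    h_out, w_out = out_shape
    ys, xs = np.mgrid[0:h_out, 0:w_out]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(xs.size)])
    src = H_inv @ pts          # back-project overhead pixels into the camera
    src = src / src[2]         # perspective divide
    sx = np.round(src[0]).astype(int)
    sy = np.round(src[1]).astype(int)
    out = np.zeros((h_out, w_out), dtype=img.dtype)
    valid = (sx >= 0) & (sx < img.shape[1]) & (sy >= 0) & (sy < img.shape[0])
    out[ys.ravel()[valid], xs.ravel()[valid]] = img[sy[valid], sx[valid]]
    return out
```

The inverse (output-to-source) direction avoids holes in the overhead image; pixels whose back-projection falls outside the camera frame are left black, matching the cut-out behaviour described above.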
  • An example of the generated overhead image is shown in FIG.
  • FIG. 4 shows a first type overhead image 80 generated by the first image generation unit 32.
  • the own vehicle icon 78 is an image showing the upper surface of the vehicle 100.
  • a front image 70 is arranged in front of the host vehicle icon 78
  • a rear image 72 is arranged behind the host vehicle icon 78
  • a left side image 74 is arranged on the left side of the host vehicle icon 78.
  • a right side image 76 is arranged on the right side.
  • the first image generation unit 32 thus generates the first type overhead image 80 by converting the viewpoint of the first type image acquired by the first acquisition unit 30 as viewed from above the vehicle 100.
  • the first type overhead image 80 generated by the first image generation unit 32 is processed by the display control unit 34.
  • the display control unit 34 executes a process of displaying the first type overhead image 80 generated by the first image generation unit 32.
  • the display control unit 34 displays the first type overhead image 80 on the display panel 52.
  • the timing at which the first type bird's-eye view image 80 is displayed on the display panel 52 is any timing at which confirmation of the vehicle's surroundings is required; for example, when the reverse gear of the vehicle 100 is selected to enter a garage.
  • a first type bird's-eye view image 80 as shown in FIG. 4 is displayed on the display panel 52.
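The layout of FIG. 4 (front image 70 above, rear image 72 below, side images 74 and 76 beside the own-vehicle icon 78) amounts to pasting the four viewpoint-converted images onto one canvas. The sketch below uses single-channel arrays and assumed shapes for illustration only; the function name and shape conventions are not from the patent.

```python
import numpy as np

def compose_first_type_overhead(front, rear, left, right, icon):
    """Paste the four viewpoint-converted images around the own-vehicle
    icon.  Assumed shapes: front/rear are (m, iw + 2m), left/right are
    (ih, m), icon is (ih, iw), where m is the border thickness."""
    m = front.shape[0]
    ih, iw = icon.shape
    canvas = np.zeros((ih + 2 * m, iw + 2 * m), dtype=icon.dtype)
    canvas[:m, :] = front               # front image 70
    canvas[-m:, :] = rear               # rear image 72
    canvas[m:m + ih, :m] = left         # left side image 74
    canvas[m:m + ih, -m:] = right       # right side image 76
    canvas[m:m + ih, m:m + iw] = icon   # own-vehicle icon 78
    return canvas
```

A real implementation would blend the overlapping corners of adjacent camera views; constant-valued test arrays are enough to verify the placement.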
  • the second front imaging unit 18 and the second rear imaging unit 20 are arranged as shown in FIG.
  • FIG. 5 is a perspective view showing another imaging range formed around the vehicle 100.
  • the second front imaging unit 18 forms a front imaging region 63 so as to go forward from the second front imaging unit 18, and captures an image in the front imaging region 63.
  • the front imaging area 63 extends in front of the vehicle 100 more than the front imaging area 60.
  • the second rear imaging unit 20 forms a rear imaging region 65 so as to extend rearward from the second rear imaging unit 20 and captures an image in the rear imaging region 65.
  • the rear imaging area 65 extends to the rear of the vehicle 100 more than the rear imaging area 62.
  • the hatched front imaging area 63 in FIG. 5 indicates, within the range that the second front imaging unit 18 can capture, the range from the hatched area up to the position immediately below the installation position of the second front imaging unit 18 on the vehicle 100 that is cut out by the second image generation unit 38 and subjected to viewpoint conversion processing.
  • likewise, the rear imaging area 65 indicates, within the range that the second rear imaging unit 20 can capture, the range from the hatched area up to the position immediately below the installation position of the second rear imaging unit 20 that is cut out by the second image generation unit 38 and subjected to viewpoint conversion processing.
  • the second front imaging unit 18 captures an image forward from a position higher than the first front imaging unit 10. Therefore, the image captured by the second front imaging unit 18 includes a location farther from the vehicle 100 than the image captured by the first front imaging unit 10. That is, the second front imaging unit 18 can image farther than the first front imaging unit 10.
  • the imaging range of the second front imaging unit 18 may partially overlap the imaging range of the first front imaging unit 10 or may be a range that does not overlap with the imaging range of the first front imaging unit 10.
  • the second rear imaging unit 20 captures an image backward from a position higher than the first rear imaging unit 12. Therefore, the image captured by the second rear imaging unit 20 includes a place farther from the vehicle 100 than the image captured by the first rear imaging unit 12. That is, the second rear imaging unit 20 can image farther than the first rear imaging unit 12.
  • the imaging range of the second rear imaging unit 20 may partially overlap the imaging range of the first rear imaging unit 12 or may be a range that does not overlap with the imaging range of the first rear imaging unit 12. In this case as well, the image is a moving image, but may be a still image taken continuously.
  • the second front imaging unit 18 and the second rear imaging unit 20 output the captured image to the second acquisition unit 36.
  • the second image generation unit 38 executes processing of the second type image acquired by the second acquisition unit 36.
  • the second image generation unit 38 performs processing that converts the viewpoint of the second type image as viewed from above the vehicle 100 and generates a second type overhead image 82.
  • the processing in the second image generation unit 38 is the same as the processing in the first image generation unit 32. That is, the second image generation unit 38 generates the second type overhead image 82 by converting the viewpoint as viewed from above the vehicle with respect to the second type image acquired by the second acquisition unit 36. .
  • Such a second type overhead image 82 is an overhead image in a range farther from the vehicle 100 than the first type overhead image 80.
  • in the first type bird's-eye view image 80, images in the four directions, from the front image 70 to the right side image 76, are synthesized, whereas the second type bird's-eye view image 82 is generated from a second type image in a single direction.
  • the second type overhead image 82 generated by the second image generation unit 38 is processed by the display control unit 34.
  • the front sensor 22 and the rear sensor 24 are arranged as shown in FIG.
  • the front sensor 22 and the rear sensor 24 are, for example, millimeter wave sensors or infrared sensors. Alternatively, the second imaging units may serve as the sensors; in that case, the object detection unit 40 performs edge detection processing or the like on the images captured by the second imaging units to detect an obstacle. An identification number is assigned to each of the front sensor 22 and the rear sensor 24.
  • the object detection unit 40 is connected to the front sensor 22 and the rear sensor 24 and detects an object around the vehicle 100. Since a known technique may be used for the detection of the object, a specific description is omitted here. For example, when an infrared laser is used for the front sensor 22 or the rear sensor 24, an object is detected based on the time difference between when the infrared laser is emitted toward the detection direction of the vehicle 100 and when the infrared laser reflected by the object is received. Note that the detection range of the object detection unit 40 is set to be farther than the imaging range of the first type image acquired by the first acquisition unit 30.
  • the object detection unit 40 detects an object in either the front sensor 22 or the rear sensor 24, and notifies the display control unit 34 of the detection of the object when the distance to the detected object is larger than the threshold value. At that time, the identification number of the front sensor 22 or the rear sensor 24 that detected the object is also notified.
  • the threshold value is set to be a value on the far side of the imaging range of the first type image.
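The time-of-flight measurement and the threshold decision described above can be sketched in a few lines. This is an illustrative reading, not the patent's implementation; the function names and the notification rule are assumptions based on the text.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_s):
    """Infrared time-of-flight: the pulse travels out and back,
    so the object distance is half the round-trip path length."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

def should_notify(distance_m, first_image_range_m):
    """Notify the display controller only when the detected object lies
    beyond the range already covered by the first type image."""
    return distance_m > first_image_range_m
```

With the threshold set to the far edge of the first type image's range, objects already visible in the first type overhead image do not trigger the extra display, matching the switching behaviour described above.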
  • the display control unit 34 executes display processing of the first type overhead image 80 generated by the first image generation unit 32.
  • the display control unit 34 also executes display processing of the second type overhead image 82 generated by the second image generation unit 38. Further, when an object is detected by the object detection unit 40, the display control unit 34 acquires a notification and an identification number from the object detection unit 40. When the notification from the object detection unit 40 is not acquired, the display control unit 34 displays the first type overhead image 80 on the display panel 52 as before. On the other hand, when acquiring the notification from the object detection unit 40, the display control unit 34 causes the display panel 52 to display the second type overhead image 82 in addition to the first type overhead image 80.
  • the timing at which the second type overhead image 82 is displayed on the display panel 52 in addition to the first type overhead image 80 is the timing at which the notification from the object detection unit 40 is acquired after the first type overhead image 80 has been displayed.
  • FIG. 6 shows a bird's-eye view image generated by the display control unit 34. This corresponds to the case where the notification from the object detection unit 40 is acquired and the acquired identification number indicates the rear sensor 24.
  • the display control unit 34 arranges the second type overhead image 82, in which the obstacle 84 is displayed, below the rear image 72 of the first type overhead image 80. In this way, the display control unit 34 displays the second type overhead image 82 corresponding to the direction of the detected object on the side of the first type overhead image 80 where the object detection unit 40 detected the object.
  • since the display shown in FIG. 6 is based on an object detected by the rear sensor 24, it is suited to the case where the vehicle 100 is moving backward.
  • FIG. 7 shows another overhead image generated by the display control unit 34. This corresponds to the case where the notification from the object detection unit 40 is acquired and the acquired identification number indicates the front sensor 22.
  • the display control unit 34 arranges the second type bird's-eye view image 82, on which the obstacle 84 is displayed, above the front image 70 of the first type bird's-eye view image 80.
  • the display control unit 34 displays the second type overhead image 82 corresponding to the direction of the detected object in the direction in which the object detection unit 40 detects the object in the first type overhead image 80.
  • since the display shown in FIG. 7 is based on an object detected by the front sensor 22, it is suited to the case where the vehicle 100 is moving forward.
  • the display control unit 34 selects the second type overhead image 82 generated from the second type image captured by the second rear imaging unit 20.
  • the display control unit 34 arranges the selected second type overhead image 82 below the rear image 72 corresponding to the first rear imaging unit 12 facing rearward in the same manner as the second rear imaging unit 20.
  • the display control unit 34 may display the second type overhead image 82 with a wider angle of view than the first type overhead image 80 in the direction in which the object detection unit 40 detected the object.
  • the display control unit 34 may also display the second type overhead image 82 enlarged relative to the first type overhead image 80 in that direction. Known techniques may be used for the wide-angle display and the enlarged display, so their description is omitted here. Further, as shown in FIG. 6, the display control unit 34 may move the first type overhead image 80 upward and display the second type overhead image 82 below the first type overhead image 80.
  • the display control unit 34 selects the second type overhead image 82 generated from the second type image captured by the second front imaging unit 18.
  • the display control unit 34 arranges the selected second type overhead image 82 on the upper side of the front image 70 corresponding to the first front imaging unit 10 facing forward like the second front imaging unit 18.
  • the display control unit 34 may display the second type overhead image 82 with a wider angle of view than the first type overhead image 80 in the direction in which the object detection unit 40 detected the object.
  • the display control unit 34 may also display the second type overhead image 82 enlarged relative to the first type overhead image 80 in that direction.
  • the display control unit 34 may move the first type overhead image 80 downward and display the second type overhead image 82 above the first type overhead image 80.
  • as a result, the display range of the overhead image is substantially expanded.
  • moreover, the second type overhead image 82 is displayed in the direction in which the obstacle 84 was detected, so the driver can easily grasp in which direction the obstacle 84 exists.
  • the second imaging unit from which the second acquisition unit 36 acquires the second type image is arranged at a higher position than the first imaging unit from which the first acquisition unit 30 acquires the first type image.
  • since the second imaging unit is disposed near the roof of the vehicle 100 as illustrated in FIGS. 1A and 1B, it is located higher than the driver's viewpoint, so the obstacle 84 is viewed from above the driver's own line of sight.
  • the three-dimensional appearance of the obstacle 84 can therefore be grasped more appropriately.
  • this configuration can be realized in hardware by the CPU or memory of an arbitrary computer or other LSIs, and in software by a program loaded into memory; here, functional blocks realized by their cooperation are depicted. Accordingly, those skilled in the art will understand that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof.
  • FIG. 8 is a flowchart showing a display procedure by the vehicle display device 50.
  • if the display conditions for the first type overhead image 80 are satisfied (Y in S10), the display control unit 34 causes the display panel 52 to display the first type overhead image 80 (S12). If neither the front sensor 22 nor the rear sensor 24 detects an obstacle 84 (N in S14), the process returns to step S10.
  • if the front sensor 22 or the rear sensor 24 detects an obstacle 84 (Y in S14) but the obstacle 84 is not farther away than the range of the first type bird's-eye view image 80 (N in S16), the process returns to step S10.
  • if the obstacle 84 is farther away than that range (Y in S16), the display control unit 34 superimposes and displays the second type overhead image 82 in the detection direction of the obstacle 84 on the first type overhead image 80 (S18), and the process returns to step S16. If the display conditions for the first type overhead image 80 are not satisfied (N in S10), the process is terminated.
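One pass of the flowchart in FIG. 8 can be sketched as a pure decision function. The function name, the string labels, and the representation of the sensor result as an optional distance are assumptions made for illustration.

```python
def display_step(display_condition, detection, first_range_m):
    """One pass of the FIG. 8 flow.

    detection: None when no obstacle is sensed, otherwise the distance
    (in metres) to the obstacle reported by the front or rear sensor.
    Returns the list of images to show, or None when the display
    condition no longer holds (process ends, N in S10).
    """
    if not display_condition:                    # N in S10
        return None
    images = ["first_type_overhead"]             # S12
    # Y in S14 (obstacle sensed) and Y in S16 (beyond first-image range)
    if detection is not None and detection > first_range_m:
        images.append("second_type_overhead")    # S18: superimpose
    return images
```

A real controller would call this each frame and route the result to the display panel; the point here is that the second type overhead image appears only when an obstacle lies beyond the first type image's range.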
  • According to the present embodiment, when the presence of an object is not detected, only the first type overhead image is displayed, so the driver can easily understand the situation in the vicinity of the vehicle. When the presence of an object is detected, the second type overhead image, captured from a position higher than that of the first type overhead image, is also displayed, so the driver can easily recognize the presence of a distant object. Since only the second type overhead image corresponding to the direction of the detected object is displayed, reduction in the size of the first type overhead image can be suppressed. Furthermore, since the presence of an object farther away than the imaging range of the first type image can be detected, an object not included in the first type overhead image can still be detected.
  • Since the second type overhead image is displayed in the direction in which the object is detected relative to the first type overhead image, the positional relationship between the first type overhead image and the second type overhead image can be easily grasped.
  • Since the second type overhead image is displayed with a wider angle of view than the first type overhead image, the position where the object exists can be easily grasped.
  • Since the second type overhead image is displayed at a larger magnification than the first type overhead image, the presence of the object can be easily grasped.
  • In the above description, the virtual viewpoint is set above the center of the vehicle 100, but the first type overhead image 80 and the second type overhead image 82 may have different virtual viewpoint positions.
  • For example, the first type overhead image 80 may be an overhead image whose virtual viewpoint is above the center of the vehicle 100, while the second type overhead image 82 uses a virtual viewpoint shifted toward the front of the vehicle 100 relative to that of the first type overhead image 80.
  • The second type overhead image 82 shows a wider range in front of the vehicle 100 than the first type overhead image 80 does. For this reason, setting the virtual viewpoint position of the second type overhead image 82 toward the front of the vehicle 100 reduces the visual discomfort between the first type overhead image 80 and the second type overhead image 82 in a display form such as that shown in FIG. 7.
  • Example 2 relates to a vehicular display device that generates a bird's-eye view image by viewpoint conversion with respect to images captured by a plurality of imaging units installed in the vehicle, and displays the generated bird's-eye view image.
  • the vehicle display device according to the first embodiment displays the second type overhead image in addition to the first type overhead image when the presence of an object not included in the first type overhead image is detected.
  • The vehicle display device according to the second embodiment starts displaying the second type overhead image when it detects an object that is not included in the first type overhead image but is included in the second type overhead image.
  • The vehicle 100 and the vehicle display device 50 according to the second embodiment are of the same type as those in FIGS. The following description focuses on the differences from Example 1.
  • the object detection unit 40 is connected to the front sensor 22 and the rear sensor 24 as before, and detects objects around the vehicle 100.
  • When the object detection unit 40 detects an object with either the front sensor 22 or the rear sensor 24, and the distance to the detected object is farther than the imaging range of the first type image but within the imaging range of the second type image, the object detection unit 40 notifies the display control unit 34 of the object detection.
  • The display control unit 34 acquires the notification and an identification number from the object detection unit 40.
  • The display control unit 34 then causes the display panel 52 to display the second type overhead image 82 in addition to the first type overhead image 80.
  • The timing at which the second type overhead image 82 is displayed in addition to the first type overhead image 80 is the timing at which the notification from the object detection unit 40 is acquired after the first type overhead image 80 is displayed. That is, when the object detected by the object detection unit 40 is outside the range of the first type overhead image 80 and within the range of the second type overhead image 82, the display control unit 34 starts displaying, in addition to the first type overhead image 80, the second type overhead image 82 corresponding to the direction of the detected object.
  • FIG. 9 is a flowchart showing a display procedure by the vehicle display device 50 according to the second embodiment of the present invention.
  • If the display condition of the first type overhead image 80 is satisfied (Y in S100), the display control unit 34 causes the display panel 52 to display the first type overhead image 80 (S102). If neither the front sensor 22 nor the rear sensor 24 detects the obstacle 84 (N in S104), the process returns to step S100.
  • If the front sensor 22 or the rear sensor 24 detects the obstacle 84 (Y in S104) but the obstacle 84 does not exist farther than the range of the first type overhead image 80, or is not included in the range of the second type overhead image 82 (N in S106), the process returns to step S100.
  • Otherwise (Y in S106), the display control unit 34 superimposes the second type overhead image 82 in the detection direction of the obstacle 84 on the first type overhead image 80 (S108), and the process returns to step S106. If the display condition of the first type overhead image 80 is not satisfied (N in S100), the process is terminated.
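The only change from the Embodiment 1 loop is the decision of S106: the object must lie beyond the first type range and inside the second type range. A sketch of that condition, with both range constants as illustrative assumptions:

```python
# Sketch of the S104-S106 decision (Embodiment 2); constants are assumptions.

FIRST_RANGE_M = 5.0    # assumed coverage of the first type overhead image (m)
SECOND_RANGE_M = 20.0  # assumed coverage of the second type overhead image (m)

def show_second_overhead(obstacle_distance):
    """True when the second type overhead image 82 should be superimposed:
    the detected object is outside the first type range but inside the
    second type range (Y in S106). obstacle_distance is None when neither
    sensor detects anything (N in S104)."""
    if obstacle_distance is None:
        return False
    return FIRST_RANGE_M < obstacle_distance <= SECOND_RANGE_M

assert not show_second_overhead(None)
assert not show_second_overhead(3.0)    # already shown in the first type image
assert show_second_overhead(10.0)       # visible only in the second type image
assert not show_second_overhead(50.0)   # beyond the second type range as well
```

Compared with Embodiment 1, the extra upper bound prevents the device from adding the second type overhead image for an object that neither image could actually show.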
  • According to the present embodiment, when the object is outside the range of the first type overhead image and within the range of the second type overhead image, the second type overhead image corresponding to the direction of the detected object is displayed in addition to the first type overhead image. An object that is not included in the first type overhead image but is included in the second type overhead image can therefore be reliably displayed. In addition, since the second type overhead image is displayed in the direction in which the object is detected relative to the first type overhead image, the positional relationship between the first type overhead image and the second type overhead image can be easily grasped.
  • Example 3, like the preceding embodiments, relates to a vehicle display device that generates an overhead image by viewpoint conversion from images captured by a plurality of imaging units installed in a vehicle, and displays the generated overhead image.
  • When no object is detected, the second type overhead image is not displayed. When an object is detected, the second type overhead image is displayed in addition to the first type overhead image.
  • In that case, a display is made so that it can be determined that the object included in the first type overhead image and the object included in the second type overhead image are the same.
  • The vehicle 100 according to the third embodiment is of the same type as that shown in FIG. The following description focuses on the differences from the preceding embodiments.
  • FIG. 10 shows a configuration of the vehicle display device 50 according to the third embodiment of the present invention.
  • This vehicle display device 50 further includes an identity determination unit 42 in addition to the configuration of the vehicle display device 50 shown in FIG.
  • the object detection unit 40 is connected to the front sensor 22 and the rear sensor 24 as before, and detects objects around the vehicle 100. When the object detection unit 40 detects an object in either the front sensor 22 or the rear sensor 24, the object detection unit 40 notifies the display control unit 34 and the identity determination unit 42 of the detection of the object.
  • When the object detection unit 40 notifies the detection of an object, it also notifies the position of the detected object.
  • The identity determination unit 42 receives the first type overhead image 80 from the first image generation unit 32 and the second type overhead image 82 from the second image generation unit 38. Furthermore, the identity determination unit 42 acquires the position information and the traveling direction of the vehicle 100. Since a well-known technique may be used to acquire the position information and traveling direction of the vehicle 100, the description thereof is omitted.
  • The identity determination unit 42 determines that the same object is included when the position (coordinates) at which the object detection unit 40 detected the object is included in both the first type overhead image 80 and the second type overhead image 82.
  • Alternatively, the identity determination unit 42 may perform image recognition processing on the first type overhead image 80 and the second type overhead image 82, compare the shapes of the objects obtained by the image recognition processing, and determine that the same object is included.
  • In this way, the identity determination unit 42 determines whether the object detected by the object detection unit 40 appears as the same object in the first type overhead image 80 and the second type overhead image 82, and outputs the determination result on whether the same object is included to the display control unit 34.
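The coordinate-based identity rule above can be sketched as follows. The rectangular coverage areas and the vehicle-centered coordinate convention are assumptions made for illustration, not values from the patent.

```python
# Sketch of the identity determination unit 42's coordinate rule.
# Coverage rectangles (vehicle-centered, meters) are illustrative assumptions.

def contains(bounds, point):
    """True when a ground point (x, y) lies inside a rectangular
    coverage area (xmin, ymin, xmax, ymax) of an overhead image."""
    xmin, ymin, xmax, ymax = bounds
    x, y = point
    return xmin <= x <= xmax and ymin <= y <= ymax

def same_object_in_both(first_bounds, second_bounds, detected_position):
    """Judge the detected object to be the same object in both overhead
    images when its detected position falls inside both coverage areas."""
    return (contains(first_bounds, detected_position)
            and contains(second_bounds, detected_position))

first = (-5.0, -5.0, 5.0, 5.0)       # assumed first type overhead coverage
second = (-10.0, -20.0, 10.0, 0.0)   # assumed second type (rear) coverage
assert same_object_in_both(first, second, (0.0, -4.0))       # in both images
assert not same_object_in_both(first, second, (0.0, -15.0))  # second image only
```

The shape-comparison alternative mentioned in the text would replace the coordinate test with a match between recognized object contours, but the output to the display control unit is the same boolean judgment.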
  • When the object detection unit 40 detects an object, the display control unit 34 displays the second type overhead image 82 in addition to the first type overhead image 80. Further, when displaying the second type overhead image 82 in addition to the first type overhead image 80, the display control unit 34 performs, based on the determination result by the identity determination unit 42, a display that makes it possible to determine the identity of the object displayed in each of the first type overhead image 80 and the second type overhead image 82. When the detected object is displayed only in the second type overhead image 82, such an identity-indicating display is unnecessary. However, when the vehicle 100 comes close to the detected object and the object is displayed in both the second type overhead image 82 and the first type overhead image 80, the identity-indicating display is made.
  • FIG. 11 shows an overhead image generated by the display control unit 34. Similar to FIG. 6, this corresponds to the case where the notification from the object detection unit 40 is acquired and the acquired identification number indicates the rear sensor 24.
  • The display control unit 34 arranges the first type overhead image 80 and, below its rear image 72, the second type overhead image 82 in which the obstacle 84 is displayed.
  • the obstacle 84 is also displayed on the first type overhead image 80.
  • the obstacle 84 included in the first type bird's-eye image 80 and the obstacle 84 included in the second type bird's-eye image 82 are determined to be the same by the identity determination unit 42.
  • The same object marker 86 is shown on the obstacle 84 included in the first type overhead image 80 and on the obstacle 84 included in the second type overhead image 82.
  • The same object marker 86 is a display that makes the identity of an object determinable, for example by surrounding each instance with a frame of the same shape or the same color.
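One way to realize the same object marker 86 is to assign a single frame color (or shape) per matched object and draw that frame in both overhead images. The data structures below are hypothetical, introduced only to illustrate the idea.

```python
# Sketch of drawing the same object marker 86: each matched pair of
# detections receives one shared frame color in both overhead images.

def add_same_object_markers(first_boxes, second_boxes, matches):
    """first_boxes / second_boxes: bounding boxes (u0, v0, u1, v1) detected
    in each overhead image; matches: index pairs judged identical by the
    identity determination unit. Returns one marker per box, with the
    color shared within each matched pair."""
    palette = ["yellow", "red", "cyan"]  # illustrative marker colors
    markers = []
    for color, (i, j) in zip(palette, matches):
        markers.append({"image": "first", "box": first_boxes[i], "color": color})
        markers.append({"image": "second", "box": second_boxes[j], "color": color})
    return markers

m = add_same_object_markers([(10, 20, 30, 40)], [(50, 60, 80, 90)], [(0, 0)])
assert m[0]["color"] == m[1]["color"]          # identical frames mark identity
assert m[0]["image"] == "first" and m[1]["image"] == "second"
```

Sharing one color per pair is what lets the driver visually link the near view and the far view of the same obstacle at a glance.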
  • According to the present embodiment, a display is performed that makes it possible to determine the identity of the objects shown in each of the first type overhead image and the second type overhead image, so the same object shown in both images can be easily recognized. Because the same object is easily recognized in both images, the position of the object can also be easily recognized in a situation where the vehicle approaches it.
  • According to the present invention, it is possible to make it easier for the driver to grasp the situation in the vicinity of the vehicle, particularly the presence of an object that becomes an obstacle.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Processing (AREA)

Abstract

A first acquisition unit 30 acquires a type 1 image in which the periphery of a vehicle is imaged. A first image generation unit 32 generates a type 1 bird's-eye image by converting the viewpoint of the type 1 image so that it appears as seen from above the vehicle. A display control unit 34 causes the type 1 bird's-eye image to be displayed. A second acquisition unit 36 acquires a type 2 image in which a range farther from the vehicle than in the type 1 image is imaged from a position higher than for the type 1 image. A second image generation unit 38 generates a type 2 bird's-eye image by converting the viewpoint of the type 2 image so that it appears as seen from above the vehicle. An object detection unit 40 detects an object in the periphery of the vehicle. When an object is detected, the display control unit 34 causes a type 2 bird's-eye image corresponding to the direction of the detected object to be displayed in addition to the type 1 bird's-eye image.

Description

VEHICLE DISPLAY DEVICE AND VEHICLE DISPLAY METHOD
 The present invention relates to vehicle display technology, and more particularly to a vehicle display device and a vehicle display method for displaying an image.
 A technology is in widespread use in which cameras are provided at a plurality of locations on a vehicle and the captured images are viewpoint-converted and synthesized to obtain an overhead image. Such a display is shown, for example, when parking in a garage, and is used to check the periphery of the vehicle and to grasp its position. For example, when there is an obstacle near the vehicle while entering a garage, the obstacle also appears in the overhead image, but because viewpoint conversion processing has been performed, it is difficult to grasp the distance to the obstacle. Therefore, when an obstacle is detected, the original image that captured the obstacle detection area is displayed (see, for example, Patent Document 1).
Patent Document 1: JP 2007-235529 A
 The imaging range of the original image displayed when an obstacle is detected is the same as the imaging range of the overhead image. Therefore, an obstacle existing in a range not shown in the overhead image is not shown in the original image either. On the other hand, if an overhead image is displayed over a range wide enough to show distant obstacles, it becomes difficult for the driver to grasp the situation in the vicinity of the vehicle.
 The present invention has been made in view of such circumstances, and an object thereof is to provide a technique that makes it easier for the driver to grasp the situation in the vicinity of the vehicle, particularly the presence of an object that becomes an obstacle.
 In order to solve the above problems, a vehicle display device according to an aspect of the present embodiment includes: a first acquisition unit that acquires a first type image obtained by imaging the periphery of a vehicle; a first image generation unit that generates a first type overhead image by converting the viewpoint of the acquired first type image as viewed from above the vehicle; a display control unit that displays the generated first type overhead image; a second acquisition unit that acquires a second type image obtained by imaging, from a position higher than that of the first type image, a range farther from the vehicle than the first type image; a second image generation unit that generates a second type overhead image by converting the viewpoint of the acquired second type image as viewed from above the vehicle; and an object detection unit that detects objects around the vehicle. When the object detection unit detects an object, the display control unit also displays, in addition to the first type overhead image, the second type overhead image generated by the second image generation unit that corresponds to the direction of the detected object.
 Another aspect of the present embodiment is a vehicle display method. This method includes the steps of: acquiring a first type image obtained by imaging the periphery of a vehicle; generating a first type overhead image by converting the viewpoint of the acquired first type image as viewed from above the vehicle; displaying the generated first type overhead image; acquiring a second type image obtained by imaging, from a position higher than that of the first type image, a range farther from the vehicle than the first type image; generating a second type overhead image by converting the viewpoint of the acquired second type image as viewed from above the vehicle; detecting an object around the vehicle; and, when an object is detected, displaying, in addition to the first type overhead image, the generated second type overhead image corresponding to the direction of the detected object.
 It should be noted that any combination of the above-described constituent elements, and any conversion of the expression of the present embodiment between a method, an apparatus, a system, a recording medium, a computer program, and the like, are also effective as aspects of the present embodiment.
 According to this embodiment, it is possible to make it easier for the driver to grasp the situation in the vicinity of the vehicle, particularly the presence of an object that becomes an obstacle.
FIGS. 1(a)-(b) are views showing the appearance of a vehicle according to Embodiment 1 of the present invention.
FIG. 2 is a diagram showing the configuration of the vehicle display device according to Embodiment 1 of the present invention.
FIG. 3 is a perspective view showing the imaging ranges formed around the vehicle of FIGS. 1(a)-(b).
FIG. 4 is a diagram showing the first type overhead image generated by the first image generation unit of FIG. 2.
FIG. 5 is a perspective view showing other imaging ranges formed around the vehicle of FIGS. 1(a)-(b).
FIG. 6 is a diagram showing an overhead image generated by the display control unit of FIG. 2.
FIG. 7 is a diagram showing another overhead image generated by the display control unit of FIG. 2.
FIG. 8 is a flowchart showing a display procedure by the vehicle display device of FIG. 2.
FIG. 9 is a flowchart showing a display procedure by the vehicle display device according to Embodiment 2 of the present invention.
FIG. 10 is a diagram showing the configuration of the vehicle display device according to Embodiment 3 of the present invention.
FIG. 11 is a diagram showing an overhead image generated by the display control unit of FIG. 10.
(Embodiment 1)
 Before describing the present invention specifically, the premise will be described. Embodiment 1 of the present invention relates to a vehicle display device that generates an overhead image by viewpoint conversion from images captured by a plurality of imaging units installed in a vehicle, and displays the generated overhead image. An object in the vicinity of the vehicle appears in the overhead image, but an object located away from the vehicle, such as an obstacle more than 1 m away, does not. To display such a distant object, the vehicle display device according to this embodiment performs the following processing. The vehicle is provided with a plurality of first imaging units at its lower portion, and the vehicle display device generates a first type overhead image from the first type images captured by the respective first imaging units. The vehicle is also provided with a plurality of second imaging units at positions higher than the first imaging units, and the vehicle display device generates a second type overhead image from the second type images captured by the respective second imaging units.
 Furthermore, the vehicle is also provided with sensors, and the vehicle display device detects the presence of an object from the detection results of the sensors. The vehicle display device displays the first type overhead image when the presence of an object is not detected, but displays the second type overhead image in addition to the first type overhead image when the presence of an object is detected. The second imaging unit is installed higher than the first imaging unit, at a position from which it can capture a range farther away than the first imaging unit. The second type overhead image can therefore show objects farther away than the first type overhead image can. On the other hand, when no object is present, display of the second type overhead image is unnecessary. The display is therefore switched depending on whether an object is present.
 Hereinafter, embodiments of the present invention will be described with reference to the drawings. Specific numerical values and the like shown in the embodiments are merely examples for facilitating understanding of the invention, and do not limit the present invention unless otherwise specified. In the present specification and drawings, elements having substantially the same function and configuration are denoted by the same reference numerals and redundant description is omitted, and elements not directly related to the present invention are not illustrated.
 FIGS. 1(a)-(b) show the appearance of the vehicle 100 according to Embodiment 1 of the present invention. FIG. 1(a) shows a top view of the vehicle 100, and FIG. 1(b) shows a side view. A first front imaging unit 10 is installed in a front portion of the vehicle 100, for example on a bumper or bonnet. A first rear imaging unit 12 is installed in a rear portion of the vehicle 100, for example on a bumper or trunk. A first left side imaging unit 14 is installed on a left side portion of the vehicle, for example below the left door mirror. A first right side imaging unit 16 is installed on the right side of the vehicle so as to be symmetrical with the first left side imaging unit 14. Here, the first front imaging unit 10 through the first right side imaging unit 16 are collectively referred to as the first imaging units.
 The second front imaging unit 18 is arranged inside the vehicle near the front of the roof of the vehicle 100, and the second rear imaging unit 20 is arranged inside the vehicle near the rear of the roof. The second front imaging unit 18 and the second rear imaging unit 20 are arranged at positions higher than the first imaging units and oriented so as to capture a more distant range. They can therefore image objects farther away than the first imaging units can. Although the second front imaging unit 18 and the second rear imaging unit 20 are installed in the passenger compartment near the roof of the vehicle 100, they may instead be installed on the front and rear bumpers or bodies of the vehicle 100, as long as they are positioned higher than the first front imaging unit 10 and the first rear imaging unit 12. Here, the second front imaging unit 18 and the second rear imaging unit 20 are collectively referred to as the second imaging units. In FIGS. 1(a)-(b) the second imaging units are disposed only at the front and rear of the vehicle 100, but they may also be disposed at the front, rear, left, and right.
 The front sensor 22 is disposed at the front of the vehicle 100, like the first front imaging unit 10, and the rear sensor 24 is disposed at the rear of the vehicle 100, like the first rear imaging unit 12. For example, the front sensor 22 is disposed in the vicinity of the first front imaging unit 10, and the rear sensor 24 is disposed in the vicinity of the first rear imaging unit 12.
 FIG. 2 shows the configuration of the vehicle display device 50 according to Embodiment 1 of the present invention. Connected to the vehicle display device 50 are the first front imaging unit (first front camera) 10, the first rear imaging unit (first rear camera) 12, the first left side imaging unit (first left camera) 14, the first right side imaging unit (first right camera) 16, the second front imaging unit (second front camera) 18, the second rear imaging unit (second rear camera) 20, the front sensor 22, the rear sensor 24, and the display panel 52. The vehicle display device 50 includes a first acquisition unit 30, a first image generation unit 32, a display control unit 34, a second acquisition unit 36, a second image generation unit 38, and an object detection unit 40.
 The first front imaging unit 10, the first rear imaging unit 12, the first left side imaging unit 14, and the first right side imaging unit 16 are arranged as shown in FIG. 1. FIG. 3 is a perspective view showing the imaging ranges formed around the vehicle 100. The first front imaging unit 10 forms a front imaging region 60 extending forward from it and captures images in that region. The first rear imaging unit 12 forms a rear imaging region 62 extending rearward from it and captures images in that region. The first left side imaging unit 14 forms a left side imaging region 64 extending to the left from it and captures images in that region. The first right side imaging unit 16 forms a right side imaging region 66 extending to the right from it and captures images in that region.
 The front imaging region 60 shown hatched in FIG. 3 indicates that, within the range the first front imaging unit 10 can capture, the range from the hatched area up to directly below the installation position of the first front imaging unit 10 on the vehicle 100 is cut out and viewpoint-converted by the first image generation unit 32. Similarly, the rear imaging region 62, the left side imaging region 64, and the right side imaging region 66 indicate the corresponding ranges cut out and viewpoint-converted for the first rear imaging unit 12, the first left side imaging unit 14, and the first right side imaging unit 16, respectively. The periphery of the vehicle 100 is imaged by the images captured by these first imaging units. Returning to FIG. 2.
 As described above, the first front imaging unit 10, the first rear imaging unit 12, the first left side imaging unit 14, and the first right side imaging unit 16 capture images. The images are moving images, but they may instead be still images captured continuously. These four imaging units output their captured images to the first acquisition unit 30. The first acquisition unit 30 acquires an image (hereinafter, a "first type image") from each of the first front imaging unit 10, the first rear imaging unit 12, the first left side imaging unit 14, and the first right side imaging unit 16. That is, the first acquisition unit 30 acquires first type images capturing the periphery of the vehicle 100. The first type images acquired by the first acquisition unit 30 are processed by the first image generation unit 32.
 The first image generation unit 32 processes the first type images acquired by the first acquisition unit 30. The first image generation unit 32 converts the viewpoint of the first type images so that they appear as seen from above the vehicle 100, thereby generating a first type overhead image. A known technique may be used for this conversion and overhead image generation: for example, each pixel of an image is projected onto a curved solid surface in a virtual three-dimensional space, and the required region of that curved surface is cut out according to a virtual viewpoint above the vehicle 100. The cut-out region corresponds to the viewpoint-converted image. FIG. 4 shows an example of the generated overhead image.
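The curved-surface projection above is left to known techniques; as an illustrative sketch only, the viewpoint conversion for pixels on the ground plane can be approximated by an inverse warp through a homography that maps each output pixel of the overhead image back to a source camera pixel. The function name, the homography input, and the nearest-neighbour sampling below are assumptions for illustration, not the patent's actual implementation.

```python
import numpy as np

def warp_to_overhead(image, H, out_shape):
    """Inverse warp: for each pixel (u, v) of the overhead output, apply
    the output-to-source homography H and sample the source image with a
    nearest-neighbour lookup.  Pixels mapping outside the source stay 0."""
    h_out, w_out = out_shape
    out = np.zeros((h_out, w_out), dtype=image.dtype)
    for v in range(h_out):
        for u in range(w_out):
            x, y, w = H @ np.array([u, v, 1.0])
            xs, ys = int(round(x / w)), int(round(y / w))
            if 0 <= ys < image.shape[0] and 0 <= xs < image.shape[1]:
                out[v, u] = image[ys, xs]
    return out
```

In practice the homography would be calibrated per camera, and the per-pixel loop replaced by a precomputed lookup table for real-time use.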
 FIG. 4 shows the first type overhead image 80 generated by the first image generation unit 32. An own-vehicle icon 78 is arranged in the central portion of the first type overhead image 80 in FIG. 4. The own-vehicle icon 78 is an image showing the top surface of the vehicle 100. A front image 70 is arranged in front of the own-vehicle icon 78, a rear image 72 behind it, a left side image 74 on its left, and a right side image 76 on its right. Returning to FIG. 2. In other words, the first image generation unit 32 generates the first type overhead image 80 by converting the viewpoint of the first type images acquired by the first acquisition unit 30 so that they appear as seen from above the vehicle 100. The first type overhead image 80 generated by the first image generation unit 32 is processed by the display control unit 34.
 The display control unit 34 performs processing for displaying the first type overhead image 80 generated by the first image generation unit 32, causing the first type overhead image 80 to appear on the display panel 52. The first type overhead image 80 may be displayed on the display panel 52 at any time the surroundings of the vehicle need to be checked, for example while the reverse gear of the vehicle 100 is selected, as when backing into a garage. The display panel 52 then shows a first type overhead image 80 such as that in FIG. 4.
 The second front imaging unit 18 and the second rear imaging unit 20 are arranged as shown in FIG. 1. FIG. 5 is a perspective view showing other imaging ranges formed around the vehicle 100. The second front imaging unit 18 forms a front imaging region 63 extending forward from the second front imaging unit 18 and captures an image in the front imaging region 63. The front imaging region 63 extends farther ahead of the vehicle 100 than the front imaging region 60. The second rear imaging unit 20 forms a rear imaging region 65 extending rearward from the second rear imaging unit 20 and captures an image in the rear imaging region 65. The rear imaging region 65 extends farther behind the vehicle 100 than the rear imaging region 62.
 The front imaging region 63 shown hatched in FIG. 5 indicates that, within the range the second front imaging unit 18 can capture, the area from the hatched region up to the point directly below the installation position of the second front imaging unit 18 on the vehicle 100 is the range cut out and viewpoint-converted by the second image generation unit 38. Similarly, the rear imaging region 65 indicates that, within the range the second rear imaging unit 20 can capture, the area from the hatched region up to the point directly below the installation position of the second rear imaging unit 20 on the vehicle 100 is the range cut out and viewpoint-converted by the second image generation unit 38. A comparison between the images captured by the second front imaging unit 18 and second rear imaging unit 20 and those captured by the first front imaging unit 10 and first rear imaging unit 12 is given later. The periphery of the vehicle 100 is also captured by the images taken by these second imaging units. Returning to FIG. 2.
 The second front imaging unit 18 captures images forward from a position higher than the first front imaging unit 10. The images captured by the second front imaging unit 18 therefore include locations farther from the vehicle 100 than the images captured by the first front imaging unit 10; that is, the second front imaging unit 18 can image farther away than the first front imaging unit 10. The imaging range of the second front imaging unit 18 may partially overlap the imaging range of the first front imaging unit 10, or may be a range that does not overlap it.
 The second rear imaging unit 20 captures images rearward from a position higher than the first rear imaging unit 12. The images captured by the second rear imaging unit 20 therefore include locations farther from the vehicle 100 than the images captured by the first rear imaging unit 12; that is, the second rear imaging unit 20 can image farther away than the first rear imaging unit 12. The imaging range of the second rear imaging unit 20 may partially overlap the imaging range of the first rear imaging unit 12, or may be a range that does not overlap it. Here too, the images are moving images, but they may instead be still images captured continuously. The second front imaging unit 18 and the second rear imaging unit 20 output their captured images to the second acquisition unit 36.
 The second acquisition unit 36 acquires an image (hereinafter, a "second type image") from each of the second front imaging unit 18 and the second rear imaging unit 20. That is, the second acquisition unit 36 acquires second type images captured from positions higher than the first type images. The second type images acquired by the second acquisition unit 36 are processed by the second image generation unit 38.
 The second image generation unit 38 processes the second type images acquired by the second acquisition unit 36. The second image generation unit 38 converts the viewpoint of a second type image so that it appears as seen from above the vehicle 100, thereby generating a second type overhead image. The processing in the second image generation unit 38 is the same as that in the first image generation unit 32. That is, the second image generation unit 38 generates the second type overhead image 82 by converting the viewpoint of a second type image acquired by the second acquisition unit 36 so that it appears as seen from above the vehicle. Such a second type overhead image 82 is an overhead image of a range farther from the vehicle 100 than the first type overhead image 80. Note that whereas the first type overhead image 80 is synthesized from images in the four directions from the front image 70 through the right side image 76, the second type overhead image 82 is generated from a second type image in a single direction. The second type overhead image 82 generated by the second image generation unit 38 is processed by the display control unit 34.
 The front sensor 22 and the rear sensor 24 are arranged as shown in FIG. 1. The front sensor 22 and the rear sensor 24 are, for example, millimeter wave sensors or infrared sensors. The second imaging units may also serve as these sensors: when the second imaging units are used as the front sensor 22 and the rear sensor 24, the object detection unit 40 performs edge detection processing or the like on the images they capture to detect obstacles. The front sensor 22 and the rear sensor 24 are each assigned an identification number for distinguishing between them.
 The object detection unit 40 is connected to the front sensor 22 and the rear sensor 24 and detects objects around the vehicle 100. A known technique may be used for object detection, so a detailed description is omitted here. For example, when an infrared laser is used for the front sensor 22 or the rear sensor 24, the infrared laser is emitted over the range in the detection direction of the vehicle 100, and an object is detected based on the time difference until the laser reflected by the object is received. The detection range of the object detection unit 40 is set to reach farther than the imaging range of the first type images acquired by the first acquisition unit 30. When the object detection unit 40 detects an object with either the front sensor 22 or the rear sensor 24 and the distance to the detected object is greater than a threshold, it notifies the display control unit 34 of the detection. At that time, the identification number of the front sensor 22 or rear sensor 24 that detected the object is also reported. Here, the threshold is set to a value at the far edge of the imaging range of the first type images.
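The round-trip timing and threshold comparison described above can be sketched as follows. This is a minimal illustration under assumed names; the patent does not specify the sensor interface, and a real sensor would report time or distance through its own driver API.

```python
C_LIGHT = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s):
    """Distance to an object from the round-trip time of a reflected
    infrared pulse (out and back, hence the division by two)."""
    return round_trip_time_s * C_LIGHT / 2.0

def should_notify(distance_m, threshold_m):
    """The display control unit is notified only when the detected object
    lies beyond the far edge of the first type image's imaging range,
    which is the threshold."""
    return distance_m > threshold_m
```

With the threshold set at the far edge of the first type image's range, objects that already appear in the first type overhead image never trigger the additional display.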
 The display control unit 34 performs display processing of the first type overhead image 80 generated by the first image generation unit 32, and also performs display processing of the second type overhead image 82 generated by the second image generation unit 38. When the object detection unit 40 detects an object, the display control unit 34 receives the notification and the identification number from the object detection unit 40. While no notification from the object detection unit 40 has been received, the display control unit 34 displays the first type overhead image 80 on the display panel 52 as before. When it receives a notification from the object detection unit 40, the display control unit 34 displays the second type overhead image 82 on the display panel 52 in addition to the first type overhead image 80. The timing at which the second type overhead image 82 is displayed in addition to the first type overhead image 80 is the moment the notification from the object detection unit 40 is received after the first type overhead image 80 has been displayed.
 FIG. 6 shows an overhead image generated by the display control unit 34. This corresponds to the case where a notification has been received from the object detection unit 40 and the received identification number indicates the rear sensor 24. When the received identification number indicates the rear sensor 24, an object behind the vehicle 100 has been detected by the rear sensor 24. As illustrated, the display control unit 34 arranges the second type overhead image 82, in which the detected object appears as an obstacle 84, below the first type overhead image 80, specifically below the rear image 72. In this way, the display control unit 34 displays, on the side of the first type overhead image 80 corresponding to the direction in which the object detection unit 40 detected the object, the second type overhead image 82 for that direction. A display such as FIG. 6 based on an object detected by the rear sensor 24 is appropriate when the vehicle 100 is reversing.
 FIG. 7 shows another overhead image generated by the display control unit 34. This corresponds to the case where a notification has been received from the object detection unit 40 and the received identification number indicates the front sensor 22. When the received identification number indicates the front sensor 22, an object in front of the vehicle 100 has been detected by the front sensor 22. As illustrated, the display control unit 34 arranges the second type overhead image 82, in which the obstacle 84 appears, above the first type overhead image 80, specifically above the front image 70. Here too, the display control unit 34 displays, on the side of the first type overhead image 80 corresponding to the direction in which the object detection unit 40 detected the object, the second type overhead image 82 for that direction. A display such as FIG. 7 based on an object detected by the front sensor 22 is appropriate when the vehicle 100 is moving forward.
 The processing of the display control unit 34 for generating such an overhead image is now described in more detail. When the received identification number indicates the rear sensor 24, the display control unit 34 selects the second type overhead image 82 generated from the second type image captured by the second rear imaging unit 20. The display control unit 34 arranges the selected second type overhead image 82 below the rear image 72 corresponding to the first rear imaging unit 12, which, like the second rear imaging unit 20, faces rearward. In doing so, the display control unit 34 may display the second type overhead image 82 with a wider angle of view than that of the first type overhead image 80 in the direction in which the object detection unit 40 detected the object, or may display the second type overhead image 82 enlarged relative to it. Known techniques may be used for widening the angle of view and for enlarged display, so their description is omitted here. As shown in FIG. 6, the display control unit 34 may also move the first type overhead image 80 upward and display the second type overhead image 82 below the first type overhead image 80.
 Conversely, when the received identification number indicates the front sensor 22, the display control unit 34 selects the second type overhead image 82 generated from the second type image captured by the second front imaging unit 18, and arranges the selected second type overhead image 82 above the front image 70 corresponding to the first front imaging unit 10, which, like the second front imaging unit 18, faces forward. In this case as well, the display control unit 34 may display the second type overhead image 82 with a wider angle of view than that of the first type overhead image 80 in the direction in which the object detection unit 40 detected the object, or may display it enlarged. As shown in FIG. 7, the display control unit 34 may also move the first type overhead image 80 downward and display the second type overhead image 82 above the first type overhead image 80.
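The placement rules of the two preceding paragraphs reduce to a small lookup keyed by which sensor reported the object. The sketch below uses invented string identifiers; the device itself works with numeric identification numbers and image buffers, not names.

```python
def overhead_layout(sensor_id):
    """Vertical stacking order (top to bottom) of the display panel
    contents, keyed by the sensor whose identification number was
    reported.  The string names are illustrative placeholders."""
    if sensor_id == "rear":
        # first type overhead image 80 shifts up; the rear second type
        # overhead image 82 is placed below the rear image 72 (FIG. 6)
        return ["first_type_overhead_80", "second_type_overhead_82"]
    if sensor_id == "front":
        # first type overhead image 80 shifts down; the front second type
        # overhead image 82 is placed above the front image 70 (FIG. 7)
        return ["second_type_overhead_82", "first_type_overhead_80"]
    return ["first_type_overhead_80"]  # no distant object detected
```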
 By displaying as in FIGS. 6 and 7, when an obstacle 84 exists beyond the range displayed as the first type overhead image 80, the second type overhead image 82 is displayed in the direction in which the obstacle 84 was detected, effectively extending the display range of the overhead view. The driver can therefore grasp the presence and relative position of obstacles more appropriately by checking the overhead display in addition to direct observation.
 Moreover, the second imaging units, from which the second acquisition unit 36 acquires the second type images, are arranged higher than the first imaging units, from which the first acquisition unit 30 acquires the first type images. When the second imaging units are arranged near the roof of the vehicle 100 as shown in FIG. 1, they are higher than the driver's position, so the obstacle 84 is viewed from above the driver's own viewpoint, and its three-dimensional form and the like can be grasped more appropriately.
 In terms of hardware, this configuration can be realized by the CPU, memory, and other LSIs of any computer; in terms of software, it can be realized by a program loaded into memory, or the like. The functional blocks realized by their cooperation are depicted here. Those skilled in the art will therefore understand that these functional blocks can be realized in various forms by hardware alone, by software alone, or by a combination thereof.
 The operation of the vehicle display device 50 configured as above will now be described. FIG. 8 is a flowchart showing the display procedure of the vehicle display device 50. When the display condition for the first type overhead image 80 is satisfied (Y in S10), the display control unit 34 displays the first type overhead image 80 on the display panel 52 (S12). If neither the front sensor 22 nor the rear sensor 24 detects an obstacle 84 (N in S14), the procedure returns to step 10. If the front sensor 22 or the rear sensor 24 detects an obstacle 84 (Y in S14) but the obstacle 84 is not beyond the range of the first type overhead image 80 (N in S16), the procedure returns to step 10. If the obstacle 84 is beyond the range of the first type overhead image 80 (Y in S16), the display control unit 34 superimposes the second type overhead image 82 in the direction of the first type overhead image 80 in which the obstacle 84 was detected (S18), and the procedure returns to step 16. If the display condition for the first type overhead image 80 is not satisfied (N in S10), the process ends.
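One pass of the FIG. 8 decision flow can be condensed into a pure function over the three conditions checked in S10, S14, and S16. The function and its return values are an illustrative sketch, not an interface defined by the patent.

```python
def panel_contents(display_condition, obstacle_detected, beyond_first_range):
    """One evaluation of the FIG. 8 flow: returns the list of overhead
    images the display panel should currently show."""
    if not display_condition:                     # S10: N -> process ends
        return []
    if obstacle_detected and beyond_first_range:  # S14: Y and S16: Y
        return ["first_type", "second_type"]      # S18: superimposed display
    return ["first_type"]                         # S12: first type only
```

The actual device evaluates this repeatedly, which is why the flowchart loops back to S10 or S16 rather than terminating.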
 According to the present embodiment, when no object is detected, only the first type overhead image is displayed, making it easy for the driver to grasp the situation near the vehicle. When an object is detected, the second type overhead image, captured from a position higher than the first type overhead image, is also displayed, making it easy for the driver to notice the presence of distant objects. Since the second type overhead image corresponding to the direction of the detected object is displayed, shrinking the first type overhead image can be avoided. Furthermore, since objects can be detected beyond the imaging range of the first type images, objects not included in the first type overhead image can be detected.
 In addition, since the second type overhead image is displayed on the side of the first type overhead image corresponding to the direction in which the object was detected, the positional relationship between the first type overhead image and the second type overhead image is easy to grasp. Displaying the second type overhead image with a wider angle of view than the first type overhead image makes the position of the object easy to identify, and displaying it enlarged makes the presence of the object easy to recognize.
 The virtual viewpoint is generally set above the vicinity of the center of the vehicle 100, but the positions of the virtual viewpoint may differ between the first type overhead image 80 and the second type overhead image 82. For example, the first type overhead image 80 may be an overhead image whose virtual viewpoint is above the vicinity of the center of the vehicle 100, while the virtual viewpoint of the second type overhead image 82 is shifted forward of the vehicle 100 relative to that of the first type overhead image 80. The second type overhead image 82 displays a wider range ahead of the vehicle 100 than the first type overhead image 80, so setting the virtual viewpoint position of the second type overhead image 82 forward of the vehicle 100 reduces the visual discontinuity between the first type overhead image 80 and the second type overhead image 82 in a display form such as that of FIG. 7.
(Embodiment 2)
 Embodiment 2 will now be described. Like Embodiment 1, Embodiment 2 relates to a vehicle display device that generates overhead images by viewpoint conversion from images captured by a plurality of imaging units installed on a vehicle and displays the generated overhead images. The vehicle display device according to Embodiment 1 displays the second type overhead image in addition to the first type overhead image when it detects an object not included in the first type overhead image. By contrast, the vehicle display device according to Embodiment 2 starts displaying the second type overhead image in addition to the first type overhead image when it detects an object that is not included in the first type overhead image but is included in the second type overhead image. The vehicle 100 and the vehicle display device 50 according to Embodiment 2 are of the same type as those of FIGS. 1 and 2. The description here focuses on the differences from Embodiment 1.
 As before, the object detection unit 40 is connected to the front sensor 22 and the rear sensor 24 and detects objects around the vehicle 100. When the object detection unit 40 detects an object with either the front sensor 22 or the rear sensor 24 and the distance to the detected object is beyond the imaging range of the first type images and within the imaging range of the second type images, it notifies the display control unit 34 of the detection.
 When the object detection unit 40 detects an object, the display control unit 34 receives the notification and the identification number from the object detection unit 40. When it receives the notification from the object detection unit 40, the display control unit 34 displays the second type overhead image 82 on the display panel 52 in addition to the first type overhead image 80. Here too, the timing at which the second type overhead image 82 is displayed in addition to the first type overhead image 80 is the moment the notification from the object detection unit 40 is received after the first type overhead image 80 has been displayed. That is, when the object detected by the object detection unit 40 is outside the range of the first type overhead image 80 and within the range of the second type overhead image 82, the display control unit 34 starts displaying, in addition to the first type overhead image 80, the second type overhead image 82 corresponding to the direction of the detected object.
 The operation of the vehicle display device 50 configured as above will now be described. FIG. 9 is a flowchart showing the display procedure performed by the vehicle display device 50 according to the second embodiment of the present invention. When the display condition for the first type overhead image 80 is satisfied (Y in S100), the display control unit 34 causes the display panel 52 to display the first type overhead image 80 (S102). If neither the front sensor 22 nor the rear sensor 24 detects an obstacle 84 (N in S104), the process returns to step S100. If the front sensor 22 or the rear sensor 24 detects an obstacle 84 (Y in S104) but the obstacle 84 is not farther than the range of the first type overhead image 80, or is not within the range of the second type overhead image 82 (N in S106), the process returns to step S100. When the obstacle 84 is farther than the range of the first type overhead image 80 and within the range of the second type overhead image 82 (Y in S106), the display control unit 34 superimposes the second type overhead image 82 in the direction of the first type overhead image 80 in which the obstacle 84 was detected (S108), and the process returns to step S106. When the display condition for the first type overhead image 80 is not satisfied (N in S100), the process ends.
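One pass through the flowchart of FIG. 9 (steps S100 to S108) can be sketched as follows. The function name `display_step`, the tuple representation of an obstacle, and the numeric ranges are hypothetical stand-ins for the device's internal state.

```python
def display_step(display_condition_met, obstacle, first_range, second_range):
    """One iteration of the FIG. 9 procedure.

    Returns the set of images to show, or None when the display
    condition for the first type overhead image is not met (S100: N).
    `obstacle` is None (S104: N) or a (direction, distance) tuple.
    """
    if not display_condition_met:               # S100: N -> end
        return None
    shown = {"first"}                           # S102: show first type overhead image
    if obstacle is not None:                    # S104: Y
        direction, distance = obstacle
        # S106: farther than the first type range AND within the second type range
        if first_range < distance <= second_range:
            shown.add(("second", direction))    # S108: superimpose in that direction
    return shown

assert display_step(False, None, 3, 10) is None
assert display_step(True, None, 3, 10) == {"first"}
assert display_step(True, ("rear", 5), 3, 10) == {"first", ("second", "rear")}
assert display_step(True, ("rear", 2), 3, 10) == {"first"}
```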
 According to the present embodiment, when an object is outside the range of the first type overhead image and within the range of the second type overhead image, the second type overhead image corresponding to the direction of the detected object is displayed in addition to the first type overhead image, so an object that is not included in the first type overhead image but is included in the second type overhead image can be displayed reliably. Furthermore, since the second type overhead image is displayed in the direction of the first type overhead image in which the object was detected, the positional relationship between the first type overhead image and the second type overhead image can be grasped easily.
(Embodiment 3)
 Embodiment 3 will now be described. Like the preceding embodiments, Embodiment 3 relates to a vehicle display device that generates overhead images by viewpoint conversion from images captured by a plurality of imaging units installed on a vehicle and displays the generated overhead images. In the preceding embodiments, the second type overhead image is not displayed while an object is included in the first type overhead image. In Embodiment 3, by contrast, the second type overhead image is displayed in addition to the first type overhead image even when the object is included in the first type overhead image. In that case, the display is made such that it can be determined that the object included in the first type overhead image and the object included in the second type overhead image are the same. The vehicle 100 according to Embodiment 3 is of the same type as in FIG. 1. The description below focuses on the differences from the preceding embodiments.
 FIG. 10 shows the configuration of the vehicle display device 50 according to Embodiment 3 of the present invention. This vehicle display device 50 further includes an identity determination unit 42 in addition to the configuration of the vehicle display device 50 shown in FIG. 2. The object detection unit 40 is connected, as before, to the front sensor 22 and the rear sensor 24 and detects objects around the vehicle 100. When the object detection unit 40 detects an object with either the front sensor 22 or the rear sensor 24, it notifies the display control unit 34 and the identity determination unit 42 of the detection.
 The identity determination unit 42 is notified of the detection of an object by the object detection unit 40, together with the position of the detected object. The identity determination unit 42 also receives the first type overhead image 80 from the first image generation unit 32 and the second type overhead image 82 from the second image generation unit 38. Furthermore, the identity determination unit 42 acquires the position information and the traveling direction of the vehicle 100; since known techniques can be used for this acquisition, a description is omitted here. Because the identity determination unit 42 knows the angle of view of the first type overhead image 80 in advance, it derives the coordinates of each of the pixels included in the first type overhead image 80 from the position information and traveling direction of the vehicle 100 and the angle of view of the first type overhead image 80. The identity determination unit 42 performs the same processing on the second type overhead image 82 to derive the coordinates of each of the pixels included in the second type overhead image 82.
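One minimal way to derive world coordinates for overhead-image pixels is sketched below. It assumes the vehicle sits at the image centre, the image's "up" axis is the vehicle heading, and a fixed metres-per-pixel scale; these are simplifying assumptions, since the patent only states that known techniques are used.

```python
import math

def pixel_to_world(px, py, image_w, image_h, m_per_px,
                   vehicle_x, vehicle_y, heading_rad):
    """Map an overhead-image pixel (px, py) to world coordinates,
    given the vehicle position, its heading, and the image scale."""
    # Pixel offset from the image centre, converted to metres:
    dx = (px - image_w / 2) * m_per_px   # metres to the right of the vehicle
    dy = (image_h / 2 - py) * m_per_px   # metres ahead of the vehicle
    # Rotate the vehicle-frame offset into the world frame:
    wx = vehicle_x + dx * math.cos(heading_rad) - dy * math.sin(heading_rad)
    wy = vehicle_y + dx * math.sin(heading_rad) + dy * math.cos(heading_rad)
    return wx, wy

# The image centre maps to the vehicle's own position:
assert pixel_to_world(100, 100, 200, 200, 0.05, 10, 20, 0.0) == (10.0, 20.0)
```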
 When the position (coordinates) at which the object detection unit 40 detected the object is included in both the first type overhead image 80 and the second type overhead image 82, the identity determination unit 42 determines that the same object is included in both images. The identity determination unit 42 may also perform image recognition processing on the first type overhead image 80 and the second type overhead image 82 and additionally compare the shapes of the objects obtained by that processing before determining that the same object is included. In this way, the identity determination unit 42 determines the identity, between the first type overhead image 80 and the second type overhead image 82, of the object detected by the object detection unit 40, and outputs the determination result to the display control unit 34.
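The containment-based identity judgment can be sketched as below, modelling each overhead image's ground coverage as an axis-aligned box purely for simplicity; the patent does not prescribe any particular region shape.

```python
def is_same_object(obj_xy, first_region, second_region):
    """Judge identity as described above: the detected object's world
    coordinates fall inside both the first type and the second type
    overhead image regions, given as (x_min, y_min, x_max, y_max)."""
    def contains(region, xy):
        x_min, y_min, x_max, y_max = region
        x, y = xy
        return x_min <= x <= x_max and y_min <= y <= y_max
    return contains(first_region, obj_xy) and contains(second_region, obj_xy)

# Illustrative overlapping near and far regions behind the vehicle:
first = (-3.0, -3.0, 3.0, 3.0)
second = (-5.0, -12.0, 5.0, -1.0)
assert is_same_object((0.0, -2.0), first, second)      # visible in both images
assert not is_same_object((0.0, -8.0), first, second)  # only in the far image
```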
 When the object detection unit 40 detects an object, the display control unit 34 displays the second type overhead image 82 in addition to the first type overhead image 80. When displaying the second type overhead image 82 in addition to the first type overhead image 80, the display control unit 34 performs, based on the determination result of the identity determination unit 42, a display that makes it possible to determine the identity of the object shown in each of the first type overhead image 80 and the second type overhead image 82. While the detected object is shown only in the second type overhead image 82, such an identity-indicating display is unnecessary. However, as the vehicle 100 approaches the detected object and the object comes to be shown in both the second type overhead image 82 and the first type overhead image 80, the identity-indicating display is performed.
 FIG. 11 shows an overhead image generated by the display control unit 34. As in FIG. 6, this corresponds to the case where the notification from the object detection unit 40 has been received and the acquired identification number indicates the rear sensor 24. As illustrated, the display control unit 34 places the second type overhead image 82, in which the obstacle 84 is shown, below the first type overhead image 80, specifically below the rear image 72. The obstacle 84 is also shown in the first type overhead image 80. Here, the obstacle 84 included in the first type overhead image 80 and the obstacle 84 included in the second type overhead image 82 have been determined to be the same object by the identity determination unit 42. Accordingly, the same-object marker 86 is attached to the obstacle 84 in the first type overhead image 80 and to the obstacle 84 in the second type overhead image 82. The same-object marker 86 is a display that makes the identity of the object determinable, such as a frame of the same shape or the same color surrounding each instance.
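A hypothetical sketch of assigning same-object markers: each matched pair of detections receives one shared frame colour so the viewer can pair the near-view and far-view instances. The function name, the palette, and the data layout are all illustrative assumptions.

```python
def assign_markers(matches):
    """Give each matched object pair one shared marker style.
    `matches` is a list of (near_view_bbox, far_view_bbox) pairs."""
    palette = ["red", "yellow", "cyan", "magenta"]  # illustrative colours
    markers = []
    for i, (bbox_near, bbox_far) in enumerate(matches):
        markers.append({
            "near": bbox_near,
            "far": bbox_far,
            "colour": palette[i % len(palette)],  # same colour for both frames
        })
    return markers

m = assign_markers([((10, 10, 20, 20), (5, 5, 9, 9))])
assert m[0]["colour"] == "red"  # both frames of the pair share one colour
```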
 According to the present embodiment, when the second type overhead image is displayed in addition to the first type overhead image, a display is performed that makes it possible to determine the identity of the object shown in each of the first type overhead image and the second type overhead image, so the same object shown in both images can be recognized easily. Moreover, since the same object shown in both the first type overhead image and the second type overhead image is recognized easily, the position of the object can be recognized easily in a situation where the vehicle is approaching the object.
 The present invention has been described above based on the embodiments. These embodiments are merely illustrative, and it will be understood by those skilled in the art that various modifications are possible in the combinations of their constituent elements and processing steps, and that such modifications also fall within the scope of the present invention.
 10 first front imaging unit, 12 first rear imaging unit, 14 first left side imaging unit, 16 first right side imaging unit, 18 second front imaging unit, 20 second rear imaging unit, 22 front sensor, 24 rear sensor, 30 first acquisition unit, 32 first image generation unit, 34 display control unit, 36 second acquisition unit, 38 second image generation unit, 40 object detection unit, 50 vehicle display device, 52 display panel, 100 vehicle.
 According to the present invention, the driver can more easily grasp the situation in the vicinity of the vehicle, in particular the presence of an object that may become an obstacle.

Claims (8)

  1.  A vehicle display device comprising:
     a first acquisition unit that acquires a first type image capturing the periphery of a vehicle;
     a first image generation unit that generates a first type overhead image by converting the viewpoint of the first type image acquired by the first acquisition unit so that it appears as viewed from above the vehicle;
     a display control unit that causes the first type overhead image generated by the first image generation unit to be displayed;
     a second acquisition unit that acquires a second type image capturing, from a position higher than that of the first type image acquired by the first acquisition unit, a range farther from the vehicle than the first type image;
     a second image generation unit that generates a second type overhead image by converting the viewpoint of the second type image acquired by the second acquisition unit so that it appears as viewed from above the vehicle; and
     an object detection unit that detects an object in the periphery of the vehicle,
     wherein, when the object detection unit detects an object, the display control unit also causes the second type overhead image generated by the second image generation unit, corresponding to the direction of the detected object, to be displayed in addition to the first type overhead image.
  2.  The vehicle display device according to claim 1, wherein the object detection unit has a detection range extending farther than the imaging range of the first type image acquired by the first acquisition unit.
  3.  The vehicle display device according to claim 1 or 2, wherein the display control unit causes the second type overhead image generated by the second image generation unit to be displayed in the direction, within the first type overhead image generated by the first image generation unit, in which the object detection unit detected the object.
  4.  The vehicle display device according to any one of claims 1 to 3, wherein the display control unit causes the second type overhead image to be displayed with an angle of view wider than the angle of view of the first type overhead image in the direction in which the object detection unit detected the object.
  5.  The vehicle display device according to any one of claims 1 to 3, wherein the display control unit causes the second type overhead image to be displayed enlarged relative to the angle of view of the first type overhead image in the direction in which the object detection unit detected the object.
  6.  The vehicle display device according to any one of claims 1 to 5, wherein
     the second image generation unit generates a second type overhead image covering a range farther from the vehicle than the first type overhead image generated by the first image generation unit, and
     the display control unit starts display of the second type overhead image corresponding to the direction of the detected object, in addition to the first type overhead image, when the object detected by the object detection unit is outside the range of the first type overhead image and within the range of the second type overhead image.
  7.  The vehicle display device according to any one of claims 1 to 6, further comprising an identity determination unit that determines the identity, between the first type overhead image and the second type overhead image, of the object detected by the object detection unit,
     wherein, when displaying the second type overhead image in addition to the first type overhead image, the display control unit performs, based on the determination result of the identity determination unit, a display that makes it possible to determine the identity of the object shown in each of the first type overhead image and the second type overhead image.
  8.  A vehicle display method comprising:
     acquiring a first type image capturing the periphery of a vehicle;
     generating a first type overhead image by converting the viewpoint of the acquired first type image so that it appears as viewed from above the vehicle;
     displaying the generated first type overhead image;
     acquiring a second type image capturing, from a position higher than that of the first type image, a range farther from the vehicle than the first type image;
     generating a second type overhead image by converting the viewpoint of the acquired second type image so that it appears as viewed from above the vehicle;
     detecting an object in the periphery of the vehicle; and
     when an object is detected, displaying, in addition to the first type overhead image, the generated second type overhead image corresponding to the direction of the detected object.
PCT/JP2016/080091 2015-11-17 2016-10-11 Display device for vehicles and display method for vehicles WO2017086057A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201680051297.1A CN107950023B (en) 2015-11-17 2016-10-11 Vehicle display device and vehicle display method
EP16866059.5A EP3379827B1 (en) 2015-11-17 2016-10-11 Display device for vehicles and display method for vehicles
US15/935,143 US20180208115A1 (en) 2015-11-17 2018-03-26 Vehicle display device and vehicle display method for displaying images

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2015-224379 2015-11-17
JP2015224379 2015-11-17
JP2016-146181 2016-07-26
JP2016146181A JP6699427B2 (en) 2015-11-17 2016-07-26 Vehicle display device and vehicle display method

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/935,143 Continuation US20180208115A1 (en) 2015-11-17 2018-03-26 Vehicle display device and vehicle display method for displaying images

Publications (1)

Publication Number Publication Date
WO2017086057A1 true WO2017086057A1 (en) 2017-05-26

Family

ID=58718844

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/080091 WO2017086057A1 (en) 2015-11-17 2016-10-11 Display device for vehicles and display method for vehicles

Country Status (1)

Country Link
WO (1) WO2017086057A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109515321A (en) * 2017-09-19 2019-03-26 华创车电技术中心股份有限公司 Travelling image interface switch system and travelling image switching method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000064175A1 (en) * 1999-04-16 2000-10-26 Matsushita Electric Industrial Co., Ltd. Image processing device and monitoring system
JP2003212041A (en) * 2002-01-25 2003-07-30 Toyota Central Res & Dev Lab Inc Vehicle rear display device
JP2012064096A (en) * 2010-09-17 2012-03-29 Nissan Motor Co Ltd Vehicle image display device
JP2012185540A (en) * 2011-03-03 2012-09-27 Honda Elesys Co Ltd Image processing device, image processing method, and image processing program


Similar Documents

Publication Publication Date Title
JP6699427B2 (en) Vehicle display device and vehicle display method
US20170297488A1 (en) Surround view camera system for object detection and tracking
JP4899424B2 (en) Object detection device
JP4816923B2 (en) Vehicle peripheral image providing apparatus and method
JP5953824B2 (en) Vehicle rear view support apparatus and vehicle rear view support method
JP4731392B2 (en) In-vehicle peripheral status presentation device
JP6425991B2 (en) Towing vehicle surrounding image generating apparatus and method for generating towing vehicle surrounding image
JP6565188B2 (en) Parallax value deriving apparatus, device control system, moving body, robot, parallax value deriving method, and program
JP2009086787A (en) Vehicle detection device
JP7072641B2 (en) Road surface detection device, image display device using road surface detection device, obstacle detection device using road surface detection device, road surface detection method, image display method using road surface detection method, and obstacle detection method using road surface detection method
JP6743882B2 (en) Image processing device, device control system, imaging device, image processing method, and program
JP2009206747A (en) Ambient condition monitoring system for vehicle, and video display method
JP6597792B2 (en) Image processing apparatus, object recognition apparatus, device control system, image processing method and program
JP5098563B2 (en) Object detection device
CN107004250B (en) Image generation device and image generation method
JP6589313B2 (en) Parallax value deriving apparatus, device control system, moving body, robot, parallax value deriving method, and program
JP2008048094A (en) Video display device for vehicle, and display method of video images in vicinity of the vehicle
JP2020068515A (en) Image processing apparatus
JP3988551B2 (en) Vehicle perimeter monitoring device
JP2007249814A (en) Image-processing device and image-processing program
WO2017086057A1 (en) Display device for vehicles and display method for vehicles
JP7047291B2 (en) Information processing equipment, image pickup equipment, equipment control system, mobile body, information processing method and program
JP5984714B2 (en) Three-dimensional object detection device, driving support device, and three-dimensional object detection method
KR20220097656A (en) Driver asistance apparatus, vehicle and control method thereof
JP4799236B2 (en) In-vehicle display system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16866059

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2016866059

Country of ref document: EP