WO2017086057A1 - Display device for vehicles and display method for vehicles - Google Patents


Info

Publication number
WO2017086057A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
type
vehicle
overhead image
type overhead
Prior art date
Application number
PCT/JP2016/080091
Other languages
English (en)
Japanese (ja)
Inventor
昇 勝俣
Original Assignee
株式会社Jvcケンウッド
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2016146181A external-priority patent/JP6699427B2/ja
Application filed by 株式会社Jvcケンウッド
Priority to CN201680051297.1A (patent CN107950023B)
Priority to EP16866059.5A (patent EP3379827B1)
Publication of WO2017086057A1
Priority to US15/935,143 (patent US20180208115A1)

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00: Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22: Real-time viewing arrangements for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23: Real-time viewing arrangements for viewing an area outside the vehicle with a predetermined field of view
    • B60R1/27: Real-time viewing arrangements for viewing an area outside the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00: Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00: General purpose image data processing
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to vehicle display technology, and more particularly to a vehicle display device and a vehicle display method for displaying an image.
  • a technology is widely used in which cameras are provided at a plurality of locations on a vehicle, and the captured images are synthesized by converting the viewpoints to obtain a bird's-eye view image.
  • such a bird's-eye view image is displayed, for example, when entering a garage, and is used to confirm the vehicle's surroundings and grasp its position. When there is an obstacle in the vicinity of the vehicle while entering the garage, the obstacle is also displayed in the bird's-eye view image; however, since viewpoint conversion processing has been performed, it is difficult to grasp the distance to the obstacle. Therefore, when an obstacle is detected, an original image obtained by imaging the obstacle detection area is displayed (for example, see Patent Document 1).
  • the present invention has been made in view of such circumstances, and an object of the present invention is to provide a technique that makes it easier for the driver to grasp the situation in the vicinity of the vehicle, particularly the presence of an object that becomes an obstacle.
  • a vehicle display device includes: a first acquisition unit that acquires a first type image obtained by imaging the periphery of a vehicle; a first image generation unit that generates a first type overhead image by converting the viewpoint of the first type image acquired by the first acquisition unit as viewed from above the vehicle; a display control unit that displays the first type overhead image generated by the first image generation unit; a second acquisition unit that acquires a second type image obtained by imaging, from a position higher than that of the first type image acquired by the first acquisition unit, a range farther from the vehicle than the first type image; a second image generation unit that generates a second type overhead image by converting the viewpoint of the second type image acquired by the second acquisition unit as viewed from above the vehicle; and an object detection unit that detects objects around the vehicle.
  • when the object detection unit detects an object, the display control unit displays, in addition to the first type overhead image, the second type overhead image generated by the second image generation unit that corresponds to the direction of the detected object.
  • a vehicle display method includes: acquiring a first type image obtained by imaging the periphery of the vehicle; generating a first type overhead image by converting the viewpoint of the acquired first type image as viewed from above the vehicle; displaying the generated first type overhead image; acquiring a second type image obtained by imaging, from a position higher than that of the first type image, a range farther from the vehicle than the first type image; generating a second type overhead image by converting the viewpoint of the acquired second type image as viewed from above the vehicle; and detecting an object around the vehicle, wherein the second type overhead image generated corresponding to the direction of the detected object is displayed in addition to the first type overhead image.
  • any combination of the above-described constituent elements, and any representation of the present embodiment converted among a method, an apparatus, a system, a recording medium, a computer program, and the like, are also effective as aspects of the present embodiment.
  • FIGS. 1A and 1B are views showing the appearance of a vehicle according to Embodiment 1 of the present invention. FIG. 2 is a diagram showing the configuration of the vehicle display device according to Embodiment 1 of the present invention.
  • FIG. 3 is a perspective view showing an imaging range formed around the vehicle in FIGS. 1A and 1B. FIG. 4 is a diagram showing the first type overhead image generated by the first image generation unit in FIG. 2.
  • FIG. 5 is a perspective view showing another imaging range formed around the vehicle in FIGS. 1A and 1B. FIGS. 6 and 7 are diagrams showing bird's-eye view images generated by the display control unit in FIG. 2.
  • Embodiment 1 of the present invention relates to a vehicle display device that generates bird's-eye view images by viewpoint conversion of images captured by a plurality of imaging units installed in a vehicle, and displays the generated bird's-eye view images.
  • in such a bird's-eye view image, an object in the immediate vicinity of the vehicle is displayed, but an object farther away, such as an obstacle at a position 1 m or more from the vehicle, is not displayed.
  • the vehicle display device executes the following process in order to display an object that exists away from the vehicle.
  • the vehicle includes a plurality of first imaging units at a lower portion of the vehicle, and the vehicle display device generates a first type overhead image from the first type images captured by the plurality of first imaging units.
  • the vehicle is also provided with a plurality of second imaging units at positions higher than the first imaging units, and the vehicle display device generates a second type overhead image from the second type images captured by the plurality of second imaging units.
  • the vehicle is also provided with a sensor, and the vehicle display device detects the presence of an object based on the detection result of the sensor.
  • the vehicle display device displays only the first type overhead image when the presence of an object is not detected.
  • when the presence of an object is detected, the vehicle display device displays the second type overhead image in addition to the first type overhead image.
  • the second imaging unit is installed at a position higher than the height at which the first imaging unit is installed, where it can photograph farther away than the first imaging unit. For this reason, the second type overhead image can display objects farther away than the first type overhead image can.
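the relationship between installation height and visible range can be illustrated with simple flat-ground geometry (an illustrative sketch, not part of the disclosure; the heights used are hypothetical values): a ray leaving a camera at a given depression angle below horizontal meets the ground at a distance proportional to the camera height, so a higher camera with the same downward field edge covers ground farther from the vehicle.

```python
import math

def ground_reach(height_m: float, depression_deg: float) -> float:
    """Distance along the ground at which a ray leaving a camera at
    `height_m`, angled `depression_deg` below horizontal, meets the
    ground plane (flat-ground approximation)."""
    return height_m / math.tan(math.radians(depression_deg))

# a roof-level camera reaches farther than a bumper-level one
# at the same depression angle (illustrative heights)
assert ground_reach(1.5, 10.0) > ground_reach(0.6, 10.0)
```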
  • when no object is present beyond the range of the first type overhead image, the display of the second type overhead image is unnecessary. Therefore, the display is switched depending on whether or not such an object exists.
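the switching rule just described can be captured in a few lines (a sketch under assumed names; the function and its arguments are illustrations, not from the disclosure itself):

```python
def select_views(object_detected: bool, beyond_first_range: bool) -> list:
    """Which overhead images to show: the second type overhead image is
    added only when an object is detected beyond the range covered by
    the first type overhead image."""
    views = ["first_type_overhead"]
    if object_detected and beyond_first_range:
        views.append("second_type_overhead")
    return views
```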
  • FIGS. 1A to 1B show the appearance of the vehicle 100 according to the first embodiment of the present invention.
  • FIG. 1A shows a top view of the vehicle 100
  • FIG. 1B shows a side view.
  • a first front imaging unit 10 is installed in a front portion of the vehicle 100, for example, a bumper, a bonnet, or the like.
  • a first rear imaging unit 12 is installed in a rear portion of the vehicle 100, such as a bumper or a trunk.
  • a first left side imaging unit 14 is installed on a left side portion of the vehicle, for example, a lower portion of the left door mirror.
  • a first right side imaging unit 16 is installed on the right side of the vehicle so as to be symmetrical with the first left side imaging unit 14.
  • the first front imaging unit 10 to the first right imaging unit 16 are collectively referred to as a first imaging unit.
  • the second front imaging unit 18 is arranged inside the vehicle near the front of the roof of the vehicle 100, and the second rear imaging unit 20 is arranged inside the vehicle near the rear of the roof of the vehicle 100.
  • the second front imaging unit 18 and the second rear imaging unit 20 are arranged at positions higher than the first imaging units and oriented so that they can photograph a distant area. Therefore, the second front imaging unit 18 and the second rear imaging unit 20 can image objects farther away than the first imaging units can.
  • although the second front imaging unit 18 and the second rear imaging unit 20 are installed in the passenger compartment near the roof of the vehicle 100, it suffices that they are higher than the first front imaging unit 10 and the first rear imaging unit 12.
  • the second front imaging unit 18 and the second rear imaging unit 20 are collectively referred to as a second imaging unit. Further, in FIGS. 1A and 1B, the second imaging units are disposed only at the front and rear of the vehicle 100, but they may also be disposed at the front, rear, left, and right.
  • the front sensor 22 is disposed at the front of the vehicle 100, like the first front imaging unit 10, and the rear sensor 24 is disposed at the rear of the vehicle 100, like the first rear imaging unit 12.
  • the front sensor 22 is disposed in the vicinity of the first front imaging unit 10, and the rear sensor 24 is disposed in the vicinity of the first rear imaging unit 12.
  • FIG. 2 shows a configuration of the vehicle display device 50 according to the first embodiment of the present invention.
  • the vehicle display device 50 is connected to the first front imaging unit (first front camera) 10, the first rear imaging unit (first rear camera) 12, the first left side imaging unit (first left side camera) 14, the first right side imaging unit (first right side camera) 16, the second front imaging unit (second front camera) 18, the second rear imaging unit (second rear camera) 20, the front sensor 22, the rear sensor 24, and the display panel 52.
  • the vehicle display device 50 includes a first acquisition unit 30, a first image generation unit 32, a display control unit 34, a second acquisition unit 36, a second image generation unit 38, and an object detection unit 40.
  • the first front imaging unit 10, the first rear imaging unit 12, the first left side imaging unit 14, and the first right side imaging unit 16 are arranged as shown in FIG.
  • FIG. 3 is a perspective view showing an imaging range formed around the vehicle 100.
  • the first front imaging unit 10 forms a front imaging region 60 so as to go forward from the first front imaging unit 10, and captures an image in the front imaging region 60.
  • the first rear imaging unit 12 forms a rear imaging region 62 so as to go rearward from the first rear imaging unit 12, and captures an image in the rear imaging region 62.
  • the first left side imaging unit 14 forms a left side imaging region 64 so as to go to the left from the first left side imaging unit 14, and captures an image in the left side imaging region 64.
  • the first right side imaging unit 16 forms a right side imaging region 66 so as to go right from the first right side imaging unit 16, and captures an image in the right side imaging region 66.
  • the front imaging area 60 indicated by hatching in FIG. 3 indicates that, of the range the first front imaging unit 10 can photograph, the range extending from the hatched area to the position immediately below the installation position of the first front imaging unit 10 on the vehicle 100 is cut out by the first image generation unit 32 and subjected to viewpoint conversion processing.
  • likewise, the rear imaging area 62 indicates that, of the range the first rear imaging unit 12 can photograph, the range extending from the hatched area to the position immediately below the installation position of the first rear imaging unit 12 on the vehicle 100 is cut out by the first image generation unit 32 and subjected to viewpoint conversion processing.
  • the left side imaging area 64 indicates that, of the range the first left side imaging unit 14 can photograph, the range extending from the hatched area to the position immediately below the installation position of the first left side imaging unit 14 on the vehicle 100 is cut out by the first image generation unit 32 and subjected to viewpoint conversion processing.
  • the right side imaging area 66 indicates that, of the range the first right side imaging unit 16 can photograph, the range extending from the hatched area to the position immediately below the installation position of the first right side imaging unit 16 on the vehicle 100 is cut out by the first image generation unit 32 and subjected to viewpoint conversion processing.
  • in this way, the entire periphery of the vehicle 100 is covered by the images captured by the first imaging units.
  • the first front imaging unit 10, the first rear imaging unit 12, the first left side imaging unit 14, and the first right side imaging unit 16 capture an image as described above.
  • the image is a moving image, but may be a still image taken continuously.
  • the first front imaging unit 10, the first rear imaging unit 12, the first left side imaging unit 14, and the first right side imaging unit 16 output the captured images to the first acquisition unit 30.
  • the first acquisition unit 30 acquires images (hereinafter referred to as "first type images") from each of the first front imaging unit 10, the first rear imaging unit 12, the first left side imaging unit 14, and the first right side imaging unit 16. That is, the first acquisition unit 30 acquires first type images obtained by imaging the periphery of the vehicle 100.
  • the first type image acquired by the first acquisition unit 30 is processed by the first image generation unit 32.
  • the first image generation unit 32 executes processing of the first type image acquired by the first acquisition unit 30.
  • the first image generation unit 32 performs processing for converting the viewpoint of the first type image as viewed from above the vehicle 100 and generating the first type overhead image.
  • a known technique may be used for the viewpoint conversion and the overhead image generation processing. For example, each pixel of the image is projected onto a three-dimensional curved surface in a virtual three-dimensional space, and a necessary region of the three-dimensional curved surface is cut out according to a virtual viewpoint above the vehicle 100. The cut-out region corresponds to the image whose viewpoint has been converted.
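as a minimal stand-in for the projection described above, a flat-ground (planar) viewpoint conversion can be written as an inverse homography warp; the disclosure itself projects onto a three-dimensional curved surface, so this sketch only approximates that processing, and the function and matrix names are hypothetical:

```python
import numpy as np

def warp_to_overhead(src, H_inv, out_shape):
    """Inverse-mapping warp: for each pixel of the overhead (ground-plane)
    image, find its source-image coordinate via the inverse homography
    H_inv (3x3) and copy the nearest source pixel. Planar-ground
    approximation of the viewpoint conversion."""
    h, w = out_shape
    out = np.zeros((h, w) + src.shape[2:], dtype=src.dtype)
    for y in range(h):
        for x in range(w):
            u, v, s = H_inv @ np.array([x, y, 1.0])
            u, v = int(round(u / s)), int(round(v / s))  # dehomogenize
            if 0 <= v < src.shape[0] and 0 <= u < src.shape[1]:
                out[y, x] = src[v, u]
    return out
```

with the identity homography the warp is a pass-through, which makes the mapping easy to sanity-check before substituting a real calibration matrix.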
  • An example of the generated overhead image is shown in FIG.
  • FIG. 4 shows a first type overhead image 80 generated by the first image generation unit 32.
  • the own vehicle icon 78 is an image showing the upper surface of the vehicle 100.
  • a front image 70 is arranged in front of the host vehicle icon 78
  • a rear image 72 is arranged behind the host vehicle icon 78
  • a left side image 74 is arranged on the left side of the host vehicle icon 78.
  • a right side image 76 is arranged on the right side.
  • the first image generation unit 32 thus generates the first type overhead image 80 by converting the viewpoint of the first type image acquired by the first acquisition unit 30 as viewed from above the vehicle 100.
  • the first type overhead image 80 generated by the first image generation unit 32 is processed by the display control unit 34.
  • the display control unit 34 executes a process of displaying the first type overhead image 80 generated by the first image generation unit 32.
  • the display control unit 34 displays the first type overhead image 80 on the display panel 52.
  • the first type bird's-eye view image 80 is displayed on the display panel 52 at any timing at which confirmation of the vehicle's surroundings is required, for example, when the reverse gear of the vehicle 100 is selected to enter a garage.
  • a first type bird's-eye view image 80 as shown in FIG. 4 is displayed on the display panel 52.
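the layout of FIG. 4, with four viewpoint-converted images arranged around the own vehicle icon 78, can be sketched as a simple array composition (array sizes and function names here are hypothetical):

```python
import numpy as np

def compose_first_type_overhead(front, rear, left, right, icon):
    """Arrange the four converted views around the own-vehicle icon as
    in FIG. 4: front image above, rear image below, left and right
    images beside the icon. Assumes grayscale arrays whose sizes tile
    exactly: left/icon/right share a height, and their widths sum to
    the width of the front and rear images."""
    middle = np.hstack([left, icon, right])   # left | icon | right
    return np.vstack([front, middle, rear])   # front on top, rear below
```

with, say, 2x6 front and rear strips and 3x2 side strips around a 3x2 icon, the result is a 7x6 overhead mosaic.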
  • the second front imaging unit 18 and the second rear imaging unit 20 are arranged as shown in FIG.
  • FIG. 5 is a perspective view showing another imaging range formed around the vehicle 100.
  • the second front imaging unit 18 forms a front imaging region 63 so as to go forward from the second front imaging unit 18, and captures an image in the front imaging region 63.
  • the front imaging area 63 extends in front of the vehicle 100 more than the front imaging area 60.
  • the second rear imaging unit 20 forms a rear imaging region 65 so as to extend rearward from the second rear imaging unit 20 and captures an image in the rear imaging region 65.
  • the rear imaging area 65 extends to the rear of the vehicle 100 more than the rear imaging area 62.
  • the front imaging area 63 indicated by hatching in FIG. 5 indicates that, of the range the second front imaging unit 18 can photograph, the range extending from the hatched area to the position immediately below the installation position of the second front imaging unit 18 on the vehicle 100 is cut out by the second image generation unit 38 and subjected to viewpoint conversion processing.
  • likewise, the rear imaging area 65 indicates that, of the range the second rear imaging unit 20 can photograph, the range extending from the hatched area to the position immediately below the installation position of the second rear imaging unit 20 on the vehicle 100 is cut out by the second image generation unit 38 and subjected to viewpoint conversion processing.
  • the second front imaging unit 18 captures an image forward from a position higher than the first front imaging unit 10. Therefore, the image captured by the second front imaging unit 18 includes a location farther from the vehicle 100 than the image captured by the first front imaging unit 10. That is, the second front imaging unit 18 can image farther than the first front imaging unit 10.
  • the imaging range of the second front imaging unit 18 may partially overlap the imaging range of the first front imaging unit 10 or may be a range that does not overlap with the imaging range of the first front imaging unit 10.
  • the second rear imaging unit 20 captures an image backward from a position higher than the first rear imaging unit 12. Therefore, the image captured by the second rear imaging unit 20 includes a place farther from the vehicle 100 than the image captured by the first rear imaging unit 12. That is, the second rear imaging unit 20 can image farther than the first rear imaging unit 12.
  • the imaging range of the second rear imaging unit 20 may partially overlap the imaging range of the first rear imaging unit 12 or may be a range that does not overlap with the imaging range of the first rear imaging unit 12. In this case as well, the image is a moving image, but may be a still image taken continuously.
  • the second front imaging unit 18 and the second rear imaging unit 20 output the captured image to the second acquisition unit 36.
  • the second image generation unit 38 executes processing of the second type image acquired by the second acquisition unit 36.
  • the second image generation unit 38 performs processing for converting the viewpoint of the second type image as viewed from above the vehicle 100 and generating the second type overhead image 82.
  • the processing in the second image generation unit 38 is the same as the processing in the first image generation unit 32. That is, the second image generation unit 38 generates the second type overhead image 82 by converting the viewpoint of the second type image acquired by the second acquisition unit 36 as viewed from above the vehicle.
  • Such a second type overhead image 82 is an overhead image in a range farther from the vehicle 100 than the first type overhead image 80.
  • in the first type bird's-eye view image 80, images in the four directions from the front image 70 to the right side image 76 are synthesized, whereas the second type bird's-eye view image 82 is generated from a second type image in a single direction.
  • the second type overhead image 82 generated by the second image generation unit 38 is processed by the display control unit 34.
  • the front sensor 22 and the rear sensor 24 are arranged as shown in FIG.
  • the front sensor 22 and the rear sensor 24 are, for example, millimeter wave sensors or infrared sensors. Alternatively, the second imaging unit may serve as the sensor.
  • in that case, the object detection unit 40 performs edge detection processing or the like on the image captured by the second imaging unit to detect an obstacle. An identification number is assigned to each of the front sensor 22 and the rear sensor 24.
  • the object detection unit 40 is connected to the front sensor 22 and the rear sensor 24 and detects objects around the vehicle 100. Since a known technique may be used for the detection of objects, a specific description is omitted here. For example, when an infrared laser is used for the front sensor 22 or the rear sensor 24, an object is detected based on the time difference between irradiating the infrared laser over the range in the detection direction of the vehicle 100 and receiving the infrared laser reflected by the object. Note that the detection range of the object detection unit 40 is set to be farther than the imaging range of the first type image acquired by the first acquisition unit 30.
  • the object detection unit 40 detects an object with either the front sensor 22 or the rear sensor 24, and notifies the display control unit 34 of the detection when the distance to the detected object is larger than a threshold value. At that time, it also notifies the identification number of the front sensor 22 or the rear sensor 24 that detected the object.
  • the threshold value is set to a value beyond the far edge of the imaging range of the first type image.
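the time-of-flight detection and the far-side threshold described above reduce to two small calculations (a sketch; the constant and function names, and the shape of the notification decision, are assumptions rather than details from the disclosure):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to the reflecting object from an infrared echo: the
    light travels out and back, so halve the total path length."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

def should_notify(distance_m: float, first_image_range_m: float) -> bool:
    """Notify the display control unit only when the object lies beyond
    the far edge of the first type image's range (the threshold)."""
    return distance_m > first_image_range_m
```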
  • the display control unit 34 executes display processing of the first type overhead image 80 generated by the first image generation unit 32.
  • the display control unit 34 also executes display processing of the second type overhead image 82 generated by the second image generation unit 38. Further, when an object is detected by the object detection unit 40, the display control unit 34 acquires a notification and an identification number from the object detection unit 40. When the notification from the object detection unit 40 is not acquired, the display control unit 34 displays the first type overhead image 80 on the display panel 52 as before. On the other hand, when acquiring the notification from the object detection unit 40, the display control unit 34 causes the display panel 52 to display the second type overhead image 82 in addition to the first type overhead image 80.
  • the timing at which the second type overhead image 82 is displayed on the display panel 52 in addition to the first type overhead image 80 is the timing at which the notification from the object detection unit 40 is acquired after the first type overhead image 80 is displayed.
  • FIG. 6 shows a bird's-eye view image generated by the display control unit 34. This corresponds to the case where the notification from the object detection unit 40 is acquired and the acquired identification number indicates the rear sensor 24.
  • the display control unit 34 arranges, below the rear image 72 of the first type overhead image 80, the second type overhead image 82 in which the obstacle 84 is displayed. In this way, the display control unit 34 displays the second type overhead image 82 corresponding to the direction of the detected object, in the direction in which the object detection unit 40 detected the object in the first type overhead image 80.
  • since the display as shown in FIG. 6 is performed based on an object detected by the rear sensor 24, it is suitable for when the vehicle 100 is moving backward.
  • FIG. 7 shows another overhead image generated by the display control unit 34. This corresponds to the case where the notification from the object detection unit 40 is acquired and the acquired identification number indicates the front sensor 22.
  • the display control unit 34 arranges, above the front image 70 of the first type bird's-eye view image 80, the second type bird's-eye view image 82 on which the obstacle 84 is displayed.
  • the display control unit 34 displays the second type overhead image 82 corresponding to the direction of the detected object in the direction in which the object detection unit 40 detects the object in the first type overhead image 80.
  • since the display as shown in FIG. 7 is performed based on an object detected by the front sensor 22, it is suitable for when the vehicle 100 is moving forward.
  • the display control unit 34 selects the second type overhead image 82 generated from the second type image captured by the second rear imaging unit 20.
  • the display control unit 34 arranges the selected second type overhead image 82 below the rear image 72 corresponding to the first rear imaging unit 12 facing rearward in the same manner as the second rear imaging unit 20.
  • the display control unit 34 may display the angle of view of the second type overhead image 82 wider than the angle of view of the first type overhead image 80 in the direction in which the object detection unit 40 detects the object.
  • the display control unit 34 may display the second type overhead image 82 at a larger magnification than the first type overhead image 80 in the direction in which the object detection unit 40 detects the object. A known technique may be used for the wide-angle display and the enlarged display, and description thereof is omitted here. Further, as shown in FIG. 6, the display control unit 34 may move the first type overhead image 80 upward and display the second type overhead image 82 below the first type overhead image 80.
  • the display control unit 34 selects the second type overhead image 82 generated from the second type image captured by the second front imaging unit 18.
  • the display control unit 34 arranges the selected second type overhead image 82 on the upper side of the front image 70 corresponding to the first front imaging unit 10 facing forward like the second front imaging unit 18.
  • the display control unit 34 may display the angle of view of the second type overhead image 82 wider than the angle of view of the first type overhead image 80 in the direction in which the object detection unit 40 detects the object.
  • the display control unit 34 may display the second type overhead image 82 at a larger magnification than the first type overhead image 80 in the direction in which the object detection unit 40 detects the object.
  • the display control unit 34 may move the first type overhead image 80 downward and display the second type overhead image 82 above the first type overhead image 80.
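the two placements (FIG. 6 for the rear sensor 24, FIG. 7 for the front sensor 22) can be summarized as a lookup keyed on the notifying sensor's identification number (the string identifiers and dictionary shape here are hypothetical):

```python
def place_second_overhead(sensor_id: str) -> dict:
    """How to attach the second type overhead image 82 to the first
    type overhead image 80, per the sensor that detected the object."""
    if sensor_id == "rear":
        # FIG. 6: move the first type image up, attach below the rear image 72
        return {"shift_first_type": "up", "attach_side": "below"}
    if sensor_id == "front":
        # FIG. 7: move the first type image down, attach above the front image 70
        return {"shift_first_type": "down", "attach_side": "above"}
    raise ValueError(f"unknown sensor id: {sensor_id}")
```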
  • according to such display, the display range of the overhead image is substantially expanded.
  • further, the second type overhead image 82 is displayed in the direction in which the obstacle 84 was detected. For this reason, the driver can easily notice the presence of the obstacle 84.
  • the second imaging unit from which the second acquisition unit 36 acquires the second type image is arranged at a higher position than the first imaging unit from which the first acquisition unit 30 acquires the first type image.
  • since the second imaging unit is disposed near the roof of the vehicle 100 as illustrated in FIGS. 1A and 1B, it is disposed at a position higher than the driver's viewpoint; the obstacle 84 is therefore viewed from above, and its three-dimensional form can be grasped more appropriately than from the driver's own viewpoint.
  • this configuration can be realized in hardware by the CPU, memory, or other LSI of an arbitrary computer, and in software by a program loaded into memory; the functional blocks drawn here are realized by their cooperation. Accordingly, those skilled in the art will understand that these functional blocks can be realized in various forms by hardware only, software only, or a combination thereof.
  • FIG. 8 is a flowchart showing a display procedure by the vehicle display device 50.
  • If the display conditions for the first type overhead image 80 are satisfied (Y in S10), the display control unit 34 causes the display panel 52 to display the first type overhead image 80 (S12). If neither the front sensor 22 nor the rear sensor 24 detects the obstacle 84 (N in S14), the process returns to Step 10.
  • If the front sensor 22 or the rear sensor 24 detects the obstacle 84 (Y in S14) but the obstacle 84 is not farther away than the range of the first type bird's-eye image 80 (N in S16), the process returns to Step 10.
  • If the obstacle 84 is farther away than that range (Y in S16), the display control unit 34 superimposes and displays the second type overhead image 82 in the detection direction of the obstacle 84 on the first type overhead image 80 (S18), and the process returns to Step 16. If the display conditions for the first type overhead image 80 are not satisfied (N in S10), the process is terminated.
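The branch structure of this flowchart can be sketched in a few lines of code. The fragment below is an illustrative sketch only: the function name, field names, and the numeric range value are assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Optional

FIRST_RANGE_M = 5.0  # assumed imaging range of the first type image

@dataclass
class Obstacle:
    distance_m: float   # distance from the vehicle to the detected object
    direction: str      # "front" or "rear", per the triggering sensor

def select_views(obstacle: Optional[Obstacle]) -> List[str]:
    """One pass of the S12-S18 decision: always show the first type
    overhead image; superimpose the second type overhead image only when
    an obstacle is detected beyond the first type image's range."""
    views = ["first_type_overhead"]                      # S12
    if obstacle is not None:                             # S14
        if obstacle.distance_m > FIRST_RANGE_M:          # S16
            views.append("second_type_overhead_" + obstacle.direction)  # S18
    return views
```

When no obstacle is detected, the list stays at the first type image alone, which corresponds to the return to Step 10 in the flowchart.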
  • According to the present embodiment, when the presence of an object is not detected, only the first type bird's-eye view image is displayed, so the driver can easily understand the situation in the vicinity of the vehicle. When the presence of an object is detected, the second type bird's-eye image, captured from a position higher than the first type image, is also displayed, so the driver can easily recognize the presence of a distant object. In addition, since only the second type overhead image corresponding to the detected direction of the object is displayed, reduction in the size of the first type overhead image can be suppressed. Furthermore, since the presence of an object farther away than the imaging range of the first type image can be detected, an object that is not included in the first type overhead image can still be detected.
  • Since the second type overhead image is displayed in the direction in which the object was detected within the first type overhead image, the positional relationship between the first type overhead image and the second type overhead image can be easily grasped.
  • Since the second type overhead image is displayed with a wider angle of view than the first type overhead image, the position where the object exists can be easily grasped.
  • Since the second type overhead image is displayed at a larger magnification than the first type overhead image, the presence of the object can be easily grasped.
  • In the above description, the virtual viewpoint is set above the center of the vehicle 100, but the first type overhead image 80 and the second type overhead image 82 may use different virtual viewpoint positions.
  • For example, the first type bird's-eye view image 80 may be a bird's-eye view image whose virtual viewpoint is above the center of the vehicle 100, while the second type bird's-eye view image 82 may be a bird's-eye view image whose virtual viewpoint is shifted toward the front of the vehicle 100 relative to the first type bird's-eye view image 80.
  • The second type overhead image 82 displays a wider range in front of the vehicle 100 than the first type overhead image 80. Therefore, by setting the virtual viewpoint position of the second type overhead image 82 toward the front of the vehicle 100, the visual discomfort between the first type overhead image 80 and the second type overhead image 82 in a display form such as that shown in FIG. 7 is reduced.
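The viewpoint shift described above amounts to translating the virtual camera position. A minimal sketch, assuming a simple (x, y, height) coordinate model that the patent itself does not specify:

```python
def virtual_viewpoint(vehicle_center, height_m, forward_offset_m=0.0):
    """Return a virtual viewpoint above the vehicle center.

    For the first type overhead image the viewpoint sits directly above
    the center (forward_offset_m = 0); for the second type overhead
    image it may be shifted toward the front of the vehicle
    (forward_offset_m > 0). Units and axes are illustrative assumptions.
    """
    x, y = vehicle_center
    return (x + forward_offset_m, y, height_m)
```

Shifting the second image's viewpoint forward keeps its wider frontal coverage visually consistent with the first image when the two are displayed together.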
  • Example 2 relates to a vehicular display device that generates a bird's-eye view image by viewpoint conversion with respect to images captured by a plurality of imaging units installed in the vehicle, and displays the generated bird's-eye view image.
  • the vehicle display device according to the first embodiment displays the second type overhead image in addition to the first type overhead image when the presence of an object not included in the first type overhead image is detected.
  • In contrast, the vehicle display device according to the second embodiment starts displaying the second type overhead image when an object that is not included in the first type overhead image but is included in the second type overhead image is detected.
  • The vehicle 100 and the vehicle display device 50 according to the second embodiment are of the same type as those shown in FIGS. The following description focuses on the differences from Example 1.
  • the object detection unit 40 is connected to the front sensor 22 and the rear sensor 24 as before, and detects objects around the vehicle 100.
  • When the object detection unit 40 detects an object with either the front sensor 22 or the rear sensor 24, and the distance to the detected object is beyond the imaging range of the first type image but within the imaging range of the second type image, it notifies the display control unit 34 of the object detection.
  • The display control unit 34 acquires the notification and an identification number from the object detection unit 40.
  • the display control unit 34 causes the display panel 52 to display the second type overhead image 82 in addition to the first type overhead image 80.
  • The timing at which the second type overhead image 82 is displayed on the display panel 52 in addition to the first type overhead image 80 is the timing at which the notification from the object detection unit 40 is acquired after the first type overhead image 80 has been displayed. That is, when the object detected by the object detection unit 40 is outside the range of the first type overhead image 80 and within the range of the second type overhead image 82, the display control unit 34 starts displaying, in addition to the first type overhead image 80, the second type overhead image 82 corresponding to the detected direction of the object.
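The trigger condition of Example 2, an object outside the first type overhead image's range but inside the second type overhead image's range, reduces to a band check on the detected distance. The range values below are placeholder assumptions, not figures from the patent:

```python
FIRST_RANGE_M = 5.0    # assumed range of the first type overhead image
SECOND_RANGE_M = 20.0  # assumed range of the second type overhead image

def should_start_second_overhead(distance_m: float) -> bool:
    """True when the detected object lies beyond the first type image's
    range yet within the second type image's range (Example 2 trigger)."""
    return FIRST_RANGE_M < distance_m <= SECOND_RANGE_M
```

Objects closer than the first range are already visible in the first type overhead image, and objects beyond the second range cannot be shown at all, so neither case starts the second display.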
  • FIG. 9 is a flowchart showing a display procedure by the vehicle display device 50 according to the second embodiment of the present invention.
  • If the display conditions for the first type overhead image 80 are satisfied (Y in S100), the display control unit 34 causes the display panel 52 to display the first type overhead image 80 (S102). If neither the front sensor 22 nor the rear sensor 24 detects the obstacle 84 (N in S104), the process returns to Step 100.
  • If the front sensor 22 or the rear sensor 24 detects the obstacle 84 (Y in S104) but the obstacle 84 is not farther away than the range of the first type overhead image 80, or is not included in the range of the second type overhead image 82 (N in S106), the process returns to Step 100.
  • Otherwise (Y in S106), the display control unit 34 superimposes and displays the second type bird's-eye view image 82 in the detection direction of the obstacle 84 on the first type overhead image 80 (S108), and the process returns to Step 106. If the display conditions for the first type overhead image 80 are not satisfied (N in S100), the process is terminated.
  • According to the present embodiment, when the object is outside the range of the first type overhead image and within the range of the second type overhead image, the second type overhead image corresponding to the direction of the detected object is displayed in addition to the first type overhead image, so an object that is not included in the first type overhead image but is included in the second type overhead image can be reliably displayed. In addition, since the second type overhead image is displayed in the direction in which the object was detected within the first type overhead image, the positional relationship between the first type overhead image and the second type overhead image can be easily grasped.
  • Example 3, like the preceding examples, relates to a vehicle display device that generates an overhead image by viewpoint conversion from images captured by a plurality of imaging units installed in a vehicle, and displays the generated overhead image.
  • While the presence of an object is not detected, the second type overhead image is not displayed.
  • When the presence of an object is detected, the second type overhead image is displayed in addition to the first type overhead image.
  • a display is made so that it can be determined that the object included in the first type overhead image and the object included in the second type overhead image are the same.
  • The vehicle 100 according to the third embodiment is of the same type as that shown in FIG. The following description focuses on the differences from the preceding examples.
  • FIG. 10 shows a configuration of the vehicle display device 50 according to the third embodiment of the present invention.
  • This vehicle display device 50 further includes an identity determination unit 42 in addition to the configuration of the vehicle display device 50 shown in FIG.
  • the object detection unit 40 is connected to the front sensor 22 and the rear sensor 24 as before, and detects objects around the vehicle 100. When the object detection unit 40 detects an object in either the front sensor 22 or the rear sensor 24, the object detection unit 40 notifies the display control unit 34 and the identity determination unit 42 of the detection of the object.
  • When notifying of the detection of an object, the object detection unit 40 also notifies of the position of the detected object.
  • The identity determination unit 42 receives the first type overhead image 80 from the first image generation unit 32 and the second type overhead image 82 from the second image generation unit 38. Furthermore, the identity determination unit 42 acquires the position information and traveling direction of the vehicle 100. Since a well-known technique may be used to acquire the position information and traveling direction of the vehicle 100, a description thereof is omitted.
  • The identity determination unit 42 determines that the same object is included when the position (coordinates) at which the object detection unit 40 detected the object is included in both the first type overhead image 80 and the second type overhead image 82.
  • Alternatively, the identity determination unit 42 may perform image recognition processing on the first type overhead image 80 and the second type overhead image 82, compare the shapes of the objects acquired in the image recognition processing, and determine that the same object is included when the shapes match.
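The coordinate-based variant of this identity check can be sketched as a containment test against the ground regions covered by the two overhead images. The rectangle model and all names here are assumptions for illustration, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass
class Region:
    """Axis-aligned ground area covered by an overhead image (assumed model)."""
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def is_same_object(detected_xy, first_region: Region, second_region: Region) -> bool:
    """Judge identity the way the description sketches it: the detected
    position (coordinates) must fall inside both the first type and the
    second type overhead image."""
    x, y = detected_xy
    return first_region.contains(x, y) and second_region.contains(x, y)
```

A position inside only the second (wider) region corresponds to a distant object shown in the second type image alone, for which no identity display is needed.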
  • In this way, the identity determination unit 42 determines, for the object detected by the object detection unit 40, its identity between the first type overhead image 80 and the second type overhead image 82.
  • the identity determination unit 42 outputs a determination result on whether or not the same object is included to the display control unit 34.
  • As in the preceding examples, the display control unit 34 displays the second type overhead image 82 in addition to the first type overhead image 80. Further, when displaying the second type overhead image 82 in addition to the first type overhead image 80, the display control unit 34 performs, based on the determination result by the identity determination unit 42, a display that makes it possible to determine the identity of the object shown in each of the first type overhead image 80 and the second type overhead image 82.
  • When the detected object is displayed only in the second type bird's-eye view image 82, such an identity-indicating display is unnecessary. However, when the vehicle 100 comes close to the detected object and the object is displayed in both the second type overhead image 82 and the first type overhead image 80, a display that makes the identity determinable is performed.
  • FIG. 11 shows an overhead image generated by the display control unit 34. Similar to FIG. 6, this corresponds to the case where the notification from the object detection unit 40 is acquired and the acquired identification number indicates the rear sensor 24.
  • The display control unit 34 arranges the second type overhead image 82, in which the obstacle 84 as the detected object is displayed, below the rear image 72 of the first type overhead image 80.
  • the obstacle 84 is also displayed on the first type overhead image 80.
  • the obstacle 84 included in the first type bird's-eye image 80 and the obstacle 84 included in the second type bird's-eye image 82 are determined to be the same by the identity determination unit 42.
  • the same object marker 86 is shown on the obstacle 84 included in the first type overhead image 80 and the obstacle 84 included in the second type overhead image 82.
  • The same object marker 86 is a display that makes it possible to determine the identity of an object, for example by surrounding each object with a frame of the same shape or the same color.
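One way to realize the same object marker 86 is to attach an identically styled frame to the matched object in each overhead image. The dictionary layout below is purely an illustrative assumption:

```python
def make_same_object_markers(object_id: int, color: str = "yellow") -> dict:
    """Build one marker per overhead image for a matched object; both
    frames share the same shape and color so the driver can tell the two
    depictions show a single object (same object marker 86)."""
    frame = {"object_id": object_id, "shape": "rect", "color": color}
    return {
        "first_type_overhead": dict(frame),   # marker in the first type image
        "second_type_overhead": dict(frame),  # marker in the second type image
    }
```

Keeping the two frames identical is the point: any per-image styling would defeat the at-a-glance identity cue.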
  • According to the present embodiment, since a display is performed that makes it possible to determine the identity of the objects shown in each of the first type overhead image and the second type overhead image, the same object displayed in both images can be easily recognized. Because the same object is easily recognized in both images, the position of the object can also be easily recognized in a situation where the vehicle approaches the object.
  • According to the present invention, it is possible to make it easier for the driver to grasp the situation in the vicinity of the vehicle, particularly the presence of an object that may become an obstacle.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Processing (AREA)

Abstract

A first acquisition unit (30) acquires a first type image in which the periphery of a vehicle is imaged. A first image generation unit (32) generates a first type overhead image by converting the viewpoint of the first type image so as to be viewed from above the vehicle. A display control unit (34) causes the first type overhead image to be displayed. A second acquisition unit (36) acquires a second type image in which a range farther from the vehicle than in the first type image is imaged from a position higher than for the first type image. A second image generation unit (38) generates a second type overhead image by converting the viewpoint of the second type image so as to be viewed from above the vehicle. An object detection unit (40) detects an object in the periphery of the vehicle. When an object is detected, the display control unit (34) causes a second type overhead image corresponding to the direction of the detected object to be displayed in addition to the first type overhead image.
PCT/JP2016/080091 2015-11-17 2016-10-11 Display device for vehicles and display method for vehicles WO2017086057A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201680051297.1A CN107950023B (zh) 2015-11-17 2016-10-11 Vehicle display device and vehicle display method
EP16866059.5A EP3379827B1 (fr) 2015-11-17 2016-10-11 Display device for vehicles and display method for vehicles
US15/935,143 US20180208115A1 (en) 2015-11-17 2018-03-26 Vehicle display device and vehicle display method for displaying images

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2015224379 2015-11-17
JP2015-224379 2015-11-17
JP2016146181A JP6699427B2 (ja) 2015-11-17 2016-07-26 Vehicle display device and vehicle display method
JP2016-146181 2016-07-26

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/935,143 Continuation US20180208115A1 (en) 2015-11-17 2018-03-26 Vehicle display device and vehicle display method for displaying images

Publications (1)

Publication Number Publication Date
WO2017086057A1 true WO2017086057A1 (fr) 2017-05-26

Family

ID=58718844

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/080091 WO2017086057A1 (fr) 2015-11-17 2016-10-11 Display device for vehicles and display method for vehicles

Country Status (1)

Country Link
WO (1) WO2017086057A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109515321A (zh) * 2017-09-19 2019-03-26 华创车电技术中心股份有限公司 Driving image interface switching system and driving image switching method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000064175A1 (fr) * 1999-04-16 2000-10-26 Matsushita Electric Industrial Co., Ltd. Dispositif de traitement d'images et systeme de surveillance
JP2003212041A (ja) * 2002-01-25 2003-07-30 Toyota Central Res & Dev Lab Inc 車輌後方表示装置
JP2012064096A (ja) * 2010-09-17 2012-03-29 Nissan Motor Co Ltd 車両用画像表示装置
JP2012185540A (ja) * 2011-03-03 2012-09-27 Honda Elesys Co Ltd 画像処理装置、画像処理方法、及び画像処理プログラム

Similar Documents

Publication Publication Date Title
JP6699427B2 (ja) Vehicle display device and vehicle display method
US20170297488A1 (en) Surround view camera system for object detection and tracking
JP4899424B2 (ja) Object detection device
JP4816923B2 (ja) Vehicle periphery image providing device and method
JP5953824B2 (ja) Vehicle rear view assistance device and vehicle rear view assistance method
JP4731392B2 (ja) In-vehicle surrounding situation presentation device
JP6425991B2 (ja) Towing vehicle surroundings image generation device and towing vehicle surroundings image generation method
JP6565188B2 (ja) Disparity value deriving device, device control system, movable body, robot, disparity value deriving method, and program
CN111971682B (zh) Road surface detection device, image display device, obstacle detection device, road surface detection method, image display method, and obstacle detection method
JP2009086787A (ja) Vehicle detection device
JP6743882B2 (ja) Image processing device, device control system, imaging device, image processing method, and program
JP6597792B2 (ja) Image processing device, object recognition device, device control system, image processing method, and program
JP2009206747A (ja) Vehicle surrounding situation monitoring device and video display method
JP5098563B2 (ja) Object detection device
CN107004250B (zh) Image generation device and image generation method
JP6589313B2 (ja) Disparity value deriving device, device control system, movable body, robot, disparity value deriving method, and program
JP2020068515A (ja) Image processing device
JP2008048094A (ja) Video display device for vehicle and display method of video around vehicle
JP3988551B2 (ja) Vehicle surroundings monitoring device
JP2007249814A (ja) Image processing device and image processing program
WO2017086057A1 (fr) Display device for vehicles and display method for vehicles
JP7047291B2 (ja) Information processing device, imaging device, device control system, movable body, information processing method, and program
JP5984714B2 (ja) Three-dimensional object detection device, driving assistance device, and three-dimensional object detection method
KR20220097656A (ko) Driver assistance apparatus, vehicle, and control method thereof
JP4799236B2 (ja) In-vehicle display system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16866059

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2016866059

Country of ref document: EP