WO2019073548A1 - Display control device and display control method - Google Patents


Info

Publication number
WO2019073548A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
control device
display surface
display control
Prior art date
Application number
PCT/JP2017/036790
Other languages
French (fr)
Japanese (ja)
Inventor
下谷 光生
中村 好孝
克治 淺賀
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to PCT/JP2017/036790 priority Critical patent/WO2019073548A1/en
Priority to JP2019547838A priority patent/JP6910457B2/en
Publication of WO2019073548A1 publication Critical patent/WO2019073548A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/31Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing stereoscopic vision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/26Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the rear of the vehicle
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present invention relates to a display control device that controls a display device, and to a display control method.
  • Patent Document 1 proposes an electronic mirror that displays the situation behind the vehicle in its travel lane and in an adjacent lane.
  • An autostereoscopic display is known that produces an image with a sense of depth, that is, a sense of perspective, by displaying left-eye and right-eye images that create parallax between the user's left and right eyes. If such an autostereoscopic display is applied to an electronic mirror, the user can perceive the electronic mirror as a stereoscopic surface, much like an existing door mirror.
  • The present invention has been made in view of the above problems, and an object of the present invention is to provide a technique that can reduce the number of cameras required for autostereoscopic display.
  • The display control device is a display control device that controls a display device capable of autostereoscopic display using images for the left eye and the right eye. It includes an acquisition unit that acquires one captured image of the vehicle periphery and display surface generation information for generating an autostereoscopic display surface (a surface rendered in the autostereoscopic display), and a control unit that performs control to display the captured image on the autostereoscopic display surface based on the captured image and the display surface generation information acquired by the acquisition unit.
  • According to the present invention, control is performed to display a captured image on the autostereoscopic display surface based on one captured image and display surface generation information. With such a configuration, the number of cameras required for autostereoscopic display can be reduced.
  • FIG. 1 is a block diagram showing a configuration of a display control device according to Embodiment 1.
  • FIG. 7 is a block diagram showing a configuration of a display control device according to Embodiment 2.
  • FIG. 10 is a diagram for describing a camera direction according to Embodiment 2.
  • Diagrams for describing examples of mirror adjustment.
  • FIG. 10 is a perspective view schematically showing an autostereoscopic display surface according to Embodiment 2.
  • FIG. 10 is a perspective view schematically showing an autostereoscopic display surface according to Embodiment 2.
  • FIG. 18 is a diagram showing a display example of an autostereoscopic display surface according to Embodiment 2.
  • A flowchart showing the operation of the display control apparatus according to Embodiment 2.
  • Diagrams showing the relationship between the rotation angle of an autostereoscopic display surface and the camera direction.
  • FIG. 21 is a perspective view schematically showing an autostereoscopic display surface according to a second modification of the second embodiment.
  • FIG. 21 is a perspective view schematically showing an autostereoscopic display surface according to a second modification of the second embodiment.
  • FIG. 21 is a perspective view schematically showing an autostereoscopic display surface according to a third modification of the second embodiment.
  • FIG. 18 is a diagram for describing a camera direction according to a fourth modification of the second embodiment.
  • FIG. 21 is a perspective view schematically showing an autostereoscopic display surface according to a fourth modification of the second embodiment.
  • FIG. 16 is a block diagram showing a configuration of a display control device according to Embodiment 3.
  • A diagram showing the imaging …
  • FIG. 18 is a block diagram showing a configuration of a display control device according to Embodiment 4.
  • FIG. 21 is a perspective view schematically showing an autostereoscopic display surface according to a fifth embodiment.
  • FIG. 21 is a perspective view schematically showing an autostereoscopic display surface according to a fifth embodiment.
  • FIG. 21 is a block diagram showing a configuration of a display control device according to Embodiment 6.
  • FIG. 16 is a flowchart showing the operation of the display control apparatus according to Embodiment 6.
  • FIG. 21 is a perspective view schematically showing an autostereoscopic display surface according to a seventh embodiment.
  • FIG. 21 is a diagram showing a display example of an autostereoscopic display surface according to a seventh embodiment.
  • FIG. 35 is a perspective view schematically showing an autostereoscopic display surface according to Variation 1 of Embodiment 7;
  • FIG. 35 is a perspective view schematically showing an autostereoscopic display surface according to Variation 1 of Embodiment 7;
  • FIG. 35 is a perspective view schematically showing an autostereoscopic display surface according to a second modification of the seventh embodiment.
  • FIG. 35 is a diagram showing a display example of an autostereoscopic display surface according to a second modification of the seventh embodiment.
  • FIG. 31 is a perspective view schematically showing an autostereoscopic display surface according to Variation 3 of Embodiment 7;
  • FIG. 31 is a perspective view schematically showing an autostereoscopic display surface according to Variation 3 of Embodiment 7;
  • FIG. 35 is a diagram showing a display example of an autostereoscopic display surface according to Variation 3 of Embodiment 7.
  • FIG. 35 is a perspective view schematically showing an autostereoscopic display surface according to Variation 4 of Embodiment 7;
  • FIG. 35 is a perspective view schematically showing an autostereoscopic display surface according to Variation 5 of Embodiment 7;
  • FIG. 35 is a perspective view schematically showing an autostereoscopic display surface according to Variation 5 of Embodiment 7;
  • FIG. 35 is a perspective view schematically showing an autostereoscopic display surface according to Variation 6 of Embodiment 7;
  • FIG. 35 is a diagram showing a display example of an autostereoscopic display surface according to Variation 6 of Embodiment 7.
  • FIG. 35 is a perspective view schematically showing an autostereoscopic display surface according to Variation 7 of Embodiment 7;
  • FIG. 35 is a top view schematically showing an autostereoscopic display surface according to Variation 7 of Embodiment 7;
  • FIG. 56 is a front view schematically showing an autostereoscopic display surface according to Variation 7 of Embodiment 7;
  • A block diagram showing a hardware configuration of a display control device according to another modification.
  • A block diagram showing a hardware configuration of a display control device according to another modification.
  • A block diagram showing a configuration of a server according to another modification.
  • A block diagram showing a configuration of a communication terminal according to another modification.
  • Embodiment 1. The vehicle on which the display control device according to the first embodiment of the present invention is mounted, and on which attention is focused, will be described as the "host vehicle".
  • FIG. 1 is a block diagram showing the configuration of the display control device 1 according to the first embodiment.
  • the display control device 1 of FIG. 1 includes an acquisition unit 11 and a control unit 12 and can control the display device 21.
  • the display device 21 is disposed, for example, on an instrument panel or a dashboard of the host vehicle.
  • the display device 21 can perform autostereoscopic display by displaying images for the left eye and the right eye.
  • The image for the left eye is an image viewed by the user's left eye but not by the right eye, and the image for the right eye is an image viewed by the user's right eye but not by the left eye.
  • a polarization filter method, a liquid crystal shutter method, or the like is used to display such images for the left eye and the right eye.
  • The image for the left eye and the image for the right eye differ partially from each other, to an extent that produces parallax between the user's left and right eyes, and this parallax enables the user to perceive a stereoscopic image.
  • the acquisition unit 11 acquires one captured image and display surface generation information.
  • One captured image is an image used for displaying on the display device 21.
  • The captured image is one image of the periphery of the host vehicle, such as one image of the area behind the vehicle, one image of the area beside the vehicle, or one image of the area in front of the vehicle.
  • For example, when an electronic door mirror is displayed, the captured image is a mirror-display image obtained by horizontally reversing an image of the area behind the host vehicle.
  • the captured image may be the entire image captured by a camera or the like, or may be a part of the image.
  • the display surface generation information is information for generating an autostereoscopic display surface which is a surface on the autostereoscopic display. Specific examples of the display surface generation information will be described in Embodiment 2 and later.
  • the acquisition unit 11 may be a camera or various interfaces.
  • the control unit 12 controls the display of the photographed image on the autostereoscopic display surface based on the one photographed image acquired by the acquisition unit 11 and the display surface generation information acquired by the acquisition unit 11.
  • For example, the control unit 12 generates, as the image for the left eye, an image in which the portion of the one captured image corresponding to the autostereoscopic display surface is shifted in one direction (left or right), and generates, as the image for the right eye, an image in which that portion is shifted in the opposite direction.
  • As a result, the parallax the user perceives for the portion corresponding to the autostereoscopic display surface differs from the parallax perceived for the rest of the image.
  • The portion corresponding to the autostereoscopic display surface therefore appears to the user at a position in the depth direction different from that of the rest of the image, so the autostereoscopic display surface appears stereoscopic.
  • The control unit 12 determines, based on the display surface generation information, whether to generate an autostereoscopic display surface tilted from the display screen of the display device 21. When it determines that a tilted autostereoscopic display surface should be generated, the control unit 12 changes the magnitude of the shift according to the tilt of the autostereoscopic display surface. The control unit 12 then controls the display device 21 to display the left-eye and right-eye images obtained by the above processing.
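The shift-based generation of left-eye and right-eye images described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the `region` bounds and the `base_shift` and `gain` parameters are assumptions, and a per-column shift is used to mimic a tilted surface whose columns sit at increasing depth.

```python
import numpy as np

def make_stereo_pair(captured, region, tilt_deg, base_shift=4, gain=0.1):
    """Build left-eye/right-eye images by horizontally shifting the region of
    the captured image that corresponds to the autostereoscopic display
    surface. `region` is (row0, row1, col0, col1); the shift grows with the
    surface tilt, so a tilted surface gets column-dependent disparity."""
    r0, r1, c0, c1 = region
    left = captured.copy()
    right = captured.copy()
    # Columns farther from the rotation axis (taken here as column c0) sit
    # deeper when the surface is tilted, so they get a larger disparity.
    depth_per_col = np.tan(np.radians(tilt_deg)) * np.arange(c1 - c0)
    for j in range(c1 - c0):
        d = int(round(base_shift + gain * depth_per_col[j]))
        src = captured[r0:r1, c0 + j]
        # Shift the column left for the left-eye image, right for the right-eye image.
        if c0 + j - d >= 0:
            left[r0:r1, c0 + j - d] = src
        if c0 + j + d < captured.shape[1]:
            right[r0:r1, c0 + j + d] = src
    return left, right
```

A viewer fusing the two images perceives the shifted region at a different depth than the rest of the picture, which is the effect the control unit 12 exploits.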
  • As described above, the display control device 1 according to the first embodiment performs control to display one captured image on the autostereoscopic display surface based on that captured image and the display surface generation information.
  • Thus, one electronic mirror or the like can be displayed autostereoscopically from a single captured image.
  • If two cameras are used, one acquiring a captured image of the left rear side of the host vehicle and one acquiring a captured image of the right rear side, the left and right electronic mirrors can both be displayed autostereoscopically.
  • The number of cameras required for autostereoscopic display of the left and right electronic mirrors can therefore be reduced, which reduces hardware cost and power consumption.
  • FIG. 2 is a block diagram showing a configuration of a display control device 1 according to Embodiment 2 of the present invention.
  • Among the constituent elements according to the second embodiment, those that are the same as or similar to the constituent elements described above are given the same reference numerals, and the description below focuses mainly on the differing elements.
  • The display control device 1 of FIG. 2 is connected to the left autostereoscopic display device 21a, the right autostereoscopic display device 21b, and the camera control device 26d.
  • Each of the left autostereoscopic display 21a and the right autostereoscopic display 21b corresponds to the display device 21 of FIG. 1.
  • The left autostereoscopic display 21a displays left-eye and right-eye images for displaying the left electronic mirror autostereoscopically, and the right autostereoscopic display 21b displays left-eye and right-eye images for displaying the right electronic mirror autostereoscopically.
  • One autostereoscopic display may be used instead of the left autostereoscopic display 21a and the right autostereoscopic display 21b.
  • When the left autostereoscopic display 21a and the right autostereoscopic display 21b need not be distinguished, they may be referred to as the "display 21".
  • the camera control device 26d is connected to the left rear side camera 26a, the right rear side camera 26b, and the operation device 26c.
  • the left rear side camera 26a is a camera for photographing the left rear side of the host vehicle, and is provided on the host vehicle instead of the left door mirror.
  • the right rear side camera 26b is a camera for photographing the right rear side of the host vehicle, and is provided on the host vehicle instead of the right door mirror.
  • each of the left rear side camera 26a and the right rear side camera 26b is not a panoramic camera or a wide-angle camera, but is a standard camera that captures an image with an angle of view of about 10 degrees.
  • Hereinafter, a mirror-display image based on an image captured by the left rear side camera 26a may be referred to as the "left camera image", and a mirror-display image based on an image captured by the right rear side camera 26b as the "right camera image"; when the two need not be distinguished, they may be referred to as a "camera image".
  • The direction in which the left rear side camera 26a shoots is referred to as the "left camera direction", and the direction in which the right rear side camera 26b shoots as the "right camera direction"; when these need not be distinguished, they may be referred to as the "camera direction".
  • FIG. 3 is a view showing an example of the left camera direction and the right camera direction in the host vehicle 3.
  • In the example of FIG. 3, the reference for the left camera direction is the straight-rear direction D1 of the vehicle 3, and the left camera direction forms an angle θl1 with D1.
  • Likewise, the reference for the right camera direction is the straight-rear direction D1 of the vehicle 3, and the right camera direction forms an angle θr1 with D1.
  • The reference direction of the camera direction may be, for example, the standard adjusted direction described later with reference to FIGS. 4 and 5, or another direction preset by the user and stored in the display control device 1.
  • Since the left camera direction and the angle θl1 are in one-to-one correspondence, they are treated interchangeably below and may be referred to as the left camera direction θl1; similarly, the right camera direction and the angle θr1 are treated interchangeably and may be referred to as the right camera direction θr1.
  • The operation device 26c in FIG. 2 is, for example, an operation switch, and accepts operations for changing the left camera direction and the right camera direction individually, and operations for changing the angles of view of the left rear side camera 26a and the right rear side camera 26b individually.
  • The camera control device 26d controls the camera direction, angle of view, and focus of each of the left rear side camera 26a and the right rear side camera 26b as appropriate, based on the operations received by the operation device 26c. With this configuration, the ranges photographed by the left rear side camera 26a and the right rear side camera 26b can be adjusted appropriately through operations on the operation device 26c.
  • FIGS. 4 and 5 are diagrams showing an example of adjustment of a mirror image of a standard door mirror.
  • Adjustment of the ranges photographed by the left rear side camera 26a and the right rear side camera 26b is performed in the same way as adjustment of a door mirror.
  • For example, a door mirror is adjusted so that the driver can see a range 5 m wide extending from the side of the host vehicle at a point 30 m behind the host vehicle 3.
  • Specifically, the right door mirror 4 is adjusted so that the driver sees the body of the vehicle 3 within about one quarter of the horizontal width of the mirror surface 4a from the left end of the mirror surface 4a, and sees the road 5 within about one third of the vertical height of the mirror surface 4a.
  • For the left door mirror, the same adjustment is performed with left and right reversed. The recommended adjustments differ between the left and right door mirrors, by country-specific recommendations, and by the size and type of the door mirrors.
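As an illustrative check of this geometry (a calculation added here, not stated in the patent): covering a strip 5 m wide starting at the vehicle's side, 30 m behind it, requires a horizontal angle of view of about atan(5/30) ≈ 9.5 degrees, consistent with the roughly 10-degree standard cameras described above for the rear side cameras.

```python
import math

def required_view_angle(width_m: float, distance_m: float) -> float:
    """Horizontal angle of view (degrees) needed for a camera aimed along the
    vehicle's side to cover a strip width_m wide at distance_m behind it."""
    return math.degrees(math.atan(width_m / distance_m))

angle = required_view_angle(5.0, 30.0)  # roughly 9.5 degrees
```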
  • The camera control device 26d not only controls the left rear side camera 26a and the right rear side camera 26b as described above, but also outputs to the display control device 1 the left camera image and the right camera image captured by these cameras, together with the left camera direction and the right camera direction of the left rear side camera 26a and the right rear side camera 26b.
  • the display control device 1 includes an image acquisition unit 11a, a display surface generation information acquisition unit 11b, a video processing unit 12a, and an image output unit 12b.
  • The image acquisition unit 11a and the display surface generation information acquisition unit 11b are concepts included in the acquisition unit 11 of FIG. 1, and the video processing unit 12a and the image output unit 12b are concepts included in the control unit 12 of FIG. 1.
  • the image acquisition unit 11a acquires a camera image from the camera control device 26d as a captured image.
  • The display surface generation information acquisition unit 11b acquires the camera direction from the camera control device 26d as a photographing direction, that is, a direction associated with the captured image.
  • The video processing unit 12a generates the autostereoscopic display surface based on the camera image, which is the captured image acquired by the image acquisition unit 11a, and the camera direction, which is the photographing direction acquired by the display surface generation information acquisition unit 11b. That is, in the second embodiment, the video processing unit 12a uses the camera direction as the display surface generation information described in the first embodiment.
  • FIG. 6 is a diagram showing an autostereoscopic display surface generated by the video processing unit 12a according to the second embodiment.
  • Hereinafter, the autostereoscopic display surface corresponding to the left camera image may be referred to as the "left autostereoscopic display surface", and the autostereoscopic display surface corresponding to the right camera image as the "right autostereoscopic display surface".
  • the left autostereoscopic display surface SL and the right autostereoscopic display surface SR are both flat.
  • For the left autostereoscopic display surface SL, the x, y, and z axes are defined in the upward, leftward, and depth directions of the display screen of the display device 21; for the right autostereoscopic display surface SR, the x, y, and z axes are defined in the upward, rightward, and depth directions.
  • The video processing unit 12a controls the rotation of the left autostereoscopic display surface SL about a predetermined axis based on the left camera direction θl1.
  • Specifically, the video processing unit 12a takes the right side of the left autostereoscopic display surface SL as the predetermined axis, and, relative to that axis, rotates the left autostereoscopic display surface SL into the depth direction from the display screen DS of the display device 21 by an angle θl2 equal to the left camera direction θl1.
  • Hereinafter, the angle θl2 defining the rotation of the left autostereoscopic display surface SL may be referred to as the "left rotation angle θl2".
  • Similarly, the video processing unit 12a controls the rotation of the right autostereoscopic display surface SR about a predetermined axis based on the right camera direction θr1.
  • Specifically, the video processing unit 12a takes the left side of the right autostereoscopic display surface SR as the predetermined axis, and, relative to that axis, rotates the right autostereoscopic display surface SR into the depth direction from the display screen DS by an angle θr2 equal to the right camera direction θr1.
  • Hereinafter, the angle θr2 defining the rotation of the right autostereoscopic display surface SR may be referred to as the "right rotation angle θr2"; when the left rotation angle θl2 and the right rotation angle θr2 need not be distinguished, they may be referred to simply as the "rotation angle".
  • FIG. 8 is a diagram showing a specific display example of the display of FIG.
  • The left autostereoscopic display 21a and the right autostereoscopic display 21b are provided together with a meter region 7 in which meters are arranged, and the left autostereoscopic display surface SL and the right autostereoscopic display surface SR are displayed as the left electronic mirror and the right electronic mirror, respectively.
  • The body of the host vehicle 3, the road 5, and another vehicle 6 are displayed on the left autostereoscopic display surface SL, and the body of the host vehicle 3 and the road 5 are displayed on the right autostereoscopic display surface SR.
  • FIG. 10 is a diagram showing a specific display example of the display of FIG.
  • When the left camera direction θl1 is sufficiently larger than 0 degrees, the body of the host vehicle 3 is not displayed on the left autostereoscopic display 21a.
  • Similarly, when the right camera direction θr1 is sufficiently larger than 0 degrees, the body of the host vehicle 3 is not displayed on the right autostereoscopic display 21b.
  • The image on the left side, which is the outer side of the left electronic mirror, appears positioned deeper than the image on the right side, which is the inner side of the left electronic mirror.
  • The outer side of the left electronic mirror may therefore be displayed shorter than its inner side.
  • Likewise, the outer side of the right electronic mirror may be displayed shorter than its inner side.
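The foreshortening described above, where the deeper edge of a tilted surface appears shorter, follows from simple perspective projection. The sketch below is illustrative pinhole-camera math, not taken from the patent; `viewer_dist` is an assumed viewing distance.

```python
import math

def foreshortening(width, height, tilt_deg, viewer_dist):
    """Illustrative perspective math (assumed pinhole model): a flat surface
    of size width x height rotated by tilt_deg about one vertical edge.
    Returns (projected_width, far_edge_height) as seen by a viewer at
    viewer_dist in front of the screen plane."""
    t = math.radians(tilt_deg)
    projected_width = width * math.cos(t)   # horizontal extent on screen
    far_edge_depth = width * math.sin(t)    # how deep the far edge sits
    # Perspective scaling: a feature at depth d appears scaled by D / (D + d).
    far_edge_height = height * viewer_dist / (viewer_dist + far_edge_depth)
    return projected_width, far_edge_height
```

With zero tilt the surface keeps its full size; as the tilt grows, both the projected width and the apparent height of the receding (outer) edge shrink.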
  • The image output unit 12b shown in FIG. 2 outputs to the left autostereoscopic display 21a the video signals of the left-eye and right-eye images for displaying the left autostereoscopic display surface SL generated by the video processing unit 12a.
  • Similarly, the image output unit 12b outputs to the right autostereoscopic display 21b the video signals of the left-eye and right-eye images for displaying the right autostereoscopic display surface SR generated by the video processing unit 12a.
  • FIG. 11 is a flowchart showing the operation of the display control device 1 according to the second embodiment. This operation is started, for example, when the accessory power supply of the host vehicle is turned on, or when the drive source of the host vehicle is turned on.
  • In step S1, the display surface generation information acquisition unit 11b acquires the left camera direction θl1 and the right camera direction θr1.
  • In step S2, the video processing unit 12a determines the left rotation angle θl2 of the left autostereoscopic display surface SL based on the left camera direction θl1, and similarly determines the right rotation angle θr2 of the right autostereoscopic display surface SR based on the right camera direction θr1.
  • In step S3, the image acquisition unit 11a acquires a left camera image and a right camera image.
  • In step S4, the video processing unit 12a performs image processing on the left camera image based on the left rotation angle θl2 to generate left-eye and right-eye images for displaying the left autostereoscopic display surface SL, and similarly performs image processing on the right camera image based on the right rotation angle θr2 to generate left-eye and right-eye images for displaying the right autostereoscopic display surface SR.
  • In step S5, the image output unit 12b outputs the video signals of the left-eye and right-eye images for the left autostereoscopic display surface SL to the left autostereoscopic display 21a, and outputs the video signals of the left-eye and right-eye images for the right autostereoscopic display surface SR to the right autostereoscopic display 21b. Thereafter, the process returns to step S1.
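The loop of steps S1 to S5 can be sketched as follows. The `camera_ctrl`, `video_proc`, and display interfaces are hypothetical stand-ins, not the patent's API; only the step ordering mirrors the flowchart, and (as in Embodiment 2) the rotation angle is taken equal to the camera direction.

```python
from dataclasses import dataclass

@dataclass
class StereoFrame:
    left_eye: object   # image shown to the left eye
    right_eye: object  # image shown to the right eye

def rotation_angle(camera_dir_deg: float) -> float:
    """Embodiment 2 uses the camera direction itself as the rotation angle."""
    return camera_dir_deg

def control_cycle(camera_ctrl, video_proc, displays):
    """One pass of the S1-S5 loop (hypothetical interfaces)."""
    # S1: acquire the current left/right camera directions.
    th_l1, th_r1 = camera_ctrl.camera_directions()
    # S2: derive the rotation angles of the two display surfaces.
    th_l2, th_r2 = rotation_angle(th_l1), rotation_angle(th_r1)
    # S3: acquire the left and right camera images.
    img_l, img_r = camera_ctrl.camera_images()
    # S4: generate left/right-eye images for each tilted display surface.
    frame_l = video_proc.render_surface(img_l, th_l2)
    frame_r = video_proc.render_surface(img_r, th_r2)
    # S5: output the video signals to the two autostereoscopic displays.
    displays["left"].show(frame_l)
    displays["right"].show(frame_r)
```

In the actual device this cycle would repeat from S1 as long as the accessory power or drive source is on.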
  • As described above, in the second embodiment the camera direction (the photographing direction) is changed based on an operation from outside the display control device 1. With such a configuration, the user can operate the rotation of the electronic mirror in the same manner as the rotation of an actual door mirror.
  • In the above description, the rotation angles θl2 and θr2 of the autostereoscopic display surfaces were the same as the camera directions θl1 and θr1, respectively; that is, the amount of rotation of the autostereoscopic display surface equaled the amount of change of the camera direction angle. However, the amount of rotation of the autostereoscopic display surface may differ from the amount of change of the camera direction angle.
  • the rotation angles ⁇ l2 and ⁇ r2 of the autostereoscopic display surface may be values of functions fl ( ⁇ l1) and fr ( ⁇ r1) of the camera directions ⁇ l1 and ⁇ r1.
  • the amount of rotation of the autostereoscopic display surface is equal to or less than the amount of change in the angle in the camera direction.
  • the video processing unit 12a displays the naked eye stereoscopic display It will rotate the surface.
  • the functions fl (.theta.l1) and fr (.theta.r1) are increasing functions of the camera directions .theta.l1 and .theta.r1, and the degree of increase in the value thereof is gradually reduced It is also good.
  • upper limit values such as 45 degrees in FIGS. 13 to 15 may be set to the values of the functions fl ( ⁇ 11) and fr ( ⁇ r1).
  • the functions fl ( ⁇ l1) and fr ( ⁇ r1) are not limited to the above, and any other equation can be used. Also, the function fl ( ⁇ l1) and the function fr ( ⁇ r1) may be the same or different. For example, even if the amount of change in the left camera direction ⁇ l1 and the amount of change in the right camera direction ⁇ r1 are the same, one of the left autostereoscopic display surface SL and the right autostereoscopic display surface SR that is closer to the driver's seat The functions fl ( ⁇ l1) and fr ( ⁇ r1) may be used such that the amount of rotation of the surface is smaller than the amount of rotation of the other display surface. According to such a configuration, the rotation of the autostereoscopic display surface can be controlled in consideration of the distance between the driver and the mirror.
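Such a function fl or fr — increasing, with a gradually decreasing rate of increase, and capped at an upper limit — can be sketched as follows. This is a minimal illustration, not part of the claimed configuration; the tanh shape, the 45-degree cap, and the gain constants are assumptions introduced for the example.

```python
import math

MAX_ROTATION_DEG = 45.0  # upper limit on the display-surface rotation, as in FIGS. 13 to 15

def rotation_angle(camera_dir_deg, gain_deg=30.0):
    # An increasing function of the camera direction whose rate of increase
    # gradually decreases, saturating below MAX_ROTATION_DEG; one possible
    # choice for fl(theta_l1) or fr(theta_r1).
    return MAX_ROTATION_DEG * math.tanh(camera_dir_deg / gain_deg)

def rotation_angle_near_seat(camera_dir_deg):
    # The display surface closer to the driver's seat can use a larger gain
    # constant so it rotates less for the same change in camera direction.
    return rotation_angle(camera_dir_deg, gain_deg=60.0)
```

For the same camera-direction change, `rotation_angle_near_seat` returns a smaller rotation than `rotation_angle`, matching the case where the nearer display surface rotates by a smaller amount.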
• The axis serving as the reference of the rotation of the left autostereoscopic display surface SL (hereinafter referred to as the "left rotation axis") is located at the right side of the left autostereoscopic display surface SL (FIG. 6).
• The axis serving as the reference of the rotation of the right autostereoscopic display surface SR (hereinafter referred to as the "right rotation axis") is located at the left side of the right autostereoscopic display surface SR (FIG. 6).
• The position of the left rotation axis may be the position of the central portion of the left autostereoscopic display surface SL in the left-right direction, and the position of the right rotation axis may be the position of the central portion of the right autostereoscopic display surface SR in the left-right direction.
• The position of the left rotation axis may be the position of the left side of the left autostereoscopic display surface SL, and the position of the right rotation axis may be the position of the right side of the right autostereoscopic display surface SR.
• The positions of the left rotation axis and the right rotation axis may be positions preset by the user and stored in the display control device 1. Further, the image processing unit 12a may move the left rotation axis and the right rotation axis based on the rotation angles θl2 and θr2 of the autostereoscopic display surfaces. For example, the image processing unit 12a may move the left rotation axis closer to the left side of the left autostereoscopic display surface SL as the left rotation angle θl2 increases, and may move the right rotation axis closer to the right side of the right autostereoscopic display surface SR as the right rotation angle θr2 increases.
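The movement of the rotation axis with the rotation angle can be sketched as a linear interpolation between the two sides of the display surface. This is an illustrative policy only; the interpolation shape, the saturation angle, and the coordinate convention are assumptions, not part of the described configuration.

```python
def left_axis_x(theta_l2_deg, left_side_x, right_side_x, theta_max_deg=45.0):
    # Interpolate the left rotation axis from the right side of the left
    # autostereoscopic display surface SL toward its left side as the left
    # rotation angle theta_l2 grows; theta_max_deg is an assumed angle at
    # which the axis reaches the left side.
    t = min(max(theta_l2_deg / theta_max_deg, 0.0), 1.0)
    return right_side_x + t * (left_side_x - right_side_x)
```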
  • the left autostereoscopic display surface SL and the right autostereoscopic display surface SR are both flat. However, the left autostereoscopic display surface SL and the right autostereoscopic display surface SR may be curved as shown in FIG.
• The image processing unit 12a may control the angle θl3 between the left autostereoscopic display surface SL, which is a curved surface, and the reference direction of the left autostereoscopic display surface SL, based on the left camera direction θl1, thereby controlling the rotation of the left autostereoscopic display surface SL.
• The angle θl3 is the angle between the x-axis direction, which is the reference direction, and the line connecting the right side of the curved surface to the left side of the curved surface. The angle θl3 is not limited to this, and may be the angle between the x-axis direction and the tangent to the curved surface at its right side.
• Similarly, the video processing unit 12a may control the angle θr3 between the right autostereoscopic display surface SR, which is a curved surface, and the reference direction of the right autostereoscopic display surface SR, thereby controlling the rotation of the right autostereoscopic display surface SR.
• The angle θr3 is the angle between the x-axis direction, which is the reference direction, and the line connecting the left side of the curved surface to the right side of the curved surface. The angle θr3 is not limited to this, and may be the angle between the x-axis direction and the tangent to the curved surface at its left side.
  • the curved surface may be convex on the back side of the display screen DS of the display device 21 as shown in FIG. 18, or may be convex on the opposite side to the back side of the display screen DS although not shown.
  • the camera direction and the turning angle of the autostereoscopic display surface are the direction and the angle corresponding to the horizontal angle.
• The camera direction and the rotation angle of the autostereoscopic display surface may be, for example, a direction and an angle corresponding to the depression angle, as shown in FIGS. 19 and 20.
• For example, the left camera direction θl1 of the left rear side camera 26a may be a depression angle with respect to the horizontal direction D2.
• The video processing unit 12a may rotate the left autostereoscopic display surface SL by an angle θl2 corresponding to the left camera direction θl1, in the depth direction from the display screen DS of the display device 21, with the lower side of the left autostereoscopic display surface SL as the reference.
• Similarly, the right camera direction of the right rear side camera 26b may be a depression angle with respect to the horizontal direction.
• The video processing unit 12a may rotate the right autostereoscopic display surface SR by an angle corresponding to the right camera direction, in the depth direction from the display screen DS of the display device 21, with the lower side of the right autostereoscopic display surface SR as the reference.
• When the display device 21 produces a display as shown in FIG. 5, that is, when the road 5 within a certain range is displayed, it is thought that the camera direction is often directed downward rather than upward.
• The camera direction and the rotation angle of the autostereoscopic display surface may be a direction and an angle corresponding to both the horizontal angle and the depression angle.
  • the image processing unit 12a may perform control to rotate each of the left autostereoscopic display surface SL and the right autostereoscopic display surface SR based on two axes of the vertical axis and the horizontal axis.
  • the rotation of the autostereoscopic display surface may not necessarily be interlocked with the change in the camera direction.
• The operation device 26c may be configured to receive a dedicated operation for changing only the rotation of the autostereoscopic display surface without changing the camera direction.
• In that case, the video processing unit 12a may rotate the autostereoscopic display surface based on the dedicated operation, without the camera control device 26d changing the camera direction.
  • FIG. 21 is a block diagram showing a configuration of a display control device 1 according to Embodiment 3 of the present invention.
• Components that are the same as or similar to those described above are designated by the same reference numerals, and the following description focuses mainly on the differing components.
• The left rear side wide-angle camera 26e and the right rear side wide-angle camera 26f are provided in the host vehicle.
  • each of the left rear wide-angle camera 26e and the right rear wide-angle camera 26f is not a standard camera but a wide-angle camera.
• FIG. 22 is a view showing a photographing area R1, which is the area that a standard camera having an angle of view of 10 degrees can capture, and a photographing area R2, which is the area that a wide-angle camera having an angle of view of 30 degrees can capture.
  • FIG. 23 is a diagram corresponding to FIG. 22 and is a diagram illustrating an example of an image captured by a wide-angle camera. As shown in FIGS. 22 and 23, the imaging area R2 of the wide-angle camera is wider than the imaging area R1 of a standard camera.
• The operation device 26c receives a cutting operation for cutting out a part of the image captured by the left rear side wide-angle camera 26e.
• The camera control device 26d cuts out a partial image (hereinafter referred to as the "left partial image") from the image captured by the left rear side wide-angle camera 26e, based on the cutting operation received by the operation device 26c.
  • the left partial image is, for example, an image within a broken line frame 27 of FIG.
• The camera control device 26d outputs the left partial image to the display control device 1. Further, the camera control device 26d outputs, to the display control device 1, the range of the left partial image within the entire range of the image captured by the left rear side wide-angle camera 26e.
• Similarly, the operation device 26c receives a cutting operation for cutting out a part of the image captured by the right rear side wide-angle camera 26f.
• The camera control device 26d cuts out a partial image (hereinafter referred to as the "right partial image") from the image captured by the right rear side wide-angle camera 26f, based on the cutting operation received by the operation device 26c. Then, the camera control device 26d outputs the right partial image to the display control device 1. Further, the camera control device 26d outputs, to the display control device 1, the range of the right partial image within the entire range of the image captured by the right rear side wide-angle camera 26f.
  • the image acquisition unit 11a acquires the left partial image from the camera control device 26d as a captured image, and acquires the right partial image from the camera control device 26d as a captured image.
• Based on the range of the left partial image received from the camera control device 26d, the display surface generation information acquiring unit 11b obtains the virtual shooting direction of the left partial image as the shooting direction, that is, a direction related to the photographed image.
• For example, the display surface generation information acquiring unit 11b determines, as the virtual shooting direction of the left partial image, an angle corresponding to the position of the right end of the range of the left partial image relative to the right end of the image captured by the left rear side wide-angle camera 26e.
• The virtual shooting direction of the left partial image is a horizontal angle, as is the left camera direction of the second embodiment.
  • the display surface generation information acquisition unit 11b obtains the virtual shooting direction of the right partial image as a shooting direction that is a direction related to the shot image.
• The video processing unit 12a performs, on the left partial image, the right partial image, the virtual shooting direction of the left partial image, and the virtual shooting direction of the right partial image, the same operation that it performs on the left camera image, the right camera image, the left camera direction, and the right camera direction. That is, in the third embodiment, the virtual shooting direction of the left partial image and the virtual shooting direction of the right partial image are used as the display surface generation information described in the first embodiment.
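The derivation of a virtual shooting direction from the crop range can be sketched as follows. This is an illustrative simplification: a linear pixel-to-angle mapping over the wide camera's 30-degree angle of view (FIG. 22) is assumed, whereas a real lens would need its actual projection model, and the parameter names are assumptions.

```python
def virtual_direction_deg(crop_right_px, image_width_px, fov_deg=30.0):
    # Virtual shooting direction of a partial image cut from a wide-angle
    # frame: the horizontal offset of the crop's right end from the full
    # image's right end, converted to an angle under a linear mapping.
    offset_px = image_width_px - crop_right_px
    return fov_deg * offset_px / image_width_px
```

A crop flush with the right edge of the full frame yields a direction of 0 degrees, and the direction grows as the crop moves leftward.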
• <Summary of Embodiment 3> In the display control device 1 according to the third embodiment described above, a partial image and a virtual shooting direction are used instead of the camera image and the camera direction described in the second embodiment. According to such a configuration, the rotation of the autostereoscopic display surface can be controlled without using the camera directions of the left rear side wide-angle camera 26e and the right rear side wide-angle camera 26f. As a result, hardware such as an electric drive mechanism for changing the direction of the camera becomes unnecessary, so cost reduction can be expected.
  • the virtual photographing direction of the left partial image, the virtual photographing direction of the right partial image, and the rotation angle of the autostereoscopic display surface are the direction and the angle corresponding to the horizontal angle.
• The virtual shooting direction of the left partial image, the virtual shooting direction of the right partial image, and the rotation angle of the autostereoscopic display surface may be a direction and an angle corresponding to the depression angle, as in the fourth and fifth modifications of the second embodiment.
  • FIG. 24 is a block diagram showing a configuration of a display control device 1 according to Embodiment 4 of the present invention.
• Among the constituent elements according to the fourth embodiment, those that are the same as or similar to the constituent elements described above are given the same reference numerals, and the following description focuses mainly on the differing constituent elements.
• In the third embodiment, the camera control device 26d generates the left partial image and the right partial image based on the cutting operation received by the operation device 26c, and outputs the left partial image and the right partial image to the display control device 1.
• In the fourth embodiment, by contrast, the display control device 1 is configured to generate the left partial image and the right partial image based on the cutting operation received by the operation device 26c.
• The display surface generation information acquisition unit 11b acquires the cutting operation received by the operation device 26c.
• The camera control device 26d outputs, to the display control device 1, the image captured by the left rear side wide-angle camera 26e and the image captured by the right rear side wide-angle camera 26f.
• The image acquisition unit 11a acquires the images from the camera control device 26d, that is, the image captured by the left rear side wide-angle camera 26e and the image captured by the right rear side wide-angle camera 26f. Then, based on the cutting operation, acquired by the display surface generation information acquiring unit 11b, for cutting out a part of the image captured by the left rear side wide-angle camera 26e, the image acquisition unit 11a cuts out the left partial image, which is a part of the image captured by the left rear side wide-angle camera 26e. Similarly, based on the cutting operation, acquired by the display surface generation information acquiring unit 11b, for cutting out a part of the image captured by the right rear side wide-angle camera 26f, the image acquisition unit 11a cuts out the right partial image, which is a part of the image captured by the right rear side wide-angle camera 26f. As described above, the image acquisition unit 11a acquires the left partial image and the right partial image as photographed images.
• Based on the range of the left partial image within the entire range of the image captured by the left rear side wide-angle camera 26e, the display surface generation information acquiring unit 11b obtains the virtual shooting direction of the left partial image as the shooting direction. Similarly, based on the range of the right partial image within the entire range of the image captured by the right rear side wide-angle camera 26f, the display surface generation information acquiring unit 11b obtains the virtual shooting direction of the right partial image as the shooting direction, that is, a direction related to the captured image.
  • the video processing unit 12a performs the same operation as the operation of the video processing unit 12a according to the third embodiment. That is, in the fourth embodiment, the virtual shooting direction of the left partial image and the virtual shooting direction of the right partial image are used as the display surface generation information described in the first embodiment.
• The virtual shooting direction of the left partial image, the virtual shooting direction of the right partial image, and the rotation angle of the autostereoscopic display surface may be a direction and an angle corresponding to the horizontal angle, or may be a direction and an angle corresponding to the depression angle.
• <Summary of Embodiment 4> In the display control device 1 according to the fourth embodiment described above, a partial image and a virtual shooting direction are used instead of the camera image and the camera direction described in the second embodiment. According to such a configuration, the rotation of the autostereoscopic display surface can be controlled without using the camera directions of the left rear side wide-angle camera 26e and the right rear side wide-angle camera 26f. As a result, hardware such as an electric drive mechanism for changing the direction of the camera becomes unnecessary, so cost reduction can be expected.
  • the block diagram showing the configuration of the display control device 1 according to the fifth embodiment of the present invention is the same as the block diagram (FIG. 2) of the display control device 1 according to the second embodiment.
• Among the constituent elements according to the fifth embodiment, those that are the same as or similar to the constituent elements described above are given the same reference numerals, and the following description focuses mainly on the differing constituent elements.
• The display surface generation information acquiring unit 11b acquires the speed of the host vehicle from an ECU (Electronic Control Unit) of the host vehicle via an in-vehicle LAN (Local Area Network), neither of which is shown.
  • the image processing unit 12a controls the position of the autostereoscopic display surface in the depth direction of the autostereoscopic display based on the speed of the host vehicle.
• For example, the video processing unit 12a increases the distance zl in the depth direction shown in FIGS. 25 and 26 as the speed of the host vehicle increases, thereby controlling the left autostereoscopic display surface SL to move toward the far side in the depth direction. Similarly, the video processing unit 12a increases the distance zr in the depth direction shown in FIGS. 25 and 26 as the speed of the host vehicle increases, thereby controlling the right autostereoscopic display surface SR to move toward the far side in the depth direction. That is, in the fifth embodiment, the speed of the host vehicle is used as the display surface generation information described in the first embodiment.
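The speed-dependent depth control can be sketched as follows. This is an illustration only: a linear relation with a clamp is one possible choice, and the constants (maximum depth, reference speed) are assumptions not taken from the description.

```python
def depth_distance_mm(speed_kmh, z_max_mm=200.0, v_ref_kmh=100.0):
    # Depth-direction distance zl (or zr) of the autostereoscopic display
    # surface: grows linearly with host-vehicle speed and is clamped at
    # z_max_mm so the surface does not recede without bound.
    return min(z_max_mm, z_max_mm * speed_kmh / v_ref_kmh)
```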
  • the display control apparatus 1 controls the position of the autostereoscopic display surface in the depth direction of the autostereoscopic display based on the display surface generation information. According to such a configuration, it is possible to increase the degree of freedom in display of the autostereoscopic display surface.
  • the display surface generation information is the speed of the host vehicle.
• The display surface generation information may be the position of the autostereoscopic display surface in the depth direction of the autostereoscopic display designated by the user's operation, or, as described later, may be the relative position of a predetermined object with respect to the host vehicle.
  • the contents described in the fifth embodiment may be applied not only to the second embodiment but also to the third and fourth embodiments.
  • FIG. 27 is a block diagram showing a configuration of a display control device 1 according to Embodiment 6 of the present invention.
• Among the constituent elements according to the sixth embodiment, those that are the same as or similar to the constituent elements described above are given the same reference numerals, and the following description focuses mainly on the differing constituent elements.
• The shooting direction of the photographed image is automatically changed based on various information such as the conditions around the host vehicle, and the rotation of the autostereoscopic display surface is automatically changed in conjunction with it.
• The photographed image includes the camera image described in the second embodiment and the partial images described in the third and fourth embodiments, and the shooting direction of the photographed image includes the camera direction described in the second embodiment and the virtual shooting directions of the partial images described in the third and fourth embodiments.
• In the following, the photographed image is the camera image described in the second embodiment, and the shooting direction of the photographed image is described as the camera direction described in the second embodiment.
  • the configuration in FIG. 27 is similar to the configuration in which the image recognition processing unit 12c is added to the configuration in FIG.
  • the concept of the image recognition processing unit 12c is a concept included in the control unit 12 of FIG.
• The image recognition processing unit 12c performs image recognition processing on the captured image acquired by the image acquisition unit 11a to acquire the conditions around the host vehicle from the captured image. For example, when another vehicle around the host vehicle appears in the captured image, the image recognition processing unit 12c obtains the width of the portion of the road, such as a lane, where the other vehicle is located in the captured image, and the image recognition processing unit 12c calculates the distance between the host vehicle and the other vehicle from that width.
  • the camera control device 26d controls the shooting direction of the shot image based on the situation around the host vehicle acquired by the image recognition processing unit 12c.
  • FIG. 28 is a flowchart showing an operation of the display control device 1 according to the sixth embodiment. This operation is started, for example, when the accessory power supply of the host vehicle is turned on or when the drive source of the host vehicle is turned on, and is performed for each of the left and right electronic mirrors.
• In step S11, the display surface generation information acquiring unit 11b acquires the shooting direction of the photographed image.
• In step S12, the image acquisition unit 11a acquires a photographed image.
• In step S13, the image recognition processing unit 12c performs image recognition processing on the captured image acquired by the image acquisition unit 11a to acquire the conditions around the host vehicle from the captured image.
• In step S14, the camera control device 26d determines whether or not to change the shooting direction of the photographed image based on the conditions around the host vehicle acquired by the image recognition processing unit 12c.
• For example, when the conditions around the host vehicle are such that the distance between the host vehicle and another vehicle traveling in the adjacent lane, that is, the lane adjacent to the lane in which the host vehicle is traveling, is less than a predetermined distance (for example, 20 m), the camera control device 26d determines that the shooting direction of the photographed image should be changed. If it is determined that the shooting direction should be changed, the process proceeds to step S15; otherwise, the process proceeds to step S16.
• In step S15, the camera control device 26d changes the shooting direction of the photographed image.
• For example, the camera control device 26d changes the shooting direction of the captured image so that the center of the captured image approaches the other vehicle that is traveling in the adjacent lane at a distance from the host vehicle equal to or less than the predetermined distance. Thereafter, the process proceeds to step S17.
• In step S16, the camera control device 26d maintains the shooting direction of the photographed image in the initial direction. If the shooting direction of the photographed image is not the initial direction, the camera control device 26d returns it to the initial direction. Thereafter, the process proceeds to step S17.
• In step S17, the video processing unit 12a determines the rotation angle of the autostereoscopic display surface based on the shooting direction of the photographed image.
• In step S18, the image processing unit 12a performs image processing on the captured image based on the rotation angle to generate left-eye and right-eye images for displaying the autostereoscopic display surface.
• In step S19, the image output unit 12b outputs, to the display device 21, the video signals of the left-eye and right-eye images for displaying the autostereoscopic display surface. Thereafter, the process returns to step S11.
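The decision of steps S14 to S16 can be sketched as follows. The 20 m threshold comes from the example above; the function signature, the degree units, and the use of `None` for "no vehicle detected" are assumptions introduced for the illustration.

```python
THRESHOLD_M = 20.0  # the predetermined distance of step S14 (example value)

def decide_direction_deg(initial_dir_deg, dist_to_other_m, dir_toward_other_deg):
    # Change the shooting direction only when another vehicle in the
    # adjacent lane is closer than the threshold; otherwise keep (or
    # restore) the initial direction.
    if dist_to_other_m is not None and dist_to_other_m < THRESHOLD_M:
        return dir_toward_other_deg  # S15: the image center approaches the other vehicle
    return initial_dir_deg           # S16: maintain or restore the initial direction
```

The returned direction then feeds step S17, where the rotation angle of the autostereoscopic display surface is determined from it.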
  • the photographing direction of the photographed image is automatically changed based on the situation around the host vehicle. According to such a configuration, it is possible to emphasize and display the target to which the driver should pay attention, so the convenience for the driver can be enhanced.
  • the photographed image is the camera image described in the second embodiment, and the photographing direction of the photographed image is the camera direction described in the second embodiment.
  • the photographed image may be the partial image described in the third and fourth embodiments, and the photographing direction of the photographed image may be the virtual photographing direction of the partial image described in the third and fourth embodiments.
  • the display surface generation information acquisition unit 11b may control the photographing direction of the photographed image based on the situation around the host vehicle acquired by the image recognition processing unit 12c instead of the camera control device 26d.
• In the above, the conditions around the host vehicle were the distance between the host vehicle and the other vehicle. However, the conditions around the host vehicle may be the relative position of the other vehicle with respect to the host vehicle, obtained by a detection device such as an optical sensor, a millimeter wave radar, a high-accuracy image recognition device, or an ultrasonic sensor.
• In the above, the shooting direction of the photographed image is automatically changed based on the conditions around the host vehicle, but the trigger for the change is not limited to this.
• For example, the shooting direction of the photographed image may be automatically changed based on the traveling conditions of the host vehicle, including the speed of the host vehicle. Specifically, as the speed of the host vehicle increases, the shooting direction of the captured image may be changed so that the center of the captured image approaches the rearmost part of the road behind the host vehicle.
• The shooting direction of the photographed image may also be automatically changed based on the shape of the road on which the host vehicle is traveling. For example, when the road on which the host vehicle is traveling is curved, the shooting direction of the captured image may be changed so that the center of the captured image approaches the rearmost part of the road behind the host vehicle.
• The shooting direction of the photographed image may also be automatically changed based on the turning conditions of the host vehicle. For example, the shooting direction of the captured image may be changed so that the center of the captured image moves leftward from directly behind the host vehicle when the host vehicle turns left, and moves rightward from directly behind the host vehicle when the host vehicle turns right.
• In the above, the shooting direction of the photographed image is changed based on any one of the conditions around the host vehicle, the traveling conditions of the host vehicle, the shape of the road on which the host vehicle is traveling, and the turning conditions of the host vehicle. Then, in conjunction with the change of the shooting direction of the photographed image, the rotation angle of the autostereoscopic display surface is also changed.
• However, the shooting direction of the photographed image may be changed based on at least one of the conditions around the host vehicle, the traveling conditions of the host vehicle, the shape of the road on which the host vehicle is traveling, and the turning conditions of the host vehicle. That is, the display surface generation information may be configured to include at least one of the conditions around the host vehicle, the traveling conditions of the host vehicle, the shape of the road on which the host vehicle is traveling, and the turning conditions of the host vehicle.
• When the shooting direction of the photographed image is changed based on at least one of these, the rotation angle of the autostereoscopic display surface may be changed accordingly.
  • the block diagram showing the configuration of the display control device 1 according to the seventh embodiment of the present invention is the same as the block diagram (FIG. 2) of the display control device 1 according to the second embodiment.
• Among the constituent elements according to the seventh embodiment, those that are the same as or similar to the constituent elements described above are given the same reference numerals, and the following description focuses mainly on the differing constituent elements.
  • the autostereoscopic display surface was a single surface.
  • the autostereoscopic display surface is configured to include a plurality of surfaces.
  • rotation of the autostereoscopic display surface as in the second embodiment is not essential.
  • FIG. 29 is a view schematically showing an autostereoscopic display surface according to the seventh embodiment.
• The right autostereoscopic display surface SR is similar to the left autostereoscopic display surface SL.
• The left autostereoscopic display surface SL includes a first surface SL1 inclined from the xz plane, and a second surface SL2 that is connected to the back-side portion of the first surface SL1, is parallel to the xy plane, and is at a distance zl from the display screen DS.
  • the width of the first surface SL1 may be reduced toward the back side.
  • FIG. 30 is a diagram showing a specific display example of the display of FIG.
  • the road 5 behind the host vehicle is mainly displayed on the first surface SL1, and a blue sky, clouds and the like are mainly displayed on the second surface SL2.
  • the boundary between the first surface SL1 and the second surface SL2 may be aligned with the vanishing line of the road 5 based on the image recognition result and the shooting direction of the captured image.
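Aligning the SL1/SL2 boundary with the vanishing line of the road requires locating that line in the image. A minimal sketch under a pinhole camera model is given below; the model, the focal length in pixels, and the pitch convention are assumptions, and a practical system would combine this with the image recognition result mentioned above.

```python
import math

def vanishing_line_row(pitch_down_deg, image_height_px, focal_px):
    # Image row of the road's vanishing line for a camera pitched downward
    # by pitch_down_deg, under a pinhole model with focal length focal_px:
    # the horizon sits focal_px * tan(pitch) above the image center.
    return image_height_px / 2.0 - focal_px * math.tan(math.radians(pitch_down_deg))
```

With zero pitch the vanishing line coincides with the image center row; pitching the camera downward moves it upward in the image (a smaller row index).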
  • the autostereoscopic display surface includes a plurality of surfaces. According to such a configuration, it is possible to make the display of the electronic mirror similar to a landscape seen by the driver with the naked eye.
  • the contents described in the seventh embodiment may be appropriately applied not only to the second embodiment but also to the third to sixth embodiments.
• The image processing unit 12a may determine, based on the attribute of an object, the surface on which the object is to be displayed from among the plurality of surfaces included in the autostereoscopic display surface, and may perform control to display the object on that surface.
  • FIG. 31 is a view schematically showing an autostereoscopic display surface according to the first modification.
• The left autostereoscopic display surface SL includes, in addition to the first surface SL1 and the second surface SL2, a third surface SL3 that is connected to the front-side portion of the first surface SL1 and is parallel to the xy plane.
  • the position of the third surface SL3 is the same as the position of the display screen DS of the display device 21.
  • based on the attribute of the body of the host vehicle 3, the video processing unit 12a determines the third surface SL3, from among the first to third surfaces SL1 to SL3, as the surface on which the body of the host vehicle 3 is to be displayed, and performs control to display the body of the host vehicle 3 on the third surface SL3.
  • the plurality of surfaces included in the left autostereoscopic display surface SL need not be connected to each other.
  • the plurality of surfaces included in the left autostereoscopic display surface SL may be the second surface SL2 and the third surface SL3 which are separated from each other.
  • the display surface generation information may include object information on the relative position of the predetermined object around the host vehicle with respect to the host vehicle.
  • the predetermined objects around the host vehicle are, for example, other vehicles around the host vehicle, features, and the like.
  • the object information may be, for example, the relative position itself of another vehicle in the vicinity of the host vehicle obtained by a periphery detection device provided outside the display control device 1, or it may be the distance between the host vehicle and the other vehicle obtained by image recognition of the images captured by the respective cameras.
  • based on such object information, the image processing unit 12a may control the position, in the depth direction of the autostereoscopic display, of the surface on which the object is displayed among the plurality of surfaces. At this time, the surface on which the object is displayed among the plurality of surfaces may be determined in the same manner as in, for example, the first modification of the seventh embodiment.
  • FIG. 33 is a view schematically showing an autostereoscopic display surface according to the second modification.
  • the left autostereoscopic display surface SL includes a fourth surface SL4 parallel to the xy plane in addition to the first surface SL1 and the second surface SL2.
  • the position of the fourth surface SL4 is different from the position of the display screen DS of the display device 21.
  • FIG. 34 is a diagram showing a specific display example of the display of FIG.
  • the object information is the relative position itself of the other vehicle 6 around the host vehicle.
  • based on the attribute of the other vehicle 6, the video processing unit 12a determines the fourth surface SL4, from among the first, second, and fourth surfaces SL1, SL2, and SL4, as the surface on which the other vehicle 6 is to be displayed.
  • then, as shown in FIG., the video processing unit 12a controls the distance ozl between the fourth surface SL4 and the display screen DS of the display device 21 based on the relative position of the other vehicle 6 with respect to the host vehicle. By performing such control, the image processing unit 12a controls the position of the fourth surface SL4, on which the other vehicle 6 is displayed, in the depth direction of the autostereoscopic display.
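One way to realize this control is a clamped mapping from the other vehicle's relative distance to the depth ozl of the fourth surface. All constants and the linear form below are illustrative assumptions; the patent does not specify the mapping:

```python
def surface_depth_for_vehicle(rel_distance_m, max_distance_m=50.0, max_ozl=1.5):
    """Map the other vehicle's relative distance (metres behind the host
    vehicle) to the depth ozl of the fourth surface SL4 behind the display
    screen DS. Clamped linear mapping with illustrative constants.
    """
    d = min(max(rel_distance_m, 0.0), max_distance_m)  # clamp to [0, max]
    return max_ozl * d / max_distance_m
```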
  • as shown in FIG. 35, the display of FIG. 32 may be combined with the display of FIG. That is, the first surface SL1 may be removed from the left autostereoscopic display surface SL in FIG.
  • the display of FIG. 31 may be combined with the display of FIG. That is, the left autostereoscopic display surface SL may include the first surface SL1 to the fourth surface SL4.
  • FIG. 37 is a view showing a specific display example of the display of FIG.
  • the object is a part of the host vehicle 3, another vehicle 6, or the like. However, as shown in FIG. 38, it may be an alarm mark displayed on the fifth surface SL5 of the left autostereoscopic display surface SL, or may be another display object.
  • when the left rotation angle θ of the left autostereoscopic display surface SL is larger than 0, the left side of the left electronic mirror (the outer side) is at a relatively large distance from the driver compared with the right side of the left electronic mirror (the inner side).
  • as the distance between a person and an object increases, the apparent size of the object to that person decreases.
  • therefore, the video processing unit 12a may distort or clip the captured image IM so that its length in the vertical direction becomes smaller toward the outer side of the captured image IM (from right to left). The image processing unit 12a may then perform control to display the captured image IM thus distorted or clipped on the left autostereoscopic display surface SL whose left rotation angle is greater than 0.
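A possible form of this distortion scales each image column vertically in inverse proportion to its distance from the driver, which grows toward the outer edge of the rotated surface. The geometry below is a simplification with assumed parameters (viewer distance and surface width in arbitrary units):

```python
import math

def column_scale(u, angle_deg, viewer_distance=1.0, surface_width=0.3):
    """Vertical scale factor for the image column at normalized position u
    (u = 0: inner/right edge, u = 1: outer/left edge) of a surface rotated
    by angle_deg about its inner edge. Apparent size falls off as
    1 / distance from the viewer.
    """
    # extra depth of this column relative to the inner edge
    extra = u * surface_width * math.sin(math.radians(angle_deg))
    return viewer_distance / (viewer_distance + extra)
```

With a rotation angle of 0 the factor is 1 everywhere (no distortion); as the angle grows, columns toward the outer edge shrink.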
  • the present modification 5 may be applied to the case where the left autostereoscopic display surface SL includes a plurality of surfaces as shown in FIG.
  • each of the left camera image and the right camera image is a horizontally long image.
  • the video processing unit 12a divides the horizontally long image into a plurality of images arranged in the horizontal direction, and the plurality of images may be distributed and displayed on a plurality of surfaces included in the autostereoscopic display surface.
  • FIG. 41 is a view schematically showing an autostereoscopic display surface according to the sixth modification.
  • the left autostereoscopic display surface SL includes a sixth surface SL6 and a seventh surface SL7.
  • the size of the sixth surface SL6 corresponds to the size of an image captured by a standard camera (hereinafter referred to as the "standard size"), and the total size of the sixth surface SL6 and the seventh surface SL7 corresponds to the size of an image captured by a wide-angle camera.
  • the video processing unit 12a divides the horizontally long image captured by the left rear side camera 26a into an image of the standard size and a remaining image. Then, the video processing unit 12a performs control to display the standard-size image on the sixth surface SL6 and control to display the remaining image on the seventh surface SL7.
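The division into a standard-size image and a remaining image could, for instance, be derived from the two cameras' fields of view. The linear angle-to-column mapping and the FOV values below are assumptions for illustration:

```python
def split_by_fov(total_cols, standard_fov_deg=50.0, wide_fov_deg=120.0):
    """Number of columns of a wide-angle image that correspond to a standard
    camera's field of view (shown on SL6); the rest go to SL7. The linear
    angle-to-column mapping is a simplification of real lens projection.
    """
    standard_cols = round(total_cols * standard_fov_deg / wide_fov_deg)
    return standard_cols, total_cols - standard_cols
```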
  • FIG. 42 is a diagram showing a specific display example of the display of FIG. The image processing unit 12a performs the same control on the horizontally long image captured by the right rear side camera 26b.
  • the driver can distinguish at a glance between the range close to the host vehicle and the range far from the host vehicle, and can drive comfortably.
  • the video processing unit 12a divides the camera image.
  • the video processing unit 12a may divide a composite image including an image of the left rear of the vehicle, an image of the right rear, and an image between the left rear and the right rear.
  • hereinafter, an image between the left rear and the right rear will be referred to as the "directly-behind image".
  • a rear camera 26g capable of capturing the directly-behind image is provided in the host vehicle 3.
  • the camera control device 26d combines the left camera image and the right camera image captured by the left rear side wide-angle camera 26e and the right rear side wide-angle camera 26f with the directly-behind image captured by the rear camera 26g to generate a composite image. Then, the camera control device 26d outputs the generated composite image to the display control device 1.
  • the image acquisition unit 11a acquires a composite image from the camera control device 26d as a captured image.
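The composition performed by the camera control device 26d can be sketched as a row-wise concatenation of the three camera images. The list-of-rows representation is a simplification; real implementations would operate on pixel buffers:

```python
def compose_rear_view(left_img, behind_img, right_img):
    """Join the left camera image, the directly-behind image, and the right
    camera image row by row into one horizontally long composite image.
    Each image is a list of equal-height pixel rows.
    """
    if not (len(left_img) == len(behind_img) == len(right_img)):
        raise ValueError("all images must have the same height")
    return [l + b + r for l, b, r in zip(left_img, behind_img, right_img)]
```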
  • FIG. 44 and FIG. 45 are diagrams schematically showing an autostereoscopic display surface according to the seventh modification.
  • the autostereoscopic display surface ST1 corresponding to the maximum range that can be displayed by the display device 21 includes the left autostereoscopic display surface SL, the right autostereoscopic display surface SR, and a central autostereoscopic display surface SC connected to the left autostereoscopic display surface SL and the right autostereoscopic display surface SR.
  • the respective sizes of the left autostereoscopic display surface SL and the right autostereoscopic display surface SR correspond to the sizes of the images captured by the left rear side wide-angle camera 26e and the right rear side wide-angle camera 26f.
  • the image processing unit 12a divides the horizontally long composite image acquired by the image acquisition unit 11a into a left camera image (an image of the left rear of the host vehicle), a right camera image (an image of the right rear of the host vehicle), and a directly-behind image (an image between the left rear and the right rear). Then, the image processing unit 12a performs control to display the left camera image on the left autostereoscopic display surface SL, control to display the right camera image on the right autostereoscopic display surface SR, and control to display the directly-behind image on the central autostereoscopic display surface SC.
  • the video processing unit 12a does not have to perform control to display all of the autostereoscopic display surface ST1 corresponding to the maximum range that can be displayed by the display device 21 on the display device 21.
  • the video processing unit 12a may perform control to display a partial autostereoscopic display surface ST2, which is a part of the autostereoscopic display surface ST1, on the display device 21 based on an operation received by the operation device 26c.
  • the shape of the autostereoscopic display surface ST1 in top view is, as shown in FIG. 44, a shape in which the central autostereoscopic display surface SC is brought in front of the left autostereoscopic display surface SL and the right autostereoscopic display surface SR; this shape may be reversed in the z-axis direction. Further, the shape of the autostereoscopic display surface ST1 in top view may be a V shape or a U shape. In addition, the left camera image and the right camera image may be displayed on the autostereoscopic display surface ST1 having the V shape or the U shape without displaying the directly-behind image.
  • hereinafter, the acquisition unit 11 and the control unit 12 of FIG. 1 described above will be collectively referred to as the "acquisition unit 11 and the like".
  • the acquisition unit 11 and the like are realized by the processing circuit 81 shown in FIG. That is, the processing circuit 81 includes an acquisition unit that acquires one captured image around the host vehicle and display surface generation information for generating the autostereoscopic display surface, and a control unit that performs control to display the captured image on the autostereoscopic display surface based on the captured image and the display surface generation information acquired by the acquisition unit.
  • Dedicated hardware may be applied to the processing circuit 81, or a processor that executes a program stored in a memory may be applied.
  • the processor corresponds to, for example, a central processing unit, a processing unit, an arithmetic unit, a microprocessor, a microcomputer, a DSP (Digital Signal Processor), and the like.
  • as the processing circuit 81, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof is applicable.
  • each function of each unit such as the acquisition unit 11 may be realized by distributed processing circuits, or the functions of the units may be collectively realized by one processing circuit.
  • when the processing circuit 81 is a processor, the functions of the acquisition unit 11 and the like are realized by a combination with software and the like.
  • the software and the like correspond to, for example, software, firmware, or software and firmware.
  • software and the like are described as a program and stored in the memory 83. As shown in FIG. 47, the processor 82 applied to the processing circuit 81 reads out and executes the program stored in the memory 83, thereby realizing the function of each unit.
  • that is, the display control device 1 includes the memory 83 for storing a program that, when executed by the processing circuit 81, results in the execution of acquiring one captured image around the host vehicle and display surface generation information for generating the autostereoscopic display surface, and of controlling the display of the captured image on the autostereoscopic display surface based on the captured image and the display surface generation information.
  • this program causes a computer to execute the procedure and method of the acquisition unit 11 and the like.
  • the memory 83 may be, for example, a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), an HDD (Hard Disk Drive), a magnetic disk, a flexible disk, an optical disk, a compact disc, a mini disc, a DVD (Digital Versatile Disc), a drive device thereof, or any storage medium to be used in the future.
  • the configuration is not limited to this; a part of the acquisition unit 11 and the like may be realized by dedicated hardware, and another part may be realized by software or the like.
  • for example, the function of the acquisition unit 11 can be realized by the processing circuit 81 as dedicated hardware, such as a receiver, and the functions of the remaining units can be realized by the processing circuit 81 as the processor 82 reading out and executing the program stored in the memory 83.
  • the processing circuit 81 can realize each of the functions described above by hardware, software, etc., or a combination thereof.
  • the display control device 1 described above can also be applied to a display control system constructed as a system by appropriately combining a navigation device such as a portable navigation device (PND), a communication terminal including a portable terminal such as a mobile phone, a smartphone, or a tablet, the functions of an application installed in at least one of the navigation device and the communication terminal, and a server.
  • each function or each component of the display control device 1 described above may be distributed to each of the devices constructing the system, or may be arranged centrally to any of the devices.
  • the display control device may be incorporated into the display device 21 of FIG.
  • FIG. 48 is a block diagram showing a configuration of the server 91 according to the present modification.
  • the server 91 of FIG. 48 includes a communication unit 91a and a control unit 91b, and can perform wireless communication with the navigation device 93 of the vehicle 92.
  • the communication unit 91a, which is an acquisition unit, performs wireless communication with the navigation device 93 to receive one captured image around the vehicle 92 and display surface generation information.
  • the control unit 91b has a function similar to that of the control unit 12 of FIG. 1, realized by a processor (not shown) of the server 91 executing a program stored in a memory (not shown) of the server 91. That is, the control unit 91b generates a control signal for performing control to display the captured image on the autostereoscopic display surface based on the captured image and the display surface generation information received by the communication unit 91a.
  • the communication unit 91a transmits the control signal to the navigation device 93. According to the server 91 configured as described above, the same effects as the display control device 1 described in the first embodiment can be obtained.
  • FIG. 49 is a block diagram showing a configuration of communication terminal 96 according to the present modification.
  • the communication terminal 96 in FIG. 49 includes a communication unit 96a similar to the communication unit 91a and a control unit 96b similar to the control unit 91b, and can perform wireless communication with the navigation device 98 of the vehicle 97.
  • as the communication terminal 96, a portable terminal carried by the driver of the vehicle 97, such as a mobile phone, a smartphone, or a tablet, is applied.
  • according to the communication terminal 96 configured as described above, the same effects as those of the display control device 1 described in the first embodiment can be obtained.
  • in the present invention, the embodiments and the modifications can be freely combined, and each embodiment and each modification can be appropriately modified or omitted.
  • Reference Signs List 1 display control device, 3 own vehicle, 11 acquisition unit, 12 control unit, 21 display device, SL left autostereoscopic display surface, SR right autostereoscopic display surface.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The purpose of the invention is to provide a technique capable of reducing the number of cameras used for implementing a naked-eye stereoscopic vision display. A display control device 1 controls a display device 21 capable of implementing a naked-eye stereoscopic vision display by displaying left-eye and right-eye images. The display control device 1 is provided with an acquisition unit 11 and a control unit 12. The acquisition unit 11 acquires one captured image of the vehicle periphery and also acquires display plane generation information that is used for generating a naked-eye stereoscopic vision display plane. The control unit 12 implements, on the basis of the captured image and the display plane generation information acquired by the acquisition unit 11, a control for displaying the captured image on the naked-eye stereoscopic vision display plane.

Description

Display control device and display control method
 The present invention relates to a display control device that controls a display device, and to a display control method.
 In recent years, electronic mirrors have been proposed that present the driver with the conditions on the left and right rear sides of a vehicle in place of door mirrors. In an electronic mirror, the left and right rear sides of the vehicle are photographed by cameras, and the images obtained by the photographing are horizontally reversed and displayed on a display device. For example, Patent Document 1 proposes an electronic mirror that displays the situation behind the vehicle's travel lane and an adjacent lane.
JP 2015-144407 A
 Aside from the above, an autostereoscopic display has been proposed that displays an image with a sense of depth, that is, a sense of perspective, by displaying left-eye and right-eye images that cause parallax between the user's left eye and right eye. If this autostereoscopic display is applied to an electronic mirror, the user can visually recognize an electronic mirror that resembles the stereoscopic surface of a real door mirror.
 However, in such a configuration, a total of four images are required: right-eye and left-eye images for the autostereoscopic display of the left electronic mirror, and right-eye and left-eye images for the autostereoscopic display of the right electronic mirror. For this reason, in a configuration using four cameras that respectively capture the four images, the hardware cost is relatively high, and the power consumption is relatively high because the four cameras operate.
 The present invention has been made in view of the above problems, and an object of the present invention is to provide a technique capable of reducing the number of cameras for performing autostereoscopic display.
 A display control device according to the present invention is a display control device that controls a display device capable of autostereoscopic display by displaying images for the left eye and the right eye, and includes: an acquisition unit that acquires one captured image of the vehicle periphery and display surface generation information for generating an autostereoscopic display surface, which is a surface on the autostereoscopic display; and a control unit that performs control to display the captured image on the autostereoscopic display surface based on the captured image and the display surface generation information acquired by the acquisition unit.
 According to the present invention, control is performed to display the captured image on the autostereoscopic display surface based on one captured image and the display surface generation information. With such a configuration, the number of cameras for performing autostereoscopic display can be reduced.
 The objects, features, aspects, and advantages of the present invention will become more apparent from the following detailed description and the accompanying drawings.
FIG. 1 is a block diagram showing a configuration of a display control device according to Embodiment 1.
FIG. 2 is a block diagram showing a configuration of a display control device according to Embodiment 2.
FIG. 3 is a diagram for describing a camera direction according to Embodiment 2.
FIG. 4 is a diagram for describing an example of mirror adjustment.
FIG. 5 is a diagram for describing an example of mirror adjustment.
FIG. 6 is a perspective view schematically showing an autostereoscopic display surface according to Embodiment 2.
FIG. 7 is a perspective view schematically showing an autostereoscopic display surface according to Embodiment 2.
FIG. 8 is a diagram showing a display example of the autostereoscopic display surface according to Embodiment 2.
FIG. 9 is a perspective view schematically showing an autostereoscopic display surface according to Embodiment 2.
FIG. 10 is a diagram showing a display example of the autostereoscopic display surface according to Embodiment 2.
FIG. 11 is a flowchart showing the operation of the display control device according to Embodiment 2.
FIG. 12 is a diagram showing the relationship between the rotation angle of the autostereoscopic display surface and the camera direction.
FIG. 13 is a diagram showing the relationship between the rotation angle of the autostereoscopic display surface and the camera direction.
FIG. 14 is a diagram showing the relationship between the rotation angle of the autostereoscopic display surface and the camera direction.
FIG. 15 is a diagram showing the relationship between the rotation angle of the autostereoscopic display surface and the camera direction.
FIG. 16 is a perspective view schematically showing an autostereoscopic display surface according to Modification 2 of Embodiment 2.
FIG. 17 is a perspective view schematically showing an autostereoscopic display surface according to Modification 2 of Embodiment 2.
FIG. 18 is a perspective view schematically showing an autostereoscopic display surface according to Modification 3 of Embodiment 2.
FIG. 19 is a diagram for describing a camera direction according to Modification 4 of Embodiment 2.
FIG. 20 is a perspective view schematically showing an autostereoscopic display surface according to Modification 4 of Embodiment 2.
FIG. 21 is a block diagram showing a configuration of a display control device according to Embodiment 3.
FIG. 22 is a diagram showing the shooting areas of a standard camera and a wide-angle camera.
FIG. 23 is a diagram showing an example of an image captured by a wide-angle camera.
FIG. 24 is a block diagram showing a configuration of a display control device according to Embodiment 4.
FIG. 25 is a perspective view schematically showing an autostereoscopic display surface according to Embodiment 5.
FIG. 26 is a perspective view schematically showing an autostereoscopic display surface according to Embodiment 5.
FIG. 27 is a block diagram showing a configuration of a display control device according to Embodiment 6.
FIG. 28 is a flowchart showing the operation of the display control device according to Embodiment 6.
FIG. 29 is a perspective view schematically showing an autostereoscopic display surface according to Embodiment 7.
FIG. 30 is a diagram showing a display example of the autostereoscopic display surface according to Embodiment 7.
FIG. 31 is a perspective view schematically showing an autostereoscopic display surface according to Modification 1 of Embodiment 7.
FIG. 32 is a perspective view schematically showing an autostereoscopic display surface according to Modification 1 of Embodiment 7.
FIG. 33 is a perspective view schematically showing an autostereoscopic display surface according to Modification 2 of Embodiment 7.
FIG. 34 is a diagram showing a display example of the autostereoscopic display surface according to Modification 2 of Embodiment 7.
FIG. 35 is a perspective view schematically showing an autostereoscopic display surface according to Modification 3 of Embodiment 7.
FIG. 36 is a perspective view schematically showing an autostereoscopic display surface according to Modification 3 of Embodiment 7.
FIG. 37 is a diagram showing a display example of the autostereoscopic display surface according to Modification 3 of Embodiment 7.
FIG. 38 is a perspective view schematically showing an autostereoscopic display surface according to Modification 4 of Embodiment 7.
FIG. 39 is a perspective view schematically showing an autostereoscopic display surface according to Modification 5 of Embodiment 7.
FIG. 40 is a perspective view schematically showing an autostereoscopic display surface according to Modification 5 of Embodiment 7.
FIG. 41 is a perspective view schematically showing an autostereoscopic display surface according to Modification 6 of Embodiment 7.
FIG. 42 is a diagram showing a display example of the autostereoscopic display surface according to Modification 6 of Embodiment 7.
FIG. 43 is a perspective view schematically showing an autostereoscopic display surface according to Modification 7 of Embodiment 7.
FIG. 44 is a top view schematically showing the autostereoscopic display surface according to Modification 7 of Embodiment 7.
FIG. 45 is a front view schematically showing the autostereoscopic display surface according to Modification 7 of Embodiment 7.
FIG. 46 is a block diagram showing a hardware configuration of a display control device according to another modification.
FIG. 47 is a block diagram showing a hardware configuration of a display control device according to another modification.
FIG. 48 is a block diagram showing a configuration of a server according to another modification.
FIG. 49 is a block diagram showing a configuration of a communication terminal according to another modification.
 <Embodiment 1>
 Hereinafter, a vehicle that is equipped with the display control device according to Embodiment 1 of the present invention and is the subject of attention will be referred to as the "host vehicle".
 FIG. 1 is a block diagram showing the configuration of the display control device 1 according to the first embodiment. The display control device 1 of FIG. 1 includes an acquisition unit 11 and a control unit 12, and is capable of controlling a display device 21.
 The display device 21 is disposed, for example, on an instrument panel or a dashboard of the host vehicle. The display device 21 is capable of autostereoscopic display by displaying images for the left eye and the right eye. The image for the left eye is an image that is visible to the user's left eye but not to the right eye, and the image for the right eye is an image that is visible to the user's right eye but not to the left eye. For displaying such left-eye and right-eye images, for example, a polarization filter method or a liquid crystal shutter method is used.
 The image for the left eye and the image for the right eye are partially different from each other to the extent that they cause parallax between the user's left eye and right eye, and this parallax allows the user to perceive a stereoscopic image.
The acquisition unit 11 acquires one captured image and display surface generation information. The one captured image is an image used for display on the display device 21, and is one image of the surroundings of the host vehicle, such as one image behind the host vehicle, one image to the side of the host vehicle, or one image ahead of the host vehicle. When a door mirror display is performed, the captured image is a mirror-display image obtained by horizontally flipping an image of the area behind the host vehicle. The captured image may be the whole of an image captured by a camera or the like, or may be a part of that image. The display surface generation information is information for generating an autostereoscopic display surface, which is a surface in the autostereoscopic display. Specific examples of the display surface generation information are described in Embodiment 2 and later. The acquisition unit 11 may be a camera, or may be any of various interfaces.
The control unit 12 performs control to display the captured image on the autostereoscopic display surface, based on the one captured image acquired by the acquisition unit 11 and the display surface generation information acquired by the acquisition unit 11.
For example, the control unit 12 generates, as the left-eye image, an image in which the portion of the one captured image corresponding to the autostereoscopic display surface is shifted in one horizontal direction, and generates, as the right-eye image, an image in which that portion is shifted in the direction opposite to the one direction. As a result, the degree of parallax the user experiences for the portion corresponding to the autostereoscopic display surface differs from the degree of parallax for the rest of the image. Consequently, to the user, the portion corresponding to the autostereoscopic display surface appears at a different position in the depth direction from the rest of the image, so that the autostereoscopic display surface appears stereoscopic.
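The shift-based generation of the left-eye and right-eye images described above can be sketched as follows. This is a minimal illustration only: it assumes the captured image is a 2-D grid of pixel values, the portion corresponding to the autostereoscopic display surface is an axis-aligned rectangle, and the disparity is a single pixel offset; the function and parameter names are hypothetical and not from the original.

```python
def shifted_copy(image, region, shift):
    """Copy `image` and move the pixels inside `region` = (x0, y0, x1, y1)
    horizontally by `shift` pixels; pixels outside the region are unchanged."""
    height, width = len(image), len(image[0])
    x0, y0, x1, y1 = region
    out = [row[:] for row in image]
    for y in range(y0, y1):
        for x in range(x0, x1):
            nx = x + shift
            if 0 <= nx < width:
                out[y][nx] = image[y][x]
    return out

def make_stereo_pair(image, region, disparity):
    """Shift the region one way for the left-eye image and the opposite
    way for the right-eye image, producing parallax only in that region."""
    left_eye = shifted_copy(image, region, +disparity)
    right_eye = shifted_copy(image, region, -disparity)
    return left_eye, right_eye

# Only the region corresponding to the display surface is offset between
# the two images, so only that region is perceived at a different depth.
image = [[0, 1, 2, 3, 4, 5]]
left_eye, right_eye = make_stereo_pair(image, region=(2, 0, 4, 1), disparity=1)
```

Because the offset is confined to the region corresponding to the autostereoscopic display surface, the viewer's parallax differs there from the surrounding picture, which is what makes that surface appear at a different depth.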
For example, the control unit 12 determines, based on the display surface generation information, whether to generate an autostereoscopic display surface that is tilted from the display screen of the display device 21. When the control unit 12 determines that a tilted autostereoscopic display surface is to be generated, it changes the magnitude of the above shift based on the tilt of the autostereoscopic display surface. The control unit 12 performs control to cause the display device 21 to display the left-eye and right-eye images obtained by the above processing.
<Summary of Embodiment 1>
The display control device 1 according to Embodiment 1 described above performs control to display one captured image on an autostereoscopic display surface, based on that one captured image and the display surface generation information. With this configuration, one electronic mirror or the like can be displayed autostereoscopically from one captured image. Therefore, using two cameras that acquire a captured image of the left rear side and a captured image of the right rear side of the host vehicle, the left and right electronic mirrors can be displayed autostereoscopically. This reduces the number of cameras required for autostereoscopic display of the left and right electronic mirrors, and thus reduces hardware cost and power consumption.
In addition, when four cameras are used, every part of the images captured by the cameras is displayed autostereoscopically, which places a comparatively large burden on the user's eyes. In contrast, in Embodiment 1, only the autostereoscopic display surface, that is, a part of the image captured by the camera, is displayed autostereoscopically, so an effect of reducing the burden on the user's eyes can also be expected.
<Embodiment 2>
FIG. 2 is a block diagram showing the configuration of a display control device 1 according to Embodiment 2 of the present invention. In the following, among the constituent elements of Embodiment 2, those that are the same as or similar to the constituent elements described above are given the same reference signs, and the description focuses mainly on the differing constituent elements.
The display control device 1 of FIG. 2 is connected to a left autostereoscopic display device 21a, a right autostereoscopic display device 21b, and a camera control device 26d.
Each of the left autostereoscopic display device 21a and the right autostereoscopic display device 21b corresponds to the display device 21 of FIG. 1. The left autostereoscopic display device 21a displays left-eye and right-eye images for autostereoscopic display of a left electronic mirror, and the right autostereoscopic display device 21b displays left-eye and right-eye images for autostereoscopic display of a right electronic mirror. Note that a single autostereoscopic display device may be used instead of the left autostereoscopic display device 21a and the right autostereoscopic display device 21b. In the following, when the left autostereoscopic display device 21a and the right autostereoscopic display device 21b need not be distinguished, they may be referred to collectively as the "display device 21".
The camera control device 26d is connected to a left rear side camera 26a, a right rear side camera 26b, and an operation device 26c.
The left rear side camera 26a is a camera that captures the left rear side of the host vehicle, and is provided on the host vehicle in place of the left door mirror. The right rear side camera 26b is a camera that captures the right rear side of the host vehicle, and is provided on the host vehicle in place of the right door mirror. In Embodiment 2, each of the left rear side camera 26a and the right rear side camera 26b is described as being neither a panoramic camera nor a wide-angle camera, but a standard camera that captures images with an angle of view of about 10 degrees.
In the following description, a mirror-display image based on an image captured by the left rear side camera 26a may be referred to as a "left camera image", a mirror-display image based on an image captured by the right rear side camera 26b may be referred to as a "right camera image", and when the left camera image and the right camera image need not be distinguished, they may be referred to as "camera images".
Also, in the following description, the direction in which the left rear side camera 26a captures images may be referred to as the "left camera direction", the direction in which the right rear side camera 26b captures images may be referred to as the "right camera direction", and when the left camera direction and the right camera direction need not be distinguished, they may be referred to as the "camera direction".
FIG. 3 is a diagram showing an example of the left camera direction and the right camera direction for the host vehicle 3. In Embodiment 2, the reference direction for the left camera direction is the directly rearward direction D1 of the host vehicle 3, and the left camera direction is a direction that forms an angle θl1 with the directly rearward direction D1. Likewise, in Embodiment 2, the reference direction for the right camera direction is the directly rearward direction D1 of the host vehicle 3, and the right camera direction is a direction that forms an angle θr1 with the directly rearward direction D1. However, the reference direction of the camera direction may be, for example, the standard adjusted direction described later with reference to FIGS. 4 and 5, or may be a direction preset by the user and stored in the display control device 1.
Since the left camera direction and the angle θl1 are in one-to-one correspondence, in the following description the left camera direction and the angle θl1 are treated as identical and may be written as the left camera direction θl1. Similarly, since the right camera direction and the angle θr1 are in one-to-one correspondence, the right camera direction and the angle θr1 are treated as identical and may be written as the right camera direction θr1.
The operation device 26c of FIG. 2 is, for example, an operation switch, and accepts operations for individually changing the left camera direction and the right camera direction, and operations for individually changing the angles of view of the left rear side camera 26a and the right rear side camera 26b.
Based on the operations accepted by the operation device 26c, the camera control device 26d performs control to change the camera direction, angle of view, and focus of each of the left rear side camera 26a and the right rear side camera 26b as appropriate. With this configuration, the ranges captured by the left rear side camera 26a and the right rear side camera 26b can be adjusted as appropriate by performing the corresponding operations on the operation device 26c.
FIGS. 4 and 5 are diagrams showing an example of adjusting the mirror image of a standard door mirror. Although a general example of adjusting a door mirror is described here, the ranges captured by the left rear side camera 26a and the right rear side camera 26b are adjusted in the same way as the door mirror.
As shown in FIG. 4, the door mirror is adjusted so that the driver can see a range extending 30 m behind the rear of the host vehicle 3 and 5 m wide outward from the side of the host vehicle. As shown in FIG. 5, the right door mirror 4 is adjusted so that the body of the host vehicle 3 is visible to the driver within about 1/4 of the horizontal length of the mirror surface 4a from the left edge of the mirror surface 4a, and the road 5 is visible to the driver within about 1/3 of the vertical length of the mirror surface 4a from the bottom edge of the mirror surface 4a. The left door mirror is adjusted like the right door mirror with left and right reversed. The recommended adjustment for the left and right door mirrors differs depending on the mirror side, country-specific recommended values, and the size and type of the door mirrors.
Returning to FIG. 2, the camera control device 26d not only controls the left rear side camera 26a and the right rear side camera 26b as described above, but also outputs to the display control device 1 the left camera image and the right camera image captured by the left rear side camera 26a and the right rear side camera 26b, together with the left camera direction and the right camera direction of those cameras.
Next, the internal configuration of the display control device 1 of FIG. 2 is described. The display control device 1 includes an image acquisition unit 11a, a display surface generation information acquisition unit 11b, a video processing unit 12a, and an image output unit 12b. The image acquisition unit 11a and the display surface generation information acquisition unit 11b are concepts included in the acquisition unit 11 of FIG. 1, and the video processing unit 12a and the image output unit 12b are concepts included in the control unit 12 of FIG. 1.
The image acquisition unit 11a acquires the camera images from the camera control device 26d as captured images.
The display surface generation information acquisition unit 11b acquires the camera direction from the camera control device 26d as the capturing direction, that is, the direction associated with the captured image.
The video processing unit 12a generates an autostereoscopic display surface based on the camera image, which is the captured image acquired by the image acquisition unit 11a, and the camera direction, which is the capturing direction acquired by the display surface generation information acquisition unit 11b. That is, in Embodiment 2, the video processing unit 12a uses the camera direction as the display surface generation information described in Embodiment 1.
FIG. 6 is a diagram showing the autostereoscopic display surfaces generated by the video processing unit 12a according to Embodiment 2. In the following description, the autostereoscopic display surface corresponding to the left camera image may be referred to as the "left autostereoscopic display surface", and the autostereoscopic display surface corresponding to the right camera image may be referred to as the "right autostereoscopic display surface".
As shown in FIG. 6, the left autostereoscopic display surface SL and the right autostereoscopic display surface SR are both flat. For the left autostereoscopic display surface SL, xyz axes are defined with respect to the upward, leftward, and depth directions of the display screen of the display device 21, and for the right autostereoscopic display surface SR, xyz axes are defined with respect to the upward, rightward, and depth directions of the display screen of the display device 21.
The video processing unit 12a according to Embodiment 2 controls the rotation of the left autostereoscopic display surface SL about a predetermined axis based on the left camera direction θl1. Specifically, the video processing unit 12a uses the right side of the left autostereoscopic display surface SL as the predetermined axis, and rotates the left autostereoscopic display surface SL about that axis, from the display screen DS of the display device 21 into its depth direction, by an angle θl2 equal to the left camera direction θl1. In the following description, the angle θl2 defining the rotation of the left autostereoscopic display surface SL may be referred to as the "left rotation angle θl2".
Similarly, the video processing unit 12a according to Embodiment 2 controls the rotation of the right autostereoscopic display surface SR about a predetermined axis based on the right camera direction θr1. Specifically, the video processing unit 12a uses the left side of the right autostereoscopic display surface SR as the predetermined axis, and rotates the right autostereoscopic display surface SR about that axis, from the display screen DS of the display device 21 into its depth direction, by an angle θr2 equal to the right camera direction θr1. In the following description, the angle θr2 defining the rotation of the right autostereoscopic display surface SR may be referred to as the "right rotation angle θr2", and when the left rotation angle θl2 and the right rotation angle θr2 need not be distinguished, they may be referred to as the "rotation angle".
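Rotating a flat display surface about one vertical edge means that columns farther from the pivot axis lie deeper behind the screen, so the parallax shift applied to them must grow accordingly. The sketch below illustrates one way a per-column shift could be derived from the rotation angle; the proportionality constant `gain` and the depth-proportional-disparity model are assumptions for illustration only, since the document does not give a concrete formula.

```python
import math

def column_shifts(width_px, pivot_x, rotation_deg, gain=0.05):
    """Per-column horizontal shift (in pixels) for a display surface
    rotated by `rotation_deg` about a vertical axis at column `pivot_x`.
    A column's depth behind the screen is taken as its distance from
    the pivot times tan(rotation), and the shift as proportional to
    that depth (proportionality `gain` is an assumed constant)."""
    t = math.tan(math.radians(rotation_deg))
    return [gain * abs(x - pivot_x) * t for x in range(width_px)]

# With no rotation every shift is zero; with rotation the shift grows
# with distance from the pivot (here the right edge, as for surface SL).
flat = column_shifts(width_px=5, pivot_x=4, rotation_deg=0)
tilted = column_shifts(width_px=5, pivot_x=4, rotation_deg=45)
```

The zero-shift column at the pivot stays in the plane of the display screen DS, matching the behavior of FIGS. 6 and 9 where the surface meets the screen at its rotation axis.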
FIG. 7 is a diagram schematically showing the left autostereoscopic display surface SL and the right autostereoscopic display surface SR according to Embodiment 2 in the case where left camera direction θl1 = left rotation angle θl2 = 0 degrees and right camera direction θr1 = right rotation angle θr2 = 0 degrees. In this case, the normal directions of the left autostereoscopic display surface SL and the right autostereoscopic display surface SR coincide with the direction perpendicular to the display screen DS of the display device 21.
FIG. 8 is a diagram showing a specific display example of the display of FIG. 7. In the example of FIG. 8, the left autostereoscopic display device 21a and the right autostereoscopic display device 21b are provided on either side of a meter region 7 in which meters are provided, and the left autostereoscopic display surface SL and the right autostereoscopic display surface SR are displayed as the left electronic mirror and the right electronic mirror, respectively. The body of the host vehicle 3, the road 5, and another vehicle 6 are displayed on the left autostereoscopic display surface SL, and the body of the host vehicle 3 and the road 5 are displayed on the right autostereoscopic display surface SR.
FIG. 9 is a diagram schematically showing the left autostereoscopic display surface SL and the right autostereoscopic display surface SR according to Embodiment 2 in the case where left camera direction θl1 = left rotation angle θl2 ≠ 0 degrees and right camera direction θr1 = right rotation angle θr2 ≠ 0 degrees. In this case, the normal directions of the left autostereoscopic display surface SL and the right autostereoscopic display surface SR are angled outward from the direction perpendicular to the display screen DS of the display device 21 by the left rotation angle θl2 and the right rotation angle θr2, respectively.
FIG. 10 is a diagram showing a specific display example of the display of FIG. 9. In the example of FIG. 10, the state of FIG. 8 has been changed to a state in which left camera direction θl1 = left rotation angle θl2 > 0 degrees and right camera direction θr1 = right rotation angle θr2 > 0 degrees. Further, since the left camera direction θl1 is sufficiently larger than 0 degrees, the body of the host vehicle 3 is no longer displayed on the left autostereoscopic display device 21a, and since the right camera direction θr1 is sufficiently larger than 0 degrees, the body of the host vehicle 3 is no longer displayed on the right autostereoscopic display device 21b.
Note that the image on the left side, which corresponds to the outer side of the left electronic mirror, is located farther back than the image on the right side, which corresponds to the inner side of the left electronic mirror. In consideration of this, as shown in FIG. 10, the right side, described as the outer edge of the left electronic mirror, may be displayed shorter than the left side, described as its inner edge. Similarly, as shown in FIG. 10, the left side, described as the outer edge of the right electronic mirror, may be displayed shorter than the right side, described as its inner edge. This is described in detail in Variation 5 of Embodiment 7. Since the display control device 1 according to Embodiment 2 can perform the displays of FIGS. 7 to 10 as described above, it can display electronic mirrors that resemble real door mirrors.
The image output unit 12b of FIG. 2 outputs to the left autostereoscopic display device 21a the video signals of the left-eye and right-eye images for displaying the left autostereoscopic display surface SL generated by the video processing unit 12a. The image output unit 12b also outputs to the right autostereoscopic display device 21b the video signals of the left-eye and right-eye images for displaying the right autostereoscopic display surface SR generated by the video processing unit 12a.
<Operation>
FIG. 11 is a flowchart showing the operation of the display control device 1 according to Embodiment 2. This operation is started, for example, when the accessory power supply of the host vehicle is turned on, or when the drive source of the host vehicle is turned on.
First, in step S1, the display surface generation information acquisition unit 11b acquires the left camera direction θl1 and the right camera direction θr1.
In step S2, the video processing unit 12a determines the left rotation angle θl2 of the left autostereoscopic display surface SL based on the left camera direction θl1. Similarly, the video processing unit 12a determines the right rotation angle θr2 of the right autostereoscopic display surface SR based on the right camera direction θr1.
In step S3, the image acquisition unit 11a acquires the left camera image and the right camera image.
In step S4, the video processing unit 12a performs image processing on the left camera image based on the left rotation angle θl2 to generate the left-eye and right-eye images for displaying the left autostereoscopic display surface SL. Similarly, the video processing unit 12a performs image processing on the right camera image based on the right rotation angle θr2 to generate the left-eye and right-eye images for displaying the right autostereoscopic display surface SR.
In step S5, the image output unit 12b outputs the video signals of the left-eye and right-eye images for displaying the left autostereoscopic display surface SL to the left autostereoscopic display device 21a, and outputs the video signals of the left-eye and right-eye images for displaying the right autostereoscopic display surface SR to the right autostereoscopic display device 21b. Thereafter, the process returns to step S1.
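The loop of steps S1 to S5 can be outlined as below. To keep the sketch self-contained, the hardware-facing operations are passed in as callables; all names are illustrative and not from the original.

```python
def display_cycle(get_directions, rotation_angle, get_images, render, output):
    """One iteration of the FIG. 11 flowchart (steps S1 to S5)."""
    theta_l1, theta_r1 = get_directions()        # S1: camera directions
    theta_l2 = rotation_angle(theta_l1)          # S2: rotation angles of
    theta_r2 = rotation_angle(theta_r1)          #     the display surfaces
    left_img, right_img = get_images()           # S3: camera images
    left_pair = render(left_img, theta_l2)       # S4: left/right-eye pairs
    right_pair = render(right_img, theta_r2)
    output(left_pair, right_pair)                # S5: video signals out

# Exercising one cycle with stand-in callables:
frames = []
display_cycle(
    get_directions=lambda: (10.0, 20.0),
    rotation_angle=lambda theta: theta,          # Embodiment 2: θ2 = θ1
    get_images=lambda: ("L-frame", "R-frame"),
    render=lambda img, angle: (img, angle),
    output=lambda lp, rp: frames.append((lp, rp)),
)
```

In the actual device the cycle repeats as long as the accessory power or drive source stays on, so each new camera frame and camera direction is reflected in the next displayed pair.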
<Summary of Embodiment 2>
The display control device 1 according to Embodiment 2 described above controls the rotation of the autostereoscopic display surface about a predetermined axis based on the camera direction (capturing direction). With this configuration, an electronic mirror resembling a real door mirror can be displayed.
Further, in Embodiment 2, the camera direction (capturing direction) is changed based on an operation from outside the display control device 1. With this configuration, the user can operate the rotation of the electronic mirror in the same way as operating the rotation of a real door mirror.
<Variation 1 of Embodiment 2>
In Embodiment 2, as shown in FIG. 12, the rotation angles θl2 and θr2 of the autostereoscopic display surfaces were equal to the camera directions θl1 and θr1, respectively. That is, the amount of rotation of the autostereoscopic display surface was the same as the amount of change of the camera direction angle. However, the amount of rotation of the autostereoscopic display surface may differ from the amount of change of the camera direction angle.
For example, the rotation angles θl2 and θr2 of the autostereoscopic display surfaces may be the values of functions fl(θl1) and fr(θr1) of the camera directions θl1 and θr1.
As a first example, as shown in FIG. 13, the functions fl(θl1) and fr(θr1) may be expressed as fl(θl1) = k × θl1 and fr(θr1) = k × θr1, using a constant k satisfying 0 < k < 1. In this case, the amount of rotation of the autostereoscopic display surface is at most the amount of change of the camera direction angle.
As a second example, as shown in FIG. 14, the functions fl(θl1) and fr(θr1) may be 0 when the difference between the camera directions θl1, θr1 and their reference directions (here, the direction corresponding to camera direction = 0 degrees) is smaller than a predetermined threshold Tθ, and may be increasing functions of the camera directions θl1 and θr1 when that difference is equal to or greater than the threshold Tθ. With this configuration, the video processing unit 12a rotates the autostereoscopic display surface when the difference between the camera directions θl1, θr1 and their reference directions is equal to or greater than the predetermined threshold Tθ.
As a third example, as shown in FIG. 15, the functions fl(θl1) and fr(θr1) may be increasing functions of the camera directions θl1 and θr1 whose rate of increase gradually diminishes. Further, an upper limit, such as the 45 degrees of FIGS. 13 to 15, may be imposed on the values of the functions fl(θl1) and fr(θr1).
The functions fl(θl1) and fr(θr1) are not limited to the above, and any other expression can be used. Also, the function fl(θl1) and the function fr(θr1) may be the same or may differ. For example, even when the amount of change of the left camera direction θl1 and the amount of change of the right camera direction θr1 are the same, functions fl(θl1) and fr(θr1) may be used such that, of the left autostereoscopic display surface SL and the right autostereoscopic display surface SR, the display surface closer to the driver's seat rotates by a smaller amount than the other. With this configuration, the rotation of the autostereoscopic display surface can be controlled in consideration of the distance between the driver and the mirror.
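The three example mappings from camera direction to rotation angle can be written out as follows. The constants (k, the threshold Tθ, and the saturation rate) are illustrative values, and the exact shape of the third, diminishing-slope curve is an assumption, since the document describes only its qualitative behavior and the 45-degree upper limit.

```python
import math

UPPER_LIMIT = 45.0  # upper limit on the rotation angle, as in FIGS. 13-15

def f_linear(theta, k=0.5):
    """First example (FIG. 13): rotation angle proportional to the
    camera direction, with 0 < k < 1, capped at the upper limit."""
    return min(k * theta, UPPER_LIMIT)

def f_threshold(theta, t_theta=5.0):
    """Second example (FIG. 14): no rotation until the camera direction
    deviates from the reference (0 degrees) by at least the threshold
    Tθ, then increasing with the deviation."""
    if abs(theta) < t_theta:
        return 0.0
    return min(abs(theta) - t_theta, UPPER_LIMIT)

def f_saturating(theta, tau=30.0):
    """Third example (FIG. 15): increasing, with a slope that gradually
    diminishes as the value approaches the upper limit (exponential
    saturation chosen here as one possible such curve)."""
    return UPPER_LIMIT * (1.0 - math.exp(-abs(theta) / tau))
```

Distinct parameter values could be given to the left and right surfaces, for instance a smaller k for the surface closer to the driver's seat, matching the last point above.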
 <実施の形態2の変形例2>
 実施の形態2では、左裸眼立体視表示面SLの回動の基準となる軸(以下「左回動軸」と記す)は、左裸眼立体視表示面SLの右辺であり(図6)、右裸眼立体視表示面SRの回動の基準となる軸(以下「右回動軸」と記す)は、右裸眼立体視表示面SRの左辺であった(図6)。
<Modification 2 of Embodiment 2>
In the second embodiment, the axis serving as the reference for the rotation of the left autostereoscopic display surface SL (hereinafter referred to as the "left rotation axis") was the right side of the left autostereoscopic display surface SL (FIG. 6), and the axis serving as the reference for the rotation of the right autostereoscopic display surface SR (hereinafter referred to as the "right rotation axis") was the left side of the right autostereoscopic display surface SR (FIG. 6).
 しかしながら、例えば図16に示すように、左回動軸の位置は、左裸眼立体視表示面SLの左右方向の中心部の位置であってもよいし、右回動軸の位置は、右裸眼立体視表示面SRの左右方向の中心部の位置であってもよい。また例えば図17に示すように、左回動軸の位置は、左裸眼立体視表示面SLの左辺の位置であってもよいし、右回動軸の位置は、右裸眼立体視表示面SRの右辺の位置であってもよい。 However, for example, as shown in FIG. 16, the position of the left rotation axis may be at the horizontal center of the left autostereoscopic display surface SL, and the position of the right rotation axis may be at the horizontal center of the right autostereoscopic display surface SR. Also, for example, as shown in FIG. 17, the position of the left rotation axis may be at the left side of the left autostereoscopic display surface SL, and the position of the right rotation axis may be at the right side of the right autostereoscopic display surface SR.
 なお、左回動軸及び右回動軸の位置は、ユーザによって表示制御装置1に予め設定されて記憶された位置であってもよい。また、映像処理部12aは、裸眼立体視表示面の回動角度θl2,θr2に基づいて左回動軸及び右回動軸をそれぞれ移動してもよい。例えば、映像処理部12aは、左回動角度θl2が大きくなるほど左回動軸を左裸眼立体視表示面SLの左辺に近づけてもよいし、右回動角度θr2が大きくなるほど右回動軸を右裸眼立体視表示面SRの右辺に近づけてもよい。 The positions of the left rotation axis and the right rotation axis may be positions set in advance in the display control device 1 by the user and stored there. Further, the image processing unit 12a may move the left rotation axis and the right rotation axis based on the rotation angles θl2, θr2 of the autostereoscopic display surfaces. For example, the image processing unit 12a may move the left rotation axis closer to the left side of the left autostereoscopic display surface SL as the left rotation angle θl2 increases, and may move the right rotation axis closer to the right side of the right autostereoscopic display surface SR as the right rotation angle θr2 increases.
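The angle-dependent movement of the rotation axis described above can be sketched as follows. This is an assumption-laden illustration: the linear interpolation between the two sides and the 45-degree maximum angle are not specified in the text, which only requires that the axis approach the far side as the rotation angle grows.

```python
def pivot_position(theta2, theta_max=45.0, width=1.0):
    # x-position of the rotation axis along the display surface, measured
    # from its left side (0.0 = left side, width = right side).
    # At theta2 = 0 the pivot sits on the right side, as in FIG. 6; as the
    # rotation angle theta2 grows it moves toward the left side.  The linear
    # interpolation and theta_max = 45 degrees are illustrative assumptions.
    t = min(max(theta2 / theta_max, 0.0), 1.0)
    return width * (1.0 - t)
```

The mirror-image function, measuring from the right side, would serve for the right autostereoscopic display surface SR.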
 <実施の形態2の変形例3>
 実施の形態2では、左裸眼立体視表示面SL及び右裸眼立体視表示面SRはいずれも平面であった。しかしながら、左裸眼立体視表示面SL及び右裸眼立体視表示面SRは、図18に示すように曲面であってもよい。
<Modification 3 of Embodiment 2>
In Embodiment 2, the left autostereoscopic display surface SL and the right autostereoscopic display surface SR are both flat. However, the left autostereoscopic display surface SL and the right autostereoscopic display surface SR may be curved as shown in FIG.
 そして、映像処理部12aは、左カメラ方向θl1に基づいて、曲面である左裸眼立体視表示面SLと、当該左裸眼立体視表示面SLの基準方向との間の角度θl3を制御することによって、左裸眼立体視表示面SLの回動を制御してもよい。なお、図18では、角度θl3は、曲面の右辺と当該曲面の左辺とを結ぶ線と、基準方向であるx軸方向との間の角度である。しかしながら、角度θl3は、これに限ったものではなく、曲面の右辺における当該曲面の接線と、基準方向であるx軸方向との間の角度であってもよい。 Then, based on the left camera direction θl1, the image processing unit 12a may control the rotation of the left autostereoscopic display surface SL, which is a curved surface, by controlling the angle θl3 between that surface and its reference direction. In FIG. 18, the angle θl3 is the angle between the line connecting the right side of the curved surface to its left side and the x-axis direction, which is the reference direction. However, the angle θl3 is not limited to this, and may instead be the angle between the tangent to the curved surface at its right side and the x-axis direction, which is the reference direction.
 同様に、映像処理部12aは、右カメラ方向θr1に基づいて、曲面である右裸眼立体視表示面SRと、当該右裸眼立体視表示面SRの基準方向との間の角度θr3を制御することによって、右裸眼立体視表示面SRの回動を制御してもよい。なお、図18では、角度θr3は、曲面の左辺と当該曲面の右辺とを結ぶ線と、基準方向であるx軸方向との間の角度である。しかしながら、角度θr3は、これに限ったものではなく、曲面の左辺における当該曲面の接線と、基準方向であるx軸方向との間の角度であってもよい。 Similarly, based on the right camera direction θr1, the video processing unit 12a may control the rotation of the right autostereoscopic display surface SR, which is a curved surface, by controlling the angle θr3 between that surface and its reference direction. In FIG. 18, the angle θr3 is the angle between the line connecting the left side of the curved surface to its right side and the x-axis direction, which is the reference direction. However, the angle θr3 is not limited to this, and may instead be the angle between the tangent to the curved surface at its left side and the x-axis direction, which is the reference direction.
 また、曲面は図18のように表示装置21の表示画面DSの奥側に凸であってもよいし、図示しないが表示画面DSの奥側とは逆側に凸であってもよい。 The curved surface may be convex on the back side of the display screen DS of the display device 21 as shown in FIG. 18, or may be convex on the opposite side to the back side of the display screen DS although not shown.
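The chord-based definition of θl3 and θr3 above amounts to measuring the angle of the line joining the two edges of the curved surface against the x-axis. A minimal sketch, where the (x, z) coordinate pairs for the edge points are an assumed representation:

```python
import math

def chord_angle(left_edge, right_edge):
    # Angle (in degrees) between the line connecting the curved surface's
    # two side edges and the x-axis, i.e. the reference direction of
    # FIG. 18.  Each edge point is an (x, z) pair in the display plane,
    # with z the depth direction.
    dx = right_edge[0] - left_edge[0]
    dz = right_edge[1] - left_edge[1]
    return math.degrees(math.atan2(dz, dx))
```

The tangent-based variant mentioned in the text would instead differentiate the curve at one edge and feed that slope to the same atan2 computation.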
 <実施の形態2の変形例4>
 実施の形態2では、カメラ方向及び裸眼立体視表示面の回動角度は水平角に対応する方向及び角度であった。しかしながら、カメラ方向及び裸眼立体視表示面の回動角度は、例えば図19及び図20に示すように仰俯角に対応する方向及び角度であってもよい。
<Modification 4 of Embodiment 2>
In the second embodiment, the camera direction and the rotation angle of the autostereoscopic display surface were the direction and the angle corresponding to a horizontal angle. However, the camera direction and the rotation angle of the autostereoscopic display surface may instead be, for example, a direction and an angle corresponding to an elevation/depression angle, as shown in FIGS. 19 and 20.
 具体的には、図19に示すように、左後側方カメラ26aの左カメラ方向αl1は、水平方向D2を基準とする仰俯角であってもよい。この場合に、映像処理部12aは、左裸眼立体視表示面SLの下辺を基準にして、表示装置21の表示画面DSからその奥行き方向に、左カメラ方向αl1に対応する角度αl2だけ左裸眼立体視表示面SLを回動させてもよい。 Specifically, as shown in FIG. 19, the left camera direction αl1 of the left rear-side camera 26a may be an elevation/depression angle with the horizontal direction D2 as a reference. In this case, the video processing unit 12a may rotate the left autostereoscopic display surface SL from the display screen DS of the display device 21 in its depth direction by an angle αl2 corresponding to the left camera direction αl1, with the lower side of the left autostereoscopic display surface SL as a reference.
 なお、図示しないが、右後側方カメラ26bの右カメラ方向も、水平方向を基準とする仰俯角であってもよい。この場合に、映像処理部12aは、右裸眼立体視表示面SRの下辺を基準にして、表示装置21の表示画面DSからその奥行き方向に、右カメラ方向に対応する角度だけ右裸眼立体視表示面SRを回動させてもよい。 Although not shown, the right camera direction of the right rear-side camera 26b may likewise be an elevation/depression angle with the horizontal direction as a reference. In this case, the video processing unit 12a may rotate the right autostereoscopic display surface SR from the display screen DS of the display device 21 in its depth direction by an angle corresponding to the right camera direction, with the lower side of the right autostereoscopic display surface SR as a reference.
 なお、実際の運用では、表示装置21が図5のような表示を行うように、つまり、ある程度範囲の道路5が表示されるように、カメラ方向は図19のように上側に向けられるのではなく下側に向けられることが多いと考えられる。 In actual operation, so that the display device 21 produces a display like that of FIG. 5, that is, so that a certain range of the road 5 is displayed, it is considered that the camera direction will often be directed downward rather than upward as shown in FIG. 19.
 <実施の形態2の変形例5>
 実施の形態2の変形例4と実施の形態2とを組み合わせて、カメラ方向及び裸眼立体視表示面の回動角度は、水平角及び仰俯角の両方に対応する方向及び角度であってもよい。つまり、映像処理部12aは、左裸眼立体視表示面SL及び右裸眼立体視表示面SRのそれぞれを、垂直軸及び水平軸の2軸を基準にして回動する制御を行ってもよい。
<Modification 5 of Embodiment 2>
Combining the fourth modification of the second embodiment with the second embodiment, the camera direction and the rotation angle of the autostereoscopic display surface may be a direction and an angle corresponding to both a horizontal angle and an elevation/depression angle. That is, the image processing unit 12a may perform control that rotates each of the left autostereoscopic display surface SL and the right autostereoscopic display surface SR about two axes, a vertical axis and a horizontal axis.
 また、裸眼立体視表示面の回動は必ずしもカメラ方向の変化に連動させなくてもよい。例えば、カメラ方向を変更せずに、裸眼立体視表示面の回動のみを変更する専用操作を受け付けるように操作装置26cを構成してもよい。そして、操作装置26cが当該専用操作を受け付けた場合に、カメラ制御装置26dがカメラ方向を変更せずに、映像処理部12aが当該専用操作に基づいて裸眼立体視表示面を回動させてもよい。 In addition, the rotation of the autostereoscopic display surfaces need not necessarily be linked to changes in the camera direction. For example, the operation device 26c may be configured to accept a dedicated operation that changes only the rotation of the autostereoscopic display surfaces without changing the camera direction. Then, when the operation device 26c accepts this dedicated operation, the video processing unit 12a may rotate the autostereoscopic display surfaces based on the dedicated operation while the camera control device 26d leaves the camera direction unchanged.
 <実施の形態3>
 図21は、本発明の実施の形態3に係る表示制御装置1の構成を示すブロック図である。以下、本実施の形態3に係る構成要素のうち、上述の構成要素と同じまたは類似する構成要素については同じ参照符号を付し、異なる構成要素について主に説明する。
Embodiment 3
FIG. 21 is a block diagram showing a configuration of a display control device 1 according to Embodiment 3 of the present invention. Hereinafter, among the components according to the third embodiment, the same or similar components as or to the components described above are designated by the same reference numerals, and different components will be mainly described.
 本実施の形態3では、図2の左後側方カメラ26a、及び、右後側方カメラ26bの代わりに、左後側方広角カメラ26e、及び、右後側方広角カメラ26fが自車両に設けられている。ここで、左後側方広角カメラ26e及び右後側方広角カメラ26fのそれぞれは、標準的なカメラではなく広角カメラである。 In the third embodiment, a left rear-side wide-angle camera 26e and a right rear-side wide-angle camera 26f are provided on the host vehicle in place of the left rear-side camera 26a and the right rear-side camera 26b of FIG. 2. Here, each of the left rear-side wide-angle camera 26e and the right rear-side wide-angle camera 26f is a wide-angle camera rather than a standard camera.
 図22は、画角が10度である標準的なカメラが撮影可能な領域である撮影領域R1と、画角が30度である広角カメラが撮影可能な領域である撮影領域R2とを示す図である。図23は、図22に対応する図であり、広角カメラで撮影された画像の一例を示す図である。図22及び図23に示すように、広角カメラの撮影領域R2は、標準的なカメラの撮影領域R1よりも広くなっている。 FIG. 22 is a view showing an imaging region R1, which is the region that a standard camera with an angle of view of 10 degrees can capture, and an imaging region R2, which is the region that a wide-angle camera with an angle of view of 30 degrees can capture. FIG. 23 corresponds to FIG. 22 and shows an example of an image captured by the wide-angle camera. As shown in FIGS. 22 and 23, the imaging region R2 of the wide-angle camera is wider than the imaging region R1 of the standard camera.
 図21に戻って、操作装置26cは、左後側方広角カメラ26eで撮影された画像の一部を切り取る切取操作を受け付ける。カメラ制御装置26dは、操作装置26cで受け付けた切取操作に基づいて、左後側方広角カメラ26eで撮影された画像から、その一部たる部分画像(以下「左部分画像」と記すこともある)を切り取る。左部分画像は、例えば図23の破線枠27内の画像である。そして、カメラ制御装置26dは、左部分画像を表示制御装置1に出力する。また、カメラ制御装置26dは、左後側方広角カメラ26eで撮影された画像の全範囲における左部分画像の範囲を表示制御装置1に出力する。 Returning to FIG. 21, the operation device 26c accepts a cutting operation for cutting out a part of the image captured by the left rear-side wide-angle camera 26e. Based on the cutting operation accepted by the operation device 26c, the camera control device 26d cuts out a partial image (hereinafter also referred to as the "left partial image") from the image captured by the left rear-side wide-angle camera 26e. The left partial image is, for example, the image within the broken-line frame 27 of FIG. 23. The camera control device 26d then outputs the left partial image to the display control device 1. The camera control device 26d also outputs, to the display control device 1, the range occupied by the left partial image within the full range of the image captured by the left rear-side wide-angle camera 26e.
 同様に、操作装置26cは、右後側方広角カメラ26fで撮影された画像の一部を切り取る切取操作を受け付ける。カメラ制御装置26dは、操作装置26cで受け付けた切取操作に基づいて、右後側方広角カメラ26fで撮影された画像から、その一部たる部分画像(以下「右部分画像」と記すこともある)を切り取る。そして、カメラ制御装置26dは、右部分画像を表示制御装置1に出力する。また、カメラ制御装置26dは、右後側方広角カメラ26fで撮影された画像の全範囲における右部分画像の範囲を表示制御装置1に出力する。 Similarly, the operation device 26c accepts a cutting operation for cutting out a part of the image captured by the right rear-side wide-angle camera 26f. Based on the cutting operation accepted by the operation device 26c, the camera control device 26d cuts out a partial image (hereinafter also referred to as the "right partial image") from the image captured by the right rear-side wide-angle camera 26f. The camera control device 26d then outputs the right partial image to the display control device 1. The camera control device 26d also outputs, to the display control device 1, the range occupied by the right partial image within the full range of the image captured by the right rear-side wide-angle camera 26f.
 本実施の形態3に係る画像取得部11aは、カメラ制御装置26dからの左部分画像を撮影画像として取得し、カメラ制御装置26dからの右部分画像を撮影画像として取得する。 The image acquisition unit 11a according to the third embodiment acquires the left partial image from the camera control device 26d as a captured image, and acquires the right partial image from the camera control device 26d as a captured image.
 表示面生成情報取得部11bは、カメラ制御装置26dからの左部分画像の範囲に基づいて、左部分画像の仮想撮影方向を、撮影画像に関する方向たる撮影方向として求める。本実施の形態3では、表示面生成情報取得部11bは、図23に示すように、左後側方広角カメラ26eで撮影された画像の右端を基準にした、左部分画像の範囲の右端の位置に対応する角度を、左部分画像の仮想撮影方向として求める。この場合、左部分画像の仮想撮影方向は、実施の形態2の左カメラ方向と同様に水平角である。 Based on the range of the left partial image from the camera control device 26d, the display surface generation information acquisition unit 11b obtains the virtual shooting direction of the left partial image as the shooting direction, i.e., the direction related to the captured image. In the third embodiment, as shown in FIG. 23, the display surface generation information acquisition unit 11b obtains, as the virtual shooting direction of the left partial image, the angle corresponding to the position of the right end of the left partial image's range, with the right end of the image captured by the left rear-side wide-angle camera 26e as a reference. In this case, the virtual shooting direction of the left partial image is a horizontal angle, like the left camera direction of the second embodiment.
 同様に、表示面生成情報取得部11bは、カメラ制御装置26dからの右部分画像の範囲に基づいて、右部分画像の仮想撮影方向を、撮影画像に関する方向たる撮影方向として求める。 Similarly, based on the range of the right partial image from the camera control device 26d, the display surface generation information acquisition unit 11b obtains the virtual shooting direction of the right partial image as a shooting direction that is a direction related to the shot image.
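The virtual shooting direction described above can be sketched as a mapping from the crop's right-edge pixel position to an angle. This is an illustrative assumption: a linear pixel-to-angle mapping over the 30-degree field of view of FIG. 22 is assumed, whereas a real implementation would use the lens's calibrated projection model.

```python
def virtual_direction(crop_right_px, image_width_px, fov_deg=30.0):
    # Virtual shooting direction of a partial image: the angle of the
    # crop's right edge measured from the right edge of the full
    # wide-angle frame (cf. FIG. 23).  A linear pixel-to-angle mapping
    # is assumed for illustration.
    offset_px = image_width_px - crop_right_px  # distance from the right edge
    return fov_deg * offset_px / image_width_px
```

A crop flush with the frame's right edge thus yields a virtual direction of 0 degrees, matching the reference direction of the second embodiment.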
 映像処理部12aは、実施の形態2に係る映像処理部12aの動作において、左カメラ画像、右カメラ画像、左カメラ方向及び右カメラ方向を、左部分画像、右部分画像、左部分画像の仮想撮影方向及び右部分画像の仮想撮影方向に置き換えた動作と同じ動作を行う。つまり、本実施の形態3では、左部分画像の仮想撮影方向及び右部分画像の仮想撮影方向が、実施の形態1で説明した表示面生成情報として用いられる。 The video processing unit 12a performs the same operation as the video processing unit 12a according to the second embodiment, with the left camera image, the right camera image, the left camera direction and the right camera direction replaced by the left partial image, the right partial image, the virtual shooting direction of the left partial image and the virtual shooting direction of the right partial image, respectively. That is, in the third embodiment, the virtual shooting direction of the left partial image and the virtual shooting direction of the right partial image are used as the display surface generation information described in the first embodiment.
 <実施の形態3のまとめ>
 以上のような本実施の形態3に係る表示制御装置1では、実施の形態2で説明したカメラ画像及びカメラ方向の代わりに、部分画像及び仮想撮影方向が用いられる。このような構成によれば、左後側方広角カメラ26e及び右後側方広角カメラ26fのカメラ方向を用いなくても、裸眼立体視表示面の回動を制御することができる。この結果、カメラの方向を変更する電動駆動機構などのハードウェアが不要になるので、コスト低減化が期待できる。
<Summary of Embodiment 3>
In the display control device 1 according to the third embodiment as described above, a partial image and a virtual shooting direction are used instead of the camera image and the camera direction described in the second embodiment. With such a configuration, the rotation of the autostereoscopic display surfaces can be controlled without using the camera directions of the left rear-side wide-angle camera 26e and the right rear-side wide-angle camera 26f. As a result, hardware such as an electric drive mechanism for changing the camera direction becomes unnecessary, so a cost reduction can be expected.
 <実施の形態3の変形例>
 実施の形態3では、左部分画像の仮想撮影方向、右部分画像の仮想撮影方向、及び、裸眼立体視表示面の回動角度は、水平角に対応する方向及び角度であった。しかしながら、左部分画像の仮想撮影方向、右部分画像の仮想撮影方向、及び、裸眼立体視表示面の回動角度は、実施の形態2の変形例4及び5と同様に、仰俯角に対応する方向及び角度であってもよい。
<Modification of Embodiment 3>
In the third embodiment, the virtual shooting direction of the left partial image, the virtual shooting direction of the right partial image, and the rotation angle of the autostereoscopic display surface were the direction and the angle corresponding to a horizontal angle. However, the virtual shooting direction of the left partial image, the virtual shooting direction of the right partial image, and the rotation angle of the autostereoscopic display surface may instead be a direction and an angle corresponding to an elevation/depression angle, as in the fourth and fifth modifications of the second embodiment.
 <実施の形態4>
 図24は、本発明の実施の形態4に係る表示制御装置1の構成を示すブロック図である。以下、本実施の形態4に係る構成要素のうち、上述の構成要素と同じまたは類似する構成要素については同じ参照符号を付し、異なる構成要素について主に説明する。
Fourth Preferred Embodiment
FIG. 24 is a block diagram showing a configuration of a display control device 1 according to Embodiment 4 of the present invention. Hereinafter, among constituent elements according to the fourth embodiment, constituent elements that are the same as or similar to the above constituent elements are given the same reference numerals, and different constituent elements are mainly described.
 図21の実施の形態3では、カメラ制御装置26dが、操作装置26cで受け付けた切取操作に基づいて左部分画像及び右部分画像を生成し、当該左部分画像及び当該右部分画像を表示制御装置1に出力した。これに対して、本実施の形態4では、表示制御装置1が、操作装置26cで受け付けた切取操作に基づいて左部分画像及び右部分画像を生成するように構成されている。 In the third embodiment of FIG. 21, the camera control device 26d generates the left partial image and the right partial image based on the cutting operation accepted by the operation device 26c, and outputs the left partial image and the right partial image to the display control device 1. In contrast, in the fourth embodiment, the display control device 1 is configured to generate the left partial image and the right partial image based on the cutting operation accepted by the operation device 26c.
 表示面生成情報取得部11bは、操作装置26cで受け付けた切取操作を取得する。 The display surface generation information acquisition unit 11 b acquires the cutting operation received by the operation device 26 c.
 カメラ制御装置26dは、左後側方広角カメラ26eで撮影された画像、及び、右後側方広角カメラ26fで撮影された画像を表示制御装置1に出力する。 The camera control device 26 d outputs, to the display control device 1, an image captured by the left rear wide-angle camera 26 e and an image captured by the right rear wide-angle camera 26 f.
 画像取得部11aは、カメラ制御装置26dからの画像、つまり左後側方広角カメラ26eで撮影された画像、及び、右後側方広角カメラ26fで撮影された画像を取得する。そして、画像取得部11aは、表示面生成情報取得部11bで取得された、左後側方広角カメラ26eで撮影された画像の一部を切り取る切取操作に基づいて、左後側方広角カメラ26eで撮影された画像から、その一部たる左部分画像を切り取る。同様に、画像取得部11aは、表示面生成情報取得部11bで取得された、右後側方広角カメラ26fで撮影された画像の一部を切り取る切取操作に基づいて、右後側方広角カメラ26fで撮影された画像から、その一部たる右部分画像を切り取る。以上により、画像取得部11aは、左部分画像及び右部分画像を撮影画像として取得する。 The image acquisition unit 11a acquires the images from the camera control device 26d, that is, the image captured by the left rear-side wide-angle camera 26e and the image captured by the right rear-side wide-angle camera 26f. Then, based on the cutting operation, acquired by the display surface generation information acquisition unit 11b, for cutting out a part of the image captured by the left rear-side wide-angle camera 26e, the image acquisition unit 11a cuts out the left partial image from that image. Similarly, based on the cutting operation, acquired by the display surface generation information acquisition unit 11b, for cutting out a part of the image captured by the right rear-side wide-angle camera 26f, the image acquisition unit 11a cuts out the right partial image from that image. In this way, the image acquisition unit 11a acquires the left partial image and the right partial image as captured images.
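The cutting-out of a partial image described above can be sketched as a simple sub-array extraction. The row-major 2-D list standing in for an image buffer, and the (left, top, width, height) parameterization of the cutting operation, are assumptions for illustration.

```python
def crop_partial_image(frame, left, top, width, height):
    # Cut out the partial image selected by the user's cutting operation.
    # frame is a 2-D list of pixel rows (row-major); the returned sub-image
    # covers columns [left, left+width) of rows [top, top+height).
    return [row[left:left + width] for row in frame[top:top + height]]
```

The same range (left, top, width, height) is what the display surface generation information acquisition unit 11b then uses to derive the virtual shooting direction.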
 表示面生成情報取得部11bは、左後側方広角カメラ26eで撮影された画像の全範囲における左部分画像の範囲に基づいて、左部分画像の仮想撮影方向を、撮影画像に関する方向たる撮影方向として求める。同様に、表示面生成情報取得部11bは、右後側方広角カメラ26fで撮影された画像の全範囲における右部分画像の範囲に基づいて、右部分画像の仮想撮影方向を、撮影画像に関する方向たる撮影方向として求める。 Based on the range occupied by the left partial image within the full range of the image captured by the left rear-side wide-angle camera 26e, the display surface generation information acquisition unit 11b obtains the virtual shooting direction of the left partial image as the shooting direction, i.e., the direction related to the captured image. Similarly, based on the range occupied by the right partial image within the full range of the image captured by the right rear-side wide-angle camera 26f, the display surface generation information acquisition unit 11b obtains the virtual shooting direction of the right partial image as the shooting direction, i.e., the direction related to the captured image.
 映像処理部12aは、実施の形態3に係る映像処理部12aの動作と同じ動作を行う。つまり、本実施の形態4では、左部分画像の仮想撮影方向及び右部分画像の仮想撮影方向が、実施の形態1で説明した表示面生成情報として用いられる。なお、左部分画像の仮想撮影方向、右部分画像の仮想撮影方向、及び、裸眼立体視表示面の回動角度は、水平角に対応する方向及び角度であってもよく、仰俯角に対応する方向及び角度であってもよい。 The video processing unit 12a performs the same operation as the video processing unit 12a according to the third embodiment. That is, in the fourth embodiment, the virtual shooting direction of the left partial image and the virtual shooting direction of the right partial image are used as the display surface generation information described in the first embodiment. The virtual shooting direction of the left partial image, the virtual shooting direction of the right partial image, and the rotation angle of the autostereoscopic display surface may be a direction and an angle corresponding to a horizontal angle, or a direction and an angle corresponding to an elevation/depression angle.
 <実施の形態4のまとめ>
 以上のような本実施の形態4に係る表示制御装置1では、実施の形態2で説明したカメラ画像及びカメラ方向の代わりに、部分画像及び仮想撮影方向が用いられる。このような構成によれば、左後側方広角カメラ26e及び右後側方広角カメラ26fのカメラ方向を用いなくても、裸眼立体視表示面の回動を制御することができる。この結果、カメラの方向を変更する電動駆動機構などのハードウェアが不要になるので、コスト低減化が期待できる。
<Summary of Embodiment 4>
In the display control device 1 according to the fourth embodiment as described above, a partial image and a virtual shooting direction are used instead of the camera image and the camera direction described in the second embodiment. With such a configuration, the rotation of the autostereoscopic display surfaces can be controlled without using the camera directions of the left rear-side wide-angle camera 26e and the right rear-side wide-angle camera 26f. As a result, hardware such as an electric drive mechanism for changing the camera direction becomes unnecessary, so a cost reduction can be expected.
 <実施の形態5>
 本発明の実施の形態5に係る表示制御装置1の構成を示すブロック図は、実施の形態2に係る表示制御装置1のブロック図(図2)と同様である。以下、本実施の形態5に係る構成要素のうち、上述の構成要素と同じまたは類似する構成要素については同じ参照符号を付し、異なる構成要素について主に説明する。
The Fifth Preferred Embodiment
The block diagram showing the configuration of the display control device 1 according to the fifth embodiment of the present invention is the same as the block diagram (FIG. 2) of the display control device 1 according to the second embodiment. Hereinafter, among constituent elements according to the fifth embodiment, constituent elements that are the same as or similar to the above constituent elements are given the same reference numerals, and different constituent elements are mainly described.
 本実施の形態5では、表示面生成情報取得部11bは、自車両の図示しないECU(Electronic Control Unit)及び車内LAN(Local Area Network)から、自車両の速度を取得する。 In the fifth embodiment, the display surface generation information acquiring unit 11b acquires the speed of the vehicle from the ECU (Electronic Control Unit) and the in-vehicle LAN (Local Area Network) (not shown) of the vehicle.
 映像処理部12aは、自車両の速度に基づいて、裸眼立体視表示の奥行き方向における裸眼立体視表示面の位置を制御する。本実施の形態5では、映像処理部12aは、自車両の速度が大きいほど、図25及び図26に示す奥行き方向上の距離zlを大きくすることによって、左裸眼立体視表示面SLを奥行き方向の奥側に移動する制御を行う。同様に、映像処理部12aは、自車両の速度が大きいほど、図25及び図26に示す奥行き方向上の距離zrを大きくすることによって、右裸眼立体視表示面SRを奥行き方向の奥側に移動する制御を行う。つまり、本実施の形態5では、自車両の速度が、実施の形態1で説明した表示面生成情報として用いられる。 The image processing unit 12a controls the position of the autostereoscopic display surface in the depth direction of the autostereoscopic display based on the speed of the host vehicle. In the fifth embodiment, the video processing unit 12a performs control that moves the left autostereoscopic display surface SL toward the far side in the depth direction by increasing the depth-direction distance zl shown in FIGS. 25 and 26 as the speed of the host vehicle increases. Similarly, the video processing unit 12a performs control that moves the right autostereoscopic display surface SR toward the far side in the depth direction by increasing the depth-direction distance zr shown in FIGS. 25 and 26 as the speed of the host vehicle increases. That is, in the fifth embodiment, the speed of the host vehicle is used as the display surface generation information described in the first embodiment.
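The speed-dependent depth distances zl and zr described above can be sketched as a monotone mapping from vehicle speed to a depth offset. The linear mapping, the clamp at 100 km/h, and the maximum of 2.0 display units are assumptions; the text only requires that the distance grow with speed.

```python
def depth_offset(speed_kmh, z_max=2.0, speed_max=100.0):
    # Depth-direction distance zl / zr of FIGS. 25 and 26: grows with the
    # host vehicle's speed so that the display surface moves toward the
    # far side of the display at higher speed.  The linear shape, the
    # clamp at speed_max and the z_max value are illustrative assumptions.
    s = min(max(speed_kmh, 0.0), speed_max)
    return z_max * s / speed_max
```

The same offset would be applied to both surfaces, on top of whatever rotation angle the camera direction dictates.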
 なお、図25は、図7と同様に、左カメラ方向θl1=左回動角度θl2=0度、かつ、右カメラ方向θr1=右回動角度θr2=0度である場合の、本実施の形態5に係る左裸眼立体視表示面SL及び右裸眼立体視表示面SRを模式的に示す図である。図26は、図9と同様に、左カメラ方向θl1=左回動角度θl2≠0度、かつ、右カメラ方向θr1=右回動角度θr2≠0度である場合の、本実施の形態5に係る左裸眼立体視表示面SL及び右裸眼立体視表示面SRを模式的に示す図である。 Note that FIG. 25, like FIG. 7, schematically shows the left autostereoscopic display surface SL and the right autostereoscopic display surface SR according to the fifth embodiment when left camera direction θl1 = left rotation angle θl2 = 0 degrees and right camera direction θr1 = right rotation angle θr2 = 0 degrees. FIG. 26, like FIG. 9, schematically shows the left autostereoscopic display surface SL and the right autostereoscopic display surface SR according to the fifth embodiment when left camera direction θl1 = left rotation angle θl2 ≠ 0 degrees and right camera direction θr1 = right rotation angle θr2 ≠ 0 degrees.
 <実施の形態5のまとめ>
 以上のような本実施の形態5に係る表示制御装置1では、表示面生成情報に基づいて、裸眼立体視表示の奥行き方向における裸眼立体視表示面の位置を制御する。このような構成によれば、裸眼立体視表示面の表示における自由度を高めることができる。
<Summary of Embodiment 5>
The display control apparatus 1 according to the fifth embodiment as described above controls the position of the autostereoscopic display surface in the depth direction of the autostereoscopic display based on the display surface generation information. According to such a configuration, it is possible to increase the degree of freedom in display of the autostereoscopic display surface.
 なお、以上の説明では、表示面生成情報は、自車両の速度であった。しかしながら、例えば、表示面生成情報は、ユーザの操作で指定された、裸眼立体視表示の奥行き方向における裸眼立体視表示面の位置であってもよいし、後述するように、自車両周辺の予め定められたオブジェクトの、自車両に対する相対位置であってもよい。また、実施の形態5で説明した内容は、実施の形態2だけでなく、実施の形態3及び4にも適用されてもよい。 In the above description, the display surface generation information is the speed of the host vehicle. However, the display surface generation information may be, for example, the position of the autostereoscopic display surface in the depth direction of the autostereoscopic display designated by a user operation, or, as described later, the relative position with respect to the host vehicle of a predetermined object around the host vehicle. The contents described in the fifth embodiment may also be applied not only to the second embodiment but also to the third and fourth embodiments.
 <実施の形態6>
 図27は、本発明の実施の形態6に係る表示制御装置1の構成を示すブロック図である。以下、本実施の形態6に係る構成要素のうち、上述の構成要素と同じまたは類似する構成要素については同じ参照符号を付し、異なる構成要素について主に説明する。
Embodiment 6
FIG. 27 is a block diagram showing a configuration of a display control device 1 according to Embodiment 6 of the present invention. Hereinafter, among constituent elements according to the sixth embodiment, constituent elements which are the same as or similar to the constituent elements described above are given the same reference numerals, and different constituent elements are mainly described.
 本実施の形態6では、撮影画像の撮影方向は、自車両周辺の状況などの種々の情報に基づいて自動的に変更され、それに連動して裸眼立体視表示面の回動も自動的に変更されるように構成されている。 In the sixth embodiment, the shooting direction of the captured image is automatically changed based on various information such as the situation around the host vehicle, and the rotation of the autostereoscopic display surfaces is automatically changed in conjunction with it.
 なお、撮影画像には、実施の形態2で説明したカメラ画像、及び、実施の形態3及び4で説明した部分画像などが含まれ、撮影画像の撮影方向には、実施の形態2で説明したカメラ方向、及び、実施の形態3及び4で説明した部分画像の仮想撮影方向が含まれる。以下、本実施の形態6では、撮影画像は、実施の形態2で説明したカメラ画像であり、撮影画像の撮影方向は、実施の形態2で説明したカメラ方向であるものとして説明する。 The captured image includes the camera images described in the second embodiment and the partial images described in the third and fourth embodiments, and the shooting direction of the captured image includes the camera direction described in the second embodiment and the virtual shooting directions of the partial images described in the third and fourth embodiments. Hereinafter, the sixth embodiment is described assuming that the captured image is the camera image described in the second embodiment and that the shooting direction of the captured image is the camera direction described in the second embodiment.
 図27の構成は、図2の構成に画像認識処理部12cが追加された構成と同様である。なお、画像認識処理部12cの概念は、図1の制御部12に含まれる概念である。 The configuration of FIG. 27 is the same as the configuration of FIG. 2 with an image recognition processing unit 12c added. The concept of the image recognition processing unit 12c is included in the concept of the control unit 12 of FIG. 1.
 画像認識処理部12cは、画像取得部11aで取得された撮影画像に画像認識処理を行うことにより、当該撮影画像から自車両周辺の状況を取得する。例えば、画像認識処理部12cは、撮影画像に自車両周辺の他車両が存在する場合に、当該撮影画像に示される車線などの道路のうち他車両が位置する部分の幅を求める。そして、画像認識処理部12cは、当該幅に基づいて、自車両と他車両との間の距離を求める。この際、画像認識処理部12cは、車線幅データを持つ地図情報も用いて自車両と他車両との間の距離を求めてもよい。このようにして求められた自車両と他車両との間の距離は、自車両周辺の状況に含まれる概念である。 The image recognition processing unit 12c performs image recognition processing on the captured image acquired by the image acquisition unit 11a, thereby obtaining the situation around the host vehicle from the captured image. For example, when another vehicle around the host vehicle appears in the captured image, the image recognition processing unit 12c obtains the width of the portion of the road, such as a lane, shown in the captured image where the other vehicle is located. Then, based on that width, the image recognition processing unit 12c obtains the distance between the host vehicle and the other vehicle. At this time, the image recognition processing unit 12c may also use map information containing lane-width data to obtain the distance between the host vehicle and the other vehicle. The distance between the host vehicle and the other vehicle obtained in this way is a concept included in the situation around the host vehicle.
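The width-to-distance derivation described above can be sketched with a pinhole-camera model: the farther the other vehicle, the narrower the lane appears at its position in the image. This is an assumption-laden illustration; the focal length in pixels and the 3.5 m physical lane width (as might come from map information with lane-width data) are hypothetical values, and a real system would use the camera's calibration.

```python
def distance_from_lane_width(lane_width_px, focal_px=1000.0, lane_width_m=3.5):
    # Pinhole-camera estimate of the distance to the other vehicle from the
    # apparent pixel width of the lane at its position in the image:
    #   distance = focal_length * real_width / apparent_width.
    # focal_px and lane_width_m are illustrative assumptions.
    return focal_px * lane_width_m / lane_width_px
```

With these assumed values, a lane that appears 175 px wide at the other vehicle's position corresponds to a distance of 20 m, the example threshold used in step S14 below.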
 カメラ制御装置26dは、画像認識処理部12cで取得された自車両周辺の状況に基づいて、撮影画像の撮影方向を制御する。 The camera control device 26d controls the shooting direction of the shot image based on the situation around the host vehicle acquired by the image recognition processing unit 12c.
 図28は、本実施の形態6に係る表示制御装置1の動作を示すフローチャートである。この動作は、例えば自車両のアクセサリー電源がオンになった場合、及び、自車両の駆動源がオンになった場合などに開始され、左右の電子ミラーのそれぞれに対して行われる。 FIG. 28 is a flowchart showing an operation of the display control device 1 according to the sixth embodiment. This operation is started, for example, when the accessory power supply of the host vehicle is turned on or when the drive source of the host vehicle is turned on, and is performed for each of the left and right electronic mirrors.
 まずステップS11にて、表示面生成情報取得部11bは、撮影画像の撮影方向を取得する。 First, in step S11, the display surface generation information acquiring unit 11b acquires the photographing direction of the photographed image.
 ステップS12にて、画像取得部11aは撮影画像を取得する。 In step S12, the image acquisition unit 11a acquires a photographed image.
 ステップS13にて、画像認識処理部12cは、画像取得部11aで取得された撮影画像に画像認識処理を行うことにより、当該撮影画像から自車両周辺の状況を取得する。 In step S13, the image recognition processing unit 12c performs an image recognition process on the captured image acquired by the image acquisition unit 11a to acquire the situation around the vehicle from the captured image.
 ステップS14にて、カメラ制御装置26dは、画像認識処理部12cで取得された自車両周辺の状況に基づいて、撮影画像の撮影方向を変更すべきか否かを判定する。例えば、自車両周辺の状況が、自車両が走行している車線と隣接する隣接車線を走行している他車両と自車両との間の距離が予め定められた距離(例えば20m)以下である状況である場合には、カメラ制御装置26dは、撮影画像の撮影方向を変更すべきと判定する。撮影画像の撮影方向を変更すべきと判定された場合には、ステップS15に処理が進み、撮影画像の撮影方向を変更すべきと判定されなかった場合には、ステップS16に処理が進む。 In step S14, the camera control device 26d determines whether the shooting direction of the captured image should be changed, based on the situation around the host vehicle acquired by the image recognition processing unit 12c. For example, when the situation around the host vehicle is one in which the distance between the host vehicle and another vehicle traveling in the lane adjacent to the lane in which the host vehicle is traveling is equal to or less than a predetermined distance (for example, 20 m), the camera control device 26d determines that the shooting direction of the captured image should be changed. If it is determined that the shooting direction of the captured image should be changed, the process proceeds to step S15; otherwise, the process proceeds to step S16.
 In step S15, the camera control device 26d changes the shooting direction of the captured image. For example, the camera control device 26d changes the shooting direction so that the center of the captured image approaches the other vehicle that is traveling in the adjacent lane and whose distance from the host vehicle is equal to or less than the predetermined distance. Thereafter, the process proceeds to step S17.
 In step S16, the camera control device 26d maintains the shooting direction of the captured image in the initial direction. However, if the shooting direction of the captured image is not the initial direction, the camera control device 26d changes the shooting direction to the initial direction. Thereafter, the process proceeds to step S17.
 In step S17, the video processing unit 12a determines the rotation angle of the autostereoscopic display surface based on the shooting direction of the captured image.
 In step S18, the video processing unit 12a performs image processing on the captured image based on the rotation angle, thereby generating left-eye and right-eye images for displaying the autostereoscopic display surface.
 In step S19, the image output unit 12b outputs the video signals of the left-eye and right-eye images for displaying the autostereoscopic display surface to the display device 21. Thereafter, the process returns to step S11.
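As a non-authoritative sketch, the decision of step S14 and the angle derivation of step S17 above can be outlined as follows. The function names, the use of Python, and the simple difference-based angle rule are illustrative assumptions; only the 20 m threshold and the step structure come from the description above.

```python
DISTANCE_THRESHOLD_M = 20.0  # predetermined distance used in step S14

def should_change_direction(adjacent_vehicle_distance_m):
    """Step S14: the shooting direction should be changed when another vehicle
    in the adjacent lane is within the predetermined distance (None means no
    vehicle was recognized)."""
    return (adjacent_vehicle_distance_m is not None
            and adjacent_vehicle_distance_m <= DISTANCE_THRESHOLD_M)

def rotation_angle_for(shooting_direction_deg, initial_direction_deg=0.0):
    """Step S17: derive the rotation angle of the autostereoscopic display
    surface from the shooting direction (here simply their difference)."""
    return shooting_direction_deg - initial_direction_deg
```

The loop of FIG. 28 would call these once per frame and per mirror, feeding the result of step S17 into the image generation of step S18.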
 <Summary of Embodiment 6>
 In the display control device 1 according to the sixth embodiment described above, the shooting direction of the captured image is automatically changed based on the situation around the host vehicle. With this configuration, an object to which the driver should pay attention can be displayed in an emphasized manner, so that convenience for the driver can be enhanced.
 In the above description, the captured image is the camera image described in the second embodiment, and the shooting direction of the captured image is the camera direction described in the second embodiment. However, the captured image may be the partial image described in the third and fourth embodiments, and the shooting direction of the captured image may be the virtual shooting direction of the partial image described in the third and fourth embodiments. Further, instead of the camera control device 26d, the display surface generation information acquisition unit 11b may control the shooting direction of the captured image based on the situation around the host vehicle acquired by the image recognition processing unit 12c.
 <Modification 1 of Embodiment 6>
 In the sixth embodiment, the situation around the host vehicle is the distance between the host vehicle and another vehicle. However, when the relative position of the other vehicle with respect to the host vehicle is detected by a detection device such as an optical sensor, a millimeter-wave radar, a high-accuracy image recognition device, or an ultrasonic sensor, the situation around the host vehicle may be the relative position of the other vehicle with respect to the host vehicle.
 <Modification 2 of Embodiment 6>
 In the sixth embodiment, the shooting direction of the captured image is automatically changed based on the situation around the host vehicle, but this is not restrictive. For example, the shooting direction of the captured image may be automatically changed based on the traveling state of the host vehicle, including its speed. Specifically, the shooting direction may be changed so that the center of the captured image approaches the rearmost part of the road of the host vehicle as the speed of the host vehicle increases.
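This speed-dependent behavior can be sketched minimally as below. The linear mapping, the clamping limits, and all names are illustrative assumptions; the description above only states that the offset toward the rearmost part of the road grows with speed.

```python
def direction_offset_for_speed(speed_kmh, max_speed_kmh=120.0, max_offset_deg=10.0):
    """Shift the shooting direction toward the rearmost part of the road in
    proportion to the host vehicle's speed, clamped to [0, max_offset_deg]."""
    ratio = min(max(speed_kmh, 0.0), max_speed_kmh) / max_speed_kmh
    return ratio * max_offset_deg
```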
 <Modification 3 of Embodiment 6>
 The shooting direction of the captured image may also be automatically changed based on the shape of the road on which the host vehicle is traveling. For example, when the road on which the host vehicle is traveling is curved, the shooting direction may be changed so that the center of the captured image approaches the rearmost part of the road of the host vehicle.
 <Modification 4 of Embodiment 6>
 The shooting direction of the captured image may also be automatically changed based on the turning state of the host vehicle. For example, the shooting direction may be changed so that the center of the captured image moves from directly behind the host vehicle to the left side when the host vehicle turns left, and from directly behind the host vehicle to the right side when the host vehicle turns right.
 <Modification 5 of Embodiment 6>
 In the sixth embodiment and Modifications 1 to 4 thereof, the shooting direction of the captured image is changed based on any one of the situation around the host vehicle, the traveling state of the host vehicle, the shape of the road on which the host vehicle is traveling, and the turning state of the host vehicle. In conjunction with the change of the shooting direction, the rotation angle of the autostereoscopic display surface is also changed.
 However, the shooting direction of the captured image may be changed based on at least one of the situation around the host vehicle, the traveling state of the host vehicle, the shape of the road on which the host vehicle is traveling, and the turning state of the host vehicle. That is, the display surface generation information may be configured to include at least one of the situation around the host vehicle, the traveling state of the host vehicle, the shape of the road on which the host vehicle is traveling, and the turning state of the host vehicle.
 Alternatively, the rotation angle of the autostereoscopic display surface may be changed without changing the shooting direction of the captured image, based on at least one of the situation around the host vehicle, the traveling state of the host vehicle, the shape of the road on which the host vehicle is traveling, and the turning state of the host vehicle.
 <Embodiment 7>
 The block diagram showing the configuration of the display control device 1 according to the seventh embodiment of the present invention is the same as the block diagram (FIG. 2) of the display control device 1 according to the second embodiment. Hereinafter, among the components according to the seventh embodiment, components that are the same as or similar to the components described above are given the same reference numerals, and the differing components are mainly described.
 In the description so far, the autostereoscopic display surface has been a single surface. In the seventh embodiment, however, the autostereoscopic display surface is configured to include a plurality of surfaces. Note that in the seventh embodiment, the rotation of the autostereoscopic display surface as in the second embodiment is not essential.
 FIG. 29 is a diagram schematically showing the autostereoscopic display surface according to the seventh embodiment. In the following drawings, only the part related to the left autostereoscopic display surface SL is illustrated, but the right autostereoscopic display surface SR is similar to the left autostereoscopic display surface SL.
 As shown in FIG. 29, the left autostereoscopic display surface SL includes a first surface SL1 inclined from the xz plane, and a second surface SL2 that is connected to the rear portion of the first surface SL1, is parallel to the xy plane, and is at a distance zl from the display screen DS. As shown in FIG. 29, in consideration of perspective, the width of the first surface SL1 may be reduced toward the rear.
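The geometry of the two surfaces can be sketched as a depth profile over the image rows. The linear interpolation on the inclined surface and all names are illustrative assumptions, not the implementation described above:

```python
def surface_depth(row, boundary_row, zl):
    """Depth behind the display screen DS of a point on the left
    autostereoscopic display surface SL of FIG. 29: rows up to the boundary
    lie on the inclined first surface SL1 (depth growing linearly from 0 at
    the screen to zl at the boundary), and rows beyond it lie on the flat
    second surface SL2 at constant depth zl."""
    if row <= boundary_row:
        return zl * row / boundary_row  # inclined first surface SL1
    return zl                           # flat second surface SL2
```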
 FIG. 30 is a diagram showing a specific display example of the display of FIG. 29. The road 5 behind the host vehicle is mainly displayed on the first surface SL1, and the blue sky, clouds, and the like are mainly displayed on the second surface SL2. The boundary between the first surface SL1 and the second surface SL2 may be aligned with the vanishing line of the road 5 based on the image recognition result and the shooting direction of the captured image.
 <Summary of Embodiment 7>
 In the display control device 1 according to the seventh embodiment described above, the autostereoscopic display surface includes a plurality of surfaces. With this configuration, the display of the electronic mirror can be made similar to the scenery that the driver sees with the naked eye. The contents described in the seventh embodiment may be applied as appropriate not only to the second embodiment but also to the third to sixth embodiments.
 <Modification 1 of Embodiment 7>
 When a predetermined object is included in the captured image acquired by the image acquisition unit 11a, the video processing unit 12a may determine, based on an attribute of the object, the surface on which the object is to be displayed from among the plurality of surfaces included in the autostereoscopic display surface, and perform control to display the object on that surface.
 FIG. 31 is a diagram schematically showing the autostereoscopic display surface according to Modification 1. In the example of FIG. 31, the left autostereoscopic display surface SL includes, in addition to the first surface SL1 and the second surface SL2, a third surface SL3 connected to the front portion of the first surface SL1 and parallel to the xy plane. The position of the third surface SL3 is the same as the position of the display screen DS of the display device 21.
 Hereinafter, a case where the object is a body that is a part of the host vehicle 3 is described. In Modification 1, when the body of the host vehicle 3 is included in the captured image acquired by the image acquisition unit 11a, the video processing unit 12a determines, based on the attribute of being the body of the host vehicle 3, the third surface SL3 among the first to third surfaces SL1 to SL3 as the surface on which the body of the host vehicle 3 is to be displayed. Then, as shown in FIG. 31, the video processing unit 12a performs control to display the body of the host vehicle 3 on the third surface SL3.
 With this configuration, the display of the electronic mirror can be made even more similar to the scenery that the driver sees with the naked eye. The plurality of surfaces included in the left autostereoscopic display surface SL need not be connected to each other. For example, as shown in FIG. 32, the plurality of surfaces included in the left autostereoscopic display surface SL may be the second surface SL2 and the third surface SL3 that are separated from each other.
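The attribute-to-surface decision above can be sketched as a simple lookup. The attribute names and the entries other than "body of the host vehicle → third surface SL3" are illustrative assumptions:

```python
# Hypothetical mapping from a recognized object's attribute to the display
# surface of FIG. 31; only the first entry is taken from the description.
SURFACE_BY_ATTRIBUTE = {
    "own_vehicle_body": "SL3",  # front surface at the display screen position
    "road": "SL1",              # inclined first surface
    "sky": "SL2",               # flat far second surface
}

def surface_for(attribute, default="SL1"):
    """Pick the display surface for a recognized object from its attribute."""
    return SURFACE_BY_ATTRIBUTE.get(attribute, default)
```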
 <Modification 2 of Embodiment 7>
 The display surface generation information may include object information on the relative position, with respect to the host vehicle, of a predetermined object around the host vehicle. The predetermined objects around the host vehicle are, for example, other vehicles around the host vehicle, features, and the like. The object information may be, for example, the relative position itself of another vehicle around the host vehicle obtained by a periphery detection device provided outside the display control device 1, or may be the distance between the host vehicle and the other vehicle obtained by image recognition of images captured by the cameras.
 The video processing unit 12a may control, based on the object information described above, the position in the depth direction of the autostereoscopic display of the surface on which the object is displayed, among the plurality of surfaces. In this case, the surface on which the object is displayed may be determined in the same manner as in, for example, Modification 1 of the seventh embodiment.
 FIG. 33 is a diagram schematically showing the autostereoscopic display surface according to Modification 2. In the example of FIG. 33, the left autostereoscopic display surface SL includes, in addition to the first surface SL1 and the second surface SL2, a fourth surface SL4 parallel to the xy plane. The position of the fourth surface SL4 is different from the position of the display screen DS of the display device 21.
 FIG. 34 is a diagram showing a specific display example of the display of FIG. 33. Hereinafter, a case where the object information is the relative position itself of the other vehicle 6 around the host vehicle is described. In Modification 2, when the other vehicle 6 as shown in FIG. 34 is included in the captured image acquired by the image acquisition unit 11a, the video processing unit 12a determines, based on the attribute of being the other vehicle 6, the fourth surface SL4 among the first, second, and fourth surfaces SL1, SL2, and SL4 as the surface on which the other vehicle 6 is to be displayed. Then, as shown in FIG. 33, the video processing unit 12a controls the distance ozl between the fourth surface SL4 and the display screen DS of the display device 21 based on the relative position of the other vehicle 6 around the host vehicle. By performing such control, the video processing unit 12a controls the position of the fourth surface SL4, on which the other vehicle 6 is displayed, in the depth direction of the autostereoscopic display.
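A hedged sketch of how the distance ozl might track the detected relative position follows. The linear mapping and the clamping limits are illustrative assumptions; the description only states that ozl is controlled based on the relative position of the other vehicle 6:

```python
def depth_ozl(rel_distance_m, max_distance_m=50.0, max_depth=10.0):
    """Map the relative distance of the other vehicle 6 to the distance ozl
    between the fourth surface SL4 and the display screen DS, clamped to the
    displayable depth range [0, max_depth]."""
    ratio = min(max(rel_distance_m, 0.0), max_distance_m) / max_distance_m
    return ratio * max_depth
```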
 With this configuration, the display of the electronic mirror can be made even more similar to the scenery that the driver sees with the naked eye.
 <Modification 3 of Embodiment 7>
 The displays described above may be combined as appropriate. For example, as shown in FIG. 35, the display of FIG. 32 and the display of FIG. 33 may be combined; that is, the first surface SL1 may be removed from the left autostereoscopic display surface SL of FIG. 33. As another example, as shown in FIG. 36, the display of FIG. 31 and the display of FIG. 33 may be combined; that is, the left autostereoscopic display surface SL may include the first to fourth surfaces SL1 to SL4. FIG. 37 is a diagram showing a specific display example of the display of FIG. 36.
 <Modification 4 of Embodiment 7>
 In Modifications 1 and 2 of the seventh embodiment, the objects were a part of the host vehicle 3, the other vehicle 6, and the like. However, as shown in FIG. 38, the object may be a warning mark displayed on a fifth surface SL5 of the left autostereoscopic display surface SL, or may be another display object.
 <Modification 5 of Embodiment 7>
 In the seventh embodiment, it has been described that the rotation of the autostereoscopic display surface as in the second embodiment is not essential, but the rotation may of course be performed as appropriate.
 As shown in FIG. 39, when the left rotation angle θ of the left autostereoscopic display surface SL is larger than 0, the left part of the image, which corresponds to the outer side of the left electronic mirror, is farther back than the right part, which corresponds to the inner side of the left electronic mirror, so its distance from the driver is relatively large. In general, the larger the distance between a person and an object, the smaller the object appears to that person.
 Therefore, the video processing unit 12a may distort or crop the captured image IM so that its vertical height decreases toward the outer side of the captured image IM (from the right side toward the left side). Then, the video processing unit 12a may perform control to display the captured image IM obtained by such distortion or cropping on the left autostereoscopic display surface SL whose left rotation angle is larger than 0. As shown in FIG. 40, Modification 5 may also be applied when the left autostereoscopic display surface SL includes a plurality of surfaces.
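One way to realize this shrinking toward the outer edge is a per-column vertical scale factor. The linear falloff, the `max_shrink` limit, and the use of the sine of the rotation angle are illustrative assumptions; the description does not specify the mapping:

```python
import math

def column_scale(x, width, theta_deg, max_shrink=0.5):
    """Vertical scale factor for column x of the captured image IM when the
    left rotation angle is theta_deg: columns toward the outer (left, far)
    edge x = 0 are shrunk, while the inner (right, near) edge keeps scale 1.
    When theta_deg == 0 no column is shrunk."""
    shrink = max_shrink * math.sin(math.radians(theta_deg))
    outer_fraction = 1.0 - x / (width - 1)  # 1 at outer edge, 0 at inner edge
    return 1.0 - shrink * outer_fraction
```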
 <Modification 6 of Embodiment 7>
 In the seventh embodiment, as in the third embodiment, a left rear side wide-angle camera 26e and a right rear side wide-angle camera 26f may be provided on the host vehicle instead of the left rear side camera 26a and the right rear side camera 26b. In this case, each of the left camera image and the right camera image is a horizontally long image.
 When the camera image acquired by the image acquisition unit 11a is a horizontally long image, the video processing unit 12a may divide the horizontally long image into a plurality of images arranged in the horizontal direction, and display the plurality of images distributed over the plurality of surfaces included in the autostereoscopic display surface.
 FIG. 41 is a diagram schematically showing the autostereoscopic display surface according to Modification 6. In the example of FIG. 41, the left autostereoscopic display surface SL includes a sixth surface SL6 and a seventh surface SL7. Here, the size of the sixth surface SL6 corresponds to the size of an image captured by a standard camera (hereinafter referred to as the "standard size"), and the total size of the sixth surface SL6 and the seventh surface SL7 corresponds to the size of an image captured by a wide-angle camera.
 In Modification 6, the video processing unit 12a divides the horizontally long image captured by the left rear side wide-angle camera 26e into an image of the standard size and a remaining image, which is the rest of the image. Then, the video processing unit 12a performs control to display the standard-size image on the sixth surface SL6 and control to display the remaining image on the seventh surface SL7. FIG. 42 is a diagram showing a specific display example of the display of FIG. 41. The video processing unit 12a also performs, on the horizontally long image captured by the right rear side wide-angle camera 26f, the same control as on the horizontally long image captured by the left rear side wide-angle camera 26e.
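The division into a standard-size part and a remaining part can be sketched per image row as below. Treating the inner (right) side of the left camera image as the standard part is an illustrative assumption; the description only states that the image is split into a standard-size image and the rest:

```python
def split_wide_row(row, standard_width):
    """Split one row of the horizontally long left camera image into the
    standard-size part (shown on the sixth surface SL6) and the remaining
    part (shown on the seventh surface SL7)."""
    standard = row[-standard_width:]   # inner side, assumed standard size
    remainder = row[:-standard_width]  # outer side, remaining image
    return standard, remainder
```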
 With this configuration, the driver can distinguish at a glance the range close to the host vehicle from the range far from the host vehicle, and can drive comfortably.
 <Modification 7 of Embodiment 7>
 In Modification 6 of the seventh embodiment, the video processing unit 12a divides a camera image. However, the video processing unit 12a may instead divide a composite image made up of an image of the left rear of the host vehicle, an image of the right rear, and an image of the area between the left rear and the right rear. In the following description, the image of the area between the left rear and the right rear is referred to as the "directly-rear image".
 As shown in FIG. 43, in Modification 7, in addition to the left rear side wide-angle camera 26e and the right rear side wide-angle camera 26f, a directly-rear camera 26g capable of capturing the directly-rear image is provided on the host vehicle 3.
 The camera control device 26d combines the left camera image and the right camera image captured by the left rear side wide-angle camera 26e and the right rear side wide-angle camera 26f, respectively, with the directly-rear image captured by the directly-rear camera 26g to generate a composite image. The camera control device 26d then outputs the generated composite image to the display control device 1. The image acquisition unit 11a acquires the composite image from the camera control device 26d as the captured image.
 FIGS. 44 and 45 are diagrams schematically showing the autostereoscopic display surface according to Modification 7. In the examples of FIGS. 44 and 45, an autostereoscopic display surface ST1 corresponding to the maximum range displayable by the display device 21 includes the left autostereoscopic display surface SL, the right autostereoscopic display surface SR, and a central autostereoscopic display surface SC connected to the left autostereoscopic display surface SL and the right autostereoscopic display surface SR. Here, the sizes of the left autostereoscopic display surface SL and the right autostereoscopic display surface SR correspond to the sizes of the images captured by the left rear side wide-angle camera 26e and the right rear side wide-angle camera 26f.
 The video processing unit 12a divides the horizontally long composite image acquired by the image acquisition unit 11a into the left camera image (the image of the left rear of the host vehicle), the right camera image (the image of the right rear of the host vehicle), and the directly-rear image (the image of the area between the left rear and the right rear). Then, the video processing unit 12a performs control to display the left camera image on the left autostereoscopic display surface SL, control to display the right camera image on the right autostereoscopic display surface SR, and control to display the directly-rear image on the central autostereoscopic display surface SC.
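The three-way split above can be outlined per image row as follows. The width parameters and names are illustrative assumptions; only the left/center/right correspondence to the surfaces SL, SC, and SR comes from the description:

```python
def split_composite_row(row, left_w, center_w, right_w):
    """Split one row of the composite image into the left camera image,
    the directly-rear image, and the right camera image, matching the
    surfaces SL, SC, and SR of FIG. 44."""
    assert len(row) == left_w + center_w + right_w
    return (row[:left_w],                       # left camera image -> SL
            row[left_w:left_w + center_w],      # directly-rear image -> SC
            row[left_w + center_w:])            # right camera image -> SR
```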
 With this configuration, which part of the autostereoscopic display surface ST1 corresponds to the left rear, the directly-rear direction, or the right rear of the host vehicle can be distinguished at a glance, and comfortable driving can be achieved.
 Note that the video processing unit 12a does not have to perform control to display, on the display device 21, the entire autostereoscopic display surface ST1 corresponding to the maximum range displayable by the display device 21. For example, the video processing unit 12a may perform control to display, on the display device 21, a partial autostereoscopic display surface ST2 that is a part of the autostereoscopic display surface ST1, based on an operation received by the operation device 26c.
 The top-view shape of the autostereoscopic display surface ST1 may be a shape in which the central autostereoscopic display surface SC is in front of the left autostereoscopic display surface SL and the right autostereoscopic display surface SR as in FIG. 44, or may be the shape of FIG. 44 reversed front to back in the Z-axis direction. The top-view shape of the autostereoscopic display surface ST1 may also be a V shape or an inverted-V (Λ) shape. Further, the left camera image and the right camera image may be displayed on the autostereoscopic display surface ST1 having the V shape or the Λ shape, without displaying the directly-rear image.
 <Other Modifications>
 The acquisition unit 11 and the control unit 12 of FIG. 1 described above are hereinafter referred to as the "acquisition unit 11 and the like". The acquisition unit 11 and the like are realized by a processing circuit 81 shown in FIG. 46. That is, the processing circuit 81 includes the acquisition unit 11, which acquires one captured image of the area around the vehicle and display surface generation information for generating the autostereoscopic display surface, and the control unit 12, which performs control to display the captured image on the autostereoscopic display surface based on the captured image and the display surface generation information acquired by the acquisition unit 11. Dedicated hardware may be applied to the processing circuit 81, or a processor that executes a program stored in a memory may be applied. The processor corresponds to, for example, a central processing unit, a processing device, an arithmetic device, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor).
 When the processing circuit 81 is dedicated hardware, the processing circuit 81 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof. The functions of the respective units such as the acquisition unit 11 may each be realized by distributed processing circuits, or the functions of the units may be collectively realized by one processing circuit.
 When the processing circuit 81 is a processor, the functions of the acquisition unit 11 and the like are realized in combination with software or the like. The software or the like corresponds to, for example, software, firmware, or both software and firmware. The software or the like is described as a program and stored in the memory 83. As shown in FIG. 47, the processor 82 applied as the processing circuit 81 realizes the function of each unit by reading out and executing the program stored in the memory 83. That is, the display control device 1 includes the memory 83 for storing a program that, when executed by the processing circuit 81, results in the execution of a step of acquiring one captured image of the surroundings of the host vehicle and display surface generation information for generating an autostereoscopic display surface, and a step of performing control to display the captured image on the autostereoscopic display surface based on the acquired captured image and display surface generation information. In other words, this program can be said to cause a computer to execute the procedures and methods of the acquisition unit 11 and the like.
 Here, the memory 83 may be, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory); an HDD (Hard Disk Drive), a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, or a DVD (Digital Versatile Disc), or a drive device therefor; or any storage medium to be used in the future.
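Purely for illustration, the two program steps described above (acquire the inputs, then control the display) can be sketched as follows. Every name, type, and message shape below is a hypothetical stand-in and is not taken from the disclosure.

```python
from dataclasses import dataclass

# Illustrative sketch of the two steps the stored program causes the
# processing circuit 81 to execute. All names here are hypothetical.

@dataclass
class GenerationInfo:
    shooting_direction_deg: float  # direction related to the captured image
    vehicle_speed_kmh: float       # later usable for depth placement

def acquire(read_image, read_sensors):
    """Step 1: acquire one captured image of the host vehicle's
    surroundings and the display surface generation information."""
    return read_image(), GenerationInfo(**read_sensors())

def control_display(captured_image, info):
    """Step 2: derive an autostereoscopic display surface from the
    generation information and place the captured image on it."""
    surface = {"rotation_deg": info.shooting_direction_deg}
    return {"surface": surface, "image": captured_image}

# Stubs standing in for the camera and the vehicle sensors.
image, info = acquire(lambda: "frame-0",
                      lambda: {"shooting_direction_deg": 15.0,
                               "vehicle_speed_kmh": 40.0})
result = control_display(image, info)
print(result["surface"]["rotation_deg"])  # 15.0
```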
 The configuration in which each function of the acquisition unit 11 and the like is realized by either hardware or software or the like has been described above. However, the configuration is not limited to this; part of the acquisition unit 11 and the like may be realized by dedicated hardware, and another part may be realized by software or the like. For example, the function of the acquisition unit 11 can be realized by the processing circuit 81 as dedicated hardware together with a receiver, while the functions of the other units can be realized by the processing circuit 81 as the processor 82 reading out and executing the program stored in the memory 83.
 As described above, the processing circuit 81 can realize each of the functions described above by hardware, software, or the like, or a combination thereof.
 The display control device 1 described above is also applicable to a display control system constructed by appropriately combining a navigation device such as a PND (Portable Navigation Device), a communication terminal including a portable terminal such as a mobile phone, a smartphone, or a tablet, the functions of an application installed on at least one of the navigation device and the communication terminal, and a server. In this case, each function or each component of the display control device 1 described above may be distributed among the devices constructing the system, or may be concentrated in one of the devices. For example, the display control device may be incorporated into the display device 21 of FIG. 1.
 FIG. 48 is a block diagram showing the configuration of a server 91 according to this modification. The server 91 of FIG. 48 includes a communication unit 91a and a control unit 91b, and is capable of performing wireless communication with a navigation device 93 of a vehicle 92.
 The communication unit 91a, which serves as the acquisition unit, receives one captured image of the surroundings of the vehicle 92 and the display surface generation information by performing wireless communication with the navigation device 93.
 The control unit 91b has the same function as the control unit 12 of FIG. 1 as a result of a processor (not shown) of the server 91 executing a program stored in a memory (not shown) of the server 91. That is, the control unit 91b generates a control signal for performing control to display the captured image on the autostereoscopic display surface, based on the captured image and the display surface generation information received by the communication unit 91a. The communication unit 91a transmits the control signal to the navigation device 93. The server 91 configured in this way can obtain the same effects as the display control device 1 described in the first embodiment.
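The server-side flow just described (receive the captured image and the generation information, generate a control signal, transmit it back) can be sketched as below. The dict-based message shape and the function name are assumptions made for illustration only.

```python
def handle_request(message):
    """Sketch of server 91: communication unit 91a receives the captured
    image and the display surface generation information from navigation
    device 93; control unit 91b generates a control signal for displaying
    the image on the autostereoscopic display surface; 91a then transmits
    the signal back. The message format here is hypothetical."""
    captured_image = message["captured_image"]
    info = message["generation_info"]
    control_signal = {
        "display_image": captured_image,
        "surface_rotation_deg": info.get("shooting_direction_deg", 0.0),
    }
    return control_signal  # what 91a would transmit to the navigation device

signal = handle_request({"captured_image": "frame-7",
                         "generation_info": {"shooting_direction_deg": -8.0}})
print(signal["surface_rotation_deg"])  # -8.0
```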
 FIG. 49 is a block diagram showing the configuration of a communication terminal 96 according to this modification. The communication terminal 96 of FIG. 49 includes a communication unit 96a similar to the communication unit 91a and a control unit 96b similar to the control unit 91b, and is capable of performing wireless communication with a navigation device 98 of a vehicle 97. As the communication terminal 96, a portable terminal such as a mobile phone, a smartphone, or a tablet carried by, for example, the driver of the vehicle 97 is applied. The communication terminal 96 configured in this way can obtain the same effects as the display control device 1 described in the first embodiment.
 Within the scope of the present invention, the embodiments and modifications may be freely combined, and each embodiment and each modification may be modified or omitted as appropriate.
 Although the present invention has been described in detail, the above description is in all aspects illustrative, and the present invention is not limited thereto. It is understood that innumerable variations not illustrated here can be devised without departing from the scope of the present invention.
 Reference Signs List: 1 display control device, 3 host vehicle, 11 acquisition unit, 12 control unit, 21 display device, SL left autostereoscopic display surface, SR right autostereoscopic display surface.

Claims (20)

  1.  A display control device for controlling a display device, wherein
     the display device is capable of autostereoscopic display by displaying an image for the left eye and an image for the right eye,
     the display control device comprising:
     an acquisition unit configured to acquire one captured image of the surroundings of a vehicle and display surface generation information for generating an autostereoscopic display surface, which is a surface in the autostereoscopic display; and
     a control unit configured to perform control to display the captured image on the autostereoscopic display surface based on the captured image and the display surface generation information acquired by the acquisition unit.
  2.  The display control device according to claim 1, wherein
     the display surface generation information includes a shooting direction, which is a direction related to the captured image, and
     the control unit controls rotation of the autostereoscopic display surface about a predetermined axis based on the shooting direction.
  3.  The display control device according to claim 2, wherein
     the captured image is a partial image that is part of an image obtained by photographing the surroundings of the vehicle, and
     the shooting direction corresponds to the range of the partial image within the range of the image obtained by photographing the surroundings of the vehicle.
  4.  The display control device according to claim 2, wherein
     the shooting direction is changed based on an operation from outside the display control device.
  5.  The display control device according to claim 2, wherein
     the shooting direction is changed based on at least one of the situation around the vehicle, the traveling state of the vehicle, the shape of the road on which the vehicle is traveling, and whether the vehicle is turning right or left.
  6.  The display control device according to claim 2, wherein
     the amount of rotation of the autostereoscopic display surface differs from the amount of change in the angle of the shooting direction.
  7.  The display control device according to claim 2, wherein
     the control unit rotates the autostereoscopic display surface when the difference between the shooting direction and a reference direction of the shooting direction is equal to or greater than a predetermined threshold.
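Outside the claim language, the thresholded rotation of claim 7 can be sketched as follows. The function name and the one-to-one mapping above the threshold are illustrative assumptions; the claims do not prescribe a particular mapping (and claim 6 notes the rotation amount may differ from the change in shooting direction).

```python
def surface_rotation_deg(shooting_dir_deg, reference_dir_deg, threshold_deg):
    """Rotate the autostereoscopic display surface only when the shooting
    direction deviates from its reference direction by at least the
    predetermined threshold. The 1:1 mapping used above the threshold is
    an illustrative assumption."""
    diff = shooting_dir_deg - reference_dir_deg
    if abs(diff) < threshold_deg:
        return 0.0   # below threshold: the surface is not rotated
    return diff      # at or above threshold: rotation is performed

print(surface_rotation_deg(3.0, 0.0, 5.0))   # 0.0  (below threshold)
print(surface_rotation_deg(12.0, 0.0, 5.0))  # 12.0 (rotation performed)
```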
  8.  The display control device according to claim 1, wherein
     the display surface generation information includes at least one of the situation around the vehicle, the traveling state of the vehicle, the shape of the road on which the vehicle is traveling, and whether the vehicle is turning right or left.
  9.  The display control device according to claim 1, wherein
     the autostereoscopic display surface includes a flat surface or a curved surface.
  10.  The display control device according to claim 1, wherein
     the autostereoscopic display surface includes a curved surface,
     the display surface generation information includes a shooting direction of the captured image, and
     the control unit controls rotation of the curved surface by controlling, based on the shooting direction, the angle between the curved surface and a reference direction of the curved surface.
  11.  The display control device according to claim 1, wherein
     the control unit controls the position of the autostereoscopic display surface in the depth direction of the autostereoscopic display based on the display surface generation information.
  12.  The display control device according to claim 11, wherein
     the display surface generation information includes the speed of the vehicle, and
     the control unit performs control to move the autostereoscopic display surface farther back in the depth direction as the speed increases.
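The speed-dependent depth placement of claim 12 can be sketched as a monotonic mapping. The linear form and its constants are illustrative assumptions only; the claim requires just that a higher speed moves the surface farther back.

```python
def surface_depth_m(speed_kmh, base_depth_m=2.0, gain_m_per_kmh=0.05):
    """Move the autostereoscopic display surface farther back in the depth
    direction as the vehicle speed increases. The linear relation and the
    constants are hypothetical; only monotonicity is claimed."""
    return base_depth_m + gain_m_per_kmh * max(speed_kmh, 0.0)

print(surface_depth_m(0.0))   # 2.0
print(surface_depth_m(60.0))  # 5.0  (faster -> placed deeper)
```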
  13.  The display control device according to claim 1, wherein
     the autostereoscopic display surface includes a plurality of surfaces.
  14.  The display control device according to claim 13, wherein
     when a predetermined object is included in the captured image acquired by the acquisition unit, the control unit determines, based on an attribute of the object, the surface on which the object is to be displayed among the plurality of surfaces, and performs control to display the object on that surface.
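The attribute-based surface selection of claim 14 can be sketched as a simple lookup. The attribute vocabulary and the surface assignment below are purely illustrative assumptions, not part of the disclosure.

```python
def surface_for(attribute):
    """Choose which of the plural display surfaces shows a detected object,
    based on the object's attribute. The attribute names and the surface
    assignment here are hypothetical."""
    mapping = {"pedestrian": "near_surface",
               "vehicle": "mid_surface",
               "building": "far_surface"}
    return mapping.get(attribute, "default_surface")

print(surface_for("pedestrian"))  # near_surface
print(surface_for("sign"))        # default_surface
```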
  15.  The display control device according to claim 14, wherein
     the object includes a part of the vehicle.
  16.  The display control device according to claim 13, wherein
     the display surface generation information includes object information on the position of a predetermined object around the vehicle relative to the vehicle, and
     the control unit controls, based on the object information included in the display surface generation information acquired by the acquisition unit, the position in the depth direction of the autostereoscopic display of the surface on which the object is displayed among the plurality of surfaces.
  17.  The display control device according to claim 1, wherein
     the captured image includes a horizontally elongated image.
  18.  The display control device according to claim 17, wherein
     the autostereoscopic display surface includes a plurality of surfaces, and
     when the captured image acquired by the acquisition unit is a horizontally elongated image, the control unit divides the horizontally elongated image into a plurality of images arranged in the horizontal direction and performs control to display the plurality of images distributed over the plurality of surfaces.
  19.  The display control device according to claim 18, wherein
     the horizontally elongated image includes an image consisting of an image of the left rear of the vehicle, an image of the right rear, and an image of the area between the left rear and the right rear, and
     when the captured image acquired by the acquisition unit is a horizontally elongated image, the control unit divides the horizontally elongated image into three images, namely an image of the left rear of the vehicle, an image of the right rear, and an image of the area between the left rear and the right rear, and performs control to display the three images distributed over the plurality of surfaces.
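The three-way split of claim 19 can be sketched as a column partition of a horizontally elongated image. Equal thirds are an illustrative assumption, since the claim does not fix the split boundaries.

```python
def split_rear_view(row):
    """Divide one horizontally elongated rear-view image (here a single
    pixel row) into left-rear, center (between left and right rear), and
    right-rear parts to be distributed over the plural display surfaces.
    Equal thirds are an assumption; the claim does not fix the boundaries."""
    w = len(row)
    a, b = w // 3, 2 * w // 3
    return row[:a], row[a:b], row[b:]

left, center, right = split_rear_view(list(range(9)))
print(left, center, right)  # [0, 1, 2] [3, 4, 5] [6, 7, 8]
```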
  20.  A display control method for controlling a display device, wherein the display device is capable of autostereoscopic display by displaying an image for the left eye and an image for the right eye, the method comprising:
     acquiring one captured image of the surroundings of a vehicle and display surface generation information for generating an autostereoscopic display surface, which is a surface in the autostereoscopic display; and
     performing control to display the captured image on the autostereoscopic display surface based on the acquired captured image and the display surface generation information.
PCT/JP2017/036790 2017-10-11 2017-10-11 Display control device and display control method WO2019073548A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2017/036790 WO2019073548A1 (en) 2017-10-11 2017-10-11 Display control device and display control method
JP2019547838A JP6910457B2 (en) 2017-10-11 2017-10-11 Display control device and display control method


Publications (1)

Publication Number Publication Date
WO2019073548A1 true WO2019073548A1 (en) 2019-04-18

Family

ID=66100584


Country Status (2)

Country Link
JP (1) JP6910457B2 (en)
WO (1) WO2019073548A1 (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010163104A (en) * 2009-01-16 2010-07-29 Denso Corp Vehicular display apparatus and vehicular peripheral visual field display system having the same
JP2013026770A (en) * 2011-07-20 2013-02-04 Nissan Motor Co Ltd Image display device for vehicle
JP2013104976A (en) * 2011-11-11 2013-05-30 Denso Corp Display device for vehicle
JP2015041788A (en) * 2013-08-20 2015-03-02 日産自動車株式会社 Two-dimensional and three-dimensional display apparatus
JP2015146012A (en) * 2014-01-06 2015-08-13 株式会社Jvcケンウッド virtual image display system
JP2017140906A (en) * 2016-02-09 2017-08-17 矢崎総業株式会社 Display device for vehicle


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7072317B1 (en) 2021-06-28 2022-05-20 隆志 矢野 Steering system with HUD device using stereoscopic optical system
JP2023004615A (en) * 2021-06-28 2023-01-17 隆志 矢野 Maneuvering system with HUD device using stereoscopic optical system

Also Published As

Publication number Publication date
JP6910457B2 (en) 2021-07-28
JPWO2019073548A1 (en) 2020-05-28


Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 17928591; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2019547838; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 17928591; Country of ref document: EP; Kind code of ref document: A1)